felix's blog

US Justice Dept. defends warrantless cell phone tracking

The FBI and other police agencies don't need to obtain a search warrant to learn the locations of Americans' cell phones, the U.S. Department of Justice told a federal appeals court in Philadelphia on Friday.

A Justice Department attorney told the Third Circuit Court of Appeals that there is no constitutional problem with obtaining records from cellular providers that can reveal the approximate locations of handheld and mobile devices. (See CNET's previous article.)

Source: CNET

The Second Index. Search Engines, Personalization and Surveillance (Deep Search)

Introduction

Google’s well-known mission is “to organize the world’s information”. It is, however, impossible to organize the world’s information without an operating model of the world. Melvil(le) Dewey (1851-1931), working at the height of Western colonial power, could simply take the Victorian world view as the basis for a universal classification system, which, for example, put all “religions other than Christianity” into a single category (no. 290). Such a biased classification scheme, for all its ongoing usefulness in libraries, cannot work in the irreducibly multi-cultural world of global communication. In fact, no uniform classification scheme can work, given the impossibility of agreeing on a single cultural framework through which to define the categories. This, in addition to the scaling issues, is the reason why internet directories, as pioneered by Yahoo! and the Open Directory Project (dmoz), broke down after a short-lived period of success.

Search engines side-step this problem by flexibly reorganizing the index in relation to each query and by using the self-referential method of link analysis to rank the result list (see Katja Mayer in this volume). This ranking is said to be objective, reflecting the actual topology of the network that emerges unplanned through collective action. Knowing this topology, search engines favor link-rich nodes over link-poor outliers. This objectivity is one of the core elements of search engines, since it both scales well and increases the users’ trust in the system.
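To make "link analysis" concrete, here is a minimal sketch in the style of PageRank, the best-known such method; the tiny link graph and damping factor are invented for illustration, and production engines use far more elaborate variants:

# Minimal PageRank-style link analysis (toy graph, invented for illustration).
links = {
    "a": ["b", "c"],  # page "a" links to "b" and "c"
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],       # nothing links to "d": a link-poor outlier
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = rank[page] / len(outgoing)  # rank flows along outgoing links
            for target in outgoing:
                new_rank[target] += damping * share
        rank = new_rank
    return rank

# Link-rich nodes end up ranked above the outlier "d".
for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))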

Deep Search: Introduction

Introduction

It’s hard to avoid search engines these days. They have established themselves as essential tools for navigating the dynamic, expanding informational landscape that is the Internet. The Internet’s distributed structure doesn’t allow for any centralized index or catalog; consequently, search engines fulfill a central task of making digital information accessible and thus usable. It’s hard to imagine our lives without their ever-expanding digital dimension; and so it’s increasingly hard to imagine life without search engines. This book looks at the long history of the struggle to impose order on the always-fragile information universe, addresses key social and political issues raised by today’s search engines, and envisions approaches that would break with the current paradigms they impose.

Nachahmung, Transformation und Autorfunktion (Imitation, Transformation, and the Author Function)

This text is published in: Kroeger, Odin, Günther Friesinger, Paul Lohberger, Eberhard Ortland, and Thomas Ballhausen, eds. Geistiges Eigentum und Originalität: Zur Politik der Wissens- und Kulturproduktion. Vienna: Turia + Kant, 2011, pp. 19-32.

Abstract: Employing concepts developed by Gabriel Tarde and Bruno Latour, this article investigates how a new function of the author is being defined in digital media. What emerges is a practical alternative to the dichotomy between notions of possessive individualism (underlying copyright law) and simplified notions of the death of the author. Here, authorship functions less as a means to establish rigid ownership and control and more as a dynamic system of accountability and reputation building capable of structuring open, collaborative processes.

Introduction

It is now forty years since Roland Barthes proclaimed the “death of the author” (cf. Barthes, 1969/2000). What was at the time a necessary break with a form of author-centered literary criticism that Anglo-American “new criticism” had already rendered obsolete, but which still dominated in France, soon degenerated into a cliché and thus into a dead end. Faced with the choice between a conventional form of authorship (the Cartesian ego translated into bourgeois subjectivity and underpinned by copyright law) and a diffused authorship (dissolved into the “murmur of discourse”, as Michel Foucault put it (cf. 1972/1991)), the former enjoyed a stubborn popularity in practice. The culture industry and the art market contributed to this as well; their ideological core is an inflated notion of authorship, which, out of understandable self-interest, they continue to propagate with great insistence to this day. Theoretical deconstruction, by contrast, rarely went beyond noting the decentering, distribution, or dispersal of authorship. Moreover, the practice of cultural production that brings forth works, and with them authors, was for a long time changed only at its margins. Collage, assemblage, and appropriation as artistic methods were each quickly assimilated. Explicitly anti-authorial practices remained politically and culturally marginal (cf. Cramer, 2008), and often stayed caught in the contradictory romantic notion that the withdrawal of the creator heightens the sublimity of the work (and thus, indirectly but all the more effectively, the symbolic capital of the author).

Deep Search: The Politics of Search Beyond Google

I'm extremely proud that our book on the politics of search engines is out now. I know I'm totally biased, but I really think this is a great collection of essays. And it's available in English as well as in German. Two separate books, each containing original and translated texts.

Konrad Becker/Felix Stalder (eds.). Deep Search: The Politics of Search beyond Google. Studienverlag, distributed by Transaction Publishers, New Jersey, 2009. 214 pp., ISBN 978-3706547956.

“This collection gets to the heart of the most important issues concerning our global information ecosystem: Will the ‘soft power’ of one, two, or three corporations exert inordinate yet undetectable influence over what we consider important, beautiful, or true? What are the possibilities for resistance? What are the proper avenues for law, policy, and personal choices? This book walks us through these challenges like no other before it.”

Siva Vaidhyanathan, University of Virginia, author of The Googlization of Everything

Google vs Mozilla

The Register has an interesting article on the growing tensions between Google and Mozilla. It highlights the dangers of monopoly and the fundamental differences between non-profit and for-profit corporations and their outlook on the world.

"I look at Google and I don't see a lot of alignment with the big picture of the internet," says Asa Dotzler, the ten-year Mozilla vet who was among the team of three or four who founded the Firefox project back in 2002.

"Google is essentially an advertising company. That's where they make their money. They provide a wonderful service - primarily their search service - but it serves their advertising goals. It serves their revenue goals. The more they can know about their users, the more effective they believe they can advertise, the more money they believe they can make. That is most fundamental."

End of the Pirate Bay tracker, but not of the Pirate Bay

Ars Technica writes:

The Pirate Bay announced today that it was killing its popular BitTorrent tracker, relying instead on the truly decentralized DHT protocol to handle file location and downloading. For TPB, it's all about removing centralized points of control.

See also Torrentfreak's take on the story and its explanation of the decentralized technology (DHT, magnet links) replacing the central tracker.
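What makes the DHT "truly decentralized": peers and torrents share one ID space, every peer stores a small slice of the torrent-to-peers mapping, and a lookup walks toward the node IDs closest to a torrent's infohash by XOR distance (the Kademlia scheme that BitTorrent's DHT builds on). A toy sketch of that distance metric, with invented peers and infohash:

import hashlib

def node_id(name: str) -> int:
    # Real nodes pick random 160-bit IDs; hashing a name stands in here.
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big")

def xor_distance(a: int, b: int) -> int:
    # Kademlia's notion of "closeness" in the shared ID space.
    return a ^ b

# Invented example: four peers and one torrent's infohash.
peers = {name: node_id(name) for name in ("peer1", "peer2", "peer3", "peer4")}
infohash = node_id("some-torrent")

# The peers closest to the infohash are responsible for storing its peer
# list; a real lookup iteratively asks known nodes for ever-closer neighbors.
closest = sorted(peers, key=lambda name: xor_distance(peers[name], infohash))
print("responsible peers, nearest first:", closest)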

Content recognition engine

There is very little information about the content-screening back-end of YouTube (provided by a company called Audible Magic), which fingerprints content to screen for copyright violations. But there's an interesting snippet from Viacom's general counsel.

Fricklas points to the recent MTV music awards, where Kanye West rushed the stage, grabbed the mic, and delivered his Internet-meme-producing line, "I'mma let you finish, but…" Viacom quickly uploaded the evening's footage into the content recognition engines of sites like YouTube, which can then block exact uploads of the same footage or allow rightsholders to monetize it with ads. Viacom used the tool to block copies of the clip, but not without offering a solution of its own: the clip was hosted on Viacom websites and was embeddable and linkable.

It also points to a more flexible strategy: block exact copies, and earn money from other people's mash-ups (whose creators themselves don't earn money).

Source: Ars Technica, "Viacom's top lawyer: suing P2P users 'felt like terrorism'," November 16, 2009
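The basic screening step is easy to sketch: compare a digest of each upload against the rightsholder's reference digests, block on a match, allow everything else. A plain cryptographic hash, as below, only catches bit-identical copies; real engines such as Audible Magic's use perceptual fingerprints that survive re-encoding and cropping. All data here is invented.

import hashlib

def digest(data: bytes) -> str:
    # Stand-in for a real perceptual fingerprint: a SHA-256 digest.
    return hashlib.sha256(data).hexdigest()

# Hypothetical reference footage registered by the rightsholder.
reference_clip = b"...vma awards footage..."
blocklist = {digest(reference_clip)}

def screen(upload: bytes) -> str:
    return "block" if digest(upload) in blocklist else "allow"

print(screen(b"...vma awards footage..."))  # exact copy -> "block"
print(screen(b"...a mash-up remix..."))     # altered upload -> "allow"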

Funding the first copy

If we want to enable free access to knowledge goods, we need to find ways to finance the first copy. The industrial business model has been to regard the cost of the first copy as an up-front investment that is later recouped by controlling access to subsequent copies. This model is clearly broken if everyone can make copies, or if the resulting price of the copies is so high that the people who need them cannot afford them, as is the case with many drugs, particularly, but not only, in the developing world. For drugs, the big idea is to move away from the patent system (which grants exclusivity as an incentive to invest in research) to a system of prizes. As James Love and Tim Hubbard write in an extensive research paper:

Reforming the way we pay for R&D on new medicines involves a simple but powerful idea. Rather than give drug developers the exclusive rights to sell products, the government would award innovators money: large monetary “prizes” tied to the actual impact of the invention on improvements in health care outcomes that successful products actually deliver.

Source: James Love and Tim Hubbard. "The Big Idea: Prizes to Stimulate R&D for New Medicines." Chicago-Kent Law Review, Volume 82, Number 3 (2007)
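A toy illustration of the mechanism, with all figures invented: a fixed prize fund is split among developers in proportion to the measured health impact of their products (for instance quality-adjusted life years, QALYs), rather than each developer recouping its costs through monopoly pricing.

# Toy prize-fund allocation (all figures invented for illustration).
# Developers are paid in proportion to the measured health impact of
# their products, e.g. quality-adjusted life years (QALYs) delivered.

prize_fund = 1_000_000_000  # hypothetical annual fund, in dollars

impact = {  # hypothetical QALYs attributed to each product
    "drug_a": 120_000,
    "drug_b": 60_000,
    "drug_c": 20_000,
}

total_impact = sum(impact.values())
for drug, qalys in impact.items():
    payout = prize_fund * qalys / total_impact
    print(f"{drug}: ${payout:,.0f} for {qalys:,} QALYs")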

U.S. Spies Buy Stake in Firm That Monitors Blogs, Tweets

Source: Wired.com

In-Q-Tel, the investment arm of the CIA and the wider intelligence community, is putting cash into Visible Technologies, a software firm that specializes in monitoring social media. It’s part of a larger movement within the spy services to get better at using “open source intelligence” — information that’s publicly available, but often hidden in the flood of TV shows, newspaper articles, blog posts, online videos and radio reports generated every day.
<...>
The intelligence community has been interested in social media for years. In-Q-Tel has sunk money into companies like Attensity, which recently announced its own web 2.0-monitoring service. The agencies have their own, password-protected blogs and wikis — even a MySpace for spooks. The Office of the Director of National Intelligence maintains an Open Source Center, which combs publicly available information, including web 2.0 sites. Doug Naquin, the Center’s Director, told an audience of intelligence professionals in October 2007 that “we’re looking now at YouTube, which carries some unique and honest-to-goodness intelligence…. We have groups looking at what they call ‘citizens media’: people taking pictures with their cell phones and posting them on the internet. Then there’s social media, phenomena like MySpace and blogs.”