The Second Index. Search Engines, Personalization and Surveillance (Deep Search)


Google’s well-known mission is “to organize the world’s information”. It is, however, impossible to organize the world’s information without an operating model of the world. Melvil(le) Dewey (1851-1931), working at the height of Western colonial power, could simply take the Victorian world view as the basis for a universal classification system which, for example, lumped all “religions other than Christianity” into a single category (no. 290). Such a biased classification scheme, for all its continuing usefulness in libraries, cannot work in the irreducibly multicultural world of global communication. In fact, no uniform classification scheme can work, given the impossibility of agreeing on a single cultural framework through which to define the categories.2 This, in addition to problems of scale, is why internet directories, as pioneered by Yahoo! and the Open Directory Project (dmoz)3, broke down after a short-lived period of success.

Search engines side-step this problem by flexibly reorganizing the index in relation to each query and using the self-referential method of link analysis to construct the ranking of the query list (see Katja Mayer in this volume). This ranking is said to be objective, reflecting the actual topology of the network that emerges unplanned through collective action. Knowing this topology, search engines favor link-rich nodes over link-poor outliers. This objectivity is one of the core elements of search engines, since it both scales well and increases the users’ trust in the system.
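The link analysis mentioned above can be illustrated with a minimal PageRank-style power iteration. This is only a sketch: the tiny link graph, the damping factor, and the function name are invented for illustration and are not taken from the text or from any actual search engine's implementation.

```python
# Minimal PageRank-style power iteration over a tiny, invented link graph.
# Link-rich nodes end up with higher scores than link-poor outliers.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if not outgoing:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A tiny graph: A and B link to each other and to C; C links only to A.
graph = {"A": ["B", "C"], "B": ["A", "C"], "C": ["A"]}
scores = pagerank(graph)
print(sorted(scores, key=scores.get, reverse=True))
```

The "objectivity" the text describes comes from the fact that the scores emerge from the link topology alone: no editor assigns categories, the ranking is recomputed from the collective linking behavior of the network.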

Deep Search: Introduction


It’s hard to avoid search engines these days. They have established themselves as essential tools for navigating the dynamic, expanding informational landscape that is the Internet. The Internet’s distributed structure doesn’t allow for any centralized index or catalog; consequently, search engines fulfill a central task of making digital information accessible and thus usable. It’s hard to imagine our lives without their ever-expanding digital dimension; and so it’s increasingly hard to imagine life without search engines. This book looks at the long history of the struggle to impose order on the always-fragile information universe, addresses key social and political issues raised by today’s search engines, and envisions approaches that would break with the current paradigms they impose.

Imitation, Transformation and the Author Function

This text is published in: Kroeger, Odin, Günther Friesinger, Paul Lohberger, Eberhard Ortland and Thomas Ballhausen, eds. Geistiges Eigentum und Originalität: Zur Politik der Wissens- und Kulturproduktion. Vienna: Turia + Kant, 2011, pp. 19-32.

Abstract: Employing concepts developed by Gabriel Tarde and Bruno Latour, this article investigates how a new function of the author is being defined in digital media. What emerges is a practical alternative to the dichotomy between notions of possessive individualism (underlying copyright law) and simplified notions of the death of the author. Here, authorship functions less as a means of establishing rigid ownership and control, and more as a dynamic system of accountability and reputation building, capable of structuring open, collaborative processes.


Forty years ago now, Roland Barthes proclaimed the “death of the author” (cf. Barthes, 1969/2000). What was at the time a necessary turn away from a form of author-centered literary criticism that had already been superseded by Anglo-American “new criticism” but still dominated in France soon degenerated into a cliché, and thus into a dead end. Faced with the choice between a conventional form of authorship (the Cartesian ego translated into bourgeois subjectivity, underpinned by copyright law) and a diffused authorship (dissolved into the “murmur of discourse”, as Michel Foucault put it (cf. 1972/1991)), the former enjoyed a stubborn popularity in practice. The culture industry and the art market contributed to this, since their ideological core is an exaggerated notion of authorship, which, out of understandable self-interest, they continue to propagate with great vigor to this day. Theoretical deconstruction, on the other hand, rarely went beyond noting the decentering, distribution or dispersal of authorship. Moreover, the practice of cultural production that brings forth works, and thus authors, was for a long time altered only at the margins. Collage, assemblage and appropriation as artistic methods were each quickly assimilated. Explicitly anti-authorial practices remained politically and culturally marginal (cf. Cramer, 2008), and often stayed caught in the contradictory Romantic idea that the withdrawal of the creator heightens the sublimity of the work (and thus, indirectly but all the more effectively, the symbolic capital of the author).

Deep Search: The Politics of Search Beyond Google

I'm extremely proud that our book on the politics of search engines is out now. I know I'm totally biased, but I really think this is a great collection of essays. And it's available in English as well as in German. Two separate books, each containing original and translated texts.

Konrad Becker / Felix Stalder (eds.). Deep Search: The Politics of Search Beyond Google. Studienverlag; distributed by Transaction Publishers, New Jersey, 2009. 214 pp., ISBN 978-3706547956

“This collection gets to the heart of the most important issues concerning our global information ecosystem: Will the ‘soft power’ of one, two, or three corporations exert inordinate yet undetectable influence over what we consider important, beautiful, or true? What are the possibilities for resistance? What are the proper avenues for law, policy, and personal choices? This book walks us through these challenges like no other before it.”

Siva Vaidhyanathan, University of Virginia, author of The Googlization of Everything

Nine Theses on Remix Culture

Today my extensive essay on remix culture went online. It touches on many topics that the Digitale Allmend also deals with.

The remix is the cultural form of the network society. In nine theses, Felix Stalder examines the media-historical, technological, political, legal, cultural-theoretical, social and economic dynamics that have shaped the rise and current development of the remix. The conflicts bound up with it reflect the depth of the current social transformation.

Full essay as PDF (600 KB)

Update (Nov. 2009): Here is a pretty extensive summary of the paper in English.

Commons: A rough definition

Last week, I spent a few days at a small but intense workshop where we looked at the political dimensions of various forms of commons. The discussions were open and far-ranging. I tried to distill some of them into a definition of commons that takes its various dimensions into consideration and separates structural from political issues. Far from perfect....


A commons is a resource held as joint property by a community. Thus, it is distinct from private property (held by natural or legal persons) and from public property (held by the state). Typical of commons is that the management of the resource is oriented towards use-value for its members, rather than towards exchange-value within society at large. The separation between producers and consumers is minimized. Thus, commons are also distinct from other forms of collective ownership (such as co-operatives) that produce for the market.

All commons are social institutions: they depend on a community to create and maintain them. A resource that is freely available to all but not managed in a meaningful way by a self-aware community (e.g., the fish of the open sea) is not a commons. As in all communities, questions of membership (boundaries) and internal decision-making are subject to ongoing, more or less conflictual, negotiations.

It is these questions that define the political quality of the commons, which can serve as a defensive mechanism against market encroachment (e.g., in the case of indigenous commons), as a project of exclusion (e.g., in far-right conceptions of the national body) or as the basis of open cooperation (e.g., in the case of Free and Open Source Software).

Les Incohérents

La Mona Lisa fumant une pipe

I'm doing research on the early practices of remixing and came across this gem from the late 19th century by Eugène Bataille, a member of an art group called "Les Incohérents", which, quite frankly, I had never heard of. Yet they did many of the things that the Dadaists and Surrealists would later do, a full generation earlier.

Clay Shirky responds to my review

A few weeks ago, I published a review of Clay Shirky's new book "Here Comes Everybody" on Metamute.

In the meantime, Simon Collister was able to ask Shirky about my review, in which I had criticized him for talking only about non-controversial issues and omitting major questions such as copyright and business models / profiling / privacy.

In his response, Shirky focused only on the question of copyright, claiming, strangely, that while he had written a lot about it in the past, it was not a real issue in the big picture. Here is what he had to say:

local copy (12MB)

Who will profit from EU plans to extend musical copyright?

As Ars Technica reports, the Open Rights Group has published an estimate of how the revenue -- created if musical copyright in the EU were extended by another 45 years -- would be distributed:

Only the labels and the top performers would benefit, while the great majority of musicians would gain little -- less than €30 -- or nothing at all. This makes it abundantly clear that the main argument for the extension of the terms, helping aging musicians, is utterly insincere. Yet another indication that the expansion of copyright does not benefit creators, but amounts to a subsidy for the exploiters. (See also Bernt Hugenholtz on the subject, or, more generally, Martin Kretschmer's work on the IP-based incomes of creative producers.)

Shift from P2P to Video Streaming?

Ars Technica reports on changing traffic patterns: streaming video is rising while P2P traffic, overall, is stagnating, now accounting for only a quarter of total traffic. They conclude:

The shift, as it takes hold around the world, benefits everyone. For content owners, the gain is obvious: the vast majority of high-traffic streaming content is legal and licensed (Dr. Who, Battlestar Galactica, Colbert Report, etc.). This stands in contrast with P2P, of course, and even though user-generated content sites like YouTube still have copyright issues, those issues are "above water" and easy to deal with.

For users, legitimate on-demand access to huge troves of high-quality video removes the risk of lawsuits, but it also has other beneficial effects. For one thing, the P2P blocking/delaying/filtering schemes being trotted out around the world don't affect most of these services. ISPs have gotten away with such blocks using the argument that most P2P is illegal anyway; without that support, it will politically be much harder to block or limit access to legal streaming in the same way.

In a way, this seems to fit a more general Web 2.0 trend of centralizing infrastructure, and thus of enabling big organisations -- media companies -- to reassert control. Very troubling.
