felix's blog

Open Source License Upheld in Court.

Source: Andy Updegrove, “A Big Victory for F/OSS: Jacobsen v. Katzer is Settled”, February 19, 2010

A long-running case of great significance to the legal underpinnings of free and open source software (F/OSS) has just settled on terms favorable to the F/OSS developer. The settlement follows a recent ruling by a U.S. Federal District Court judge that affirmed several key rights of F/OSS developers under existing law.
<..>
Significantly for the F/OSS cause, the court found in favor of Jacobsen (the F/OSS developer) on three key points:

1.  The code in question was sufficiently original to be entitled to copyright protection.  While not unique to F/OSS code, this was a legal issue on which Jacobsen had to prevail in order to assert claims under copyright law.

2.  While the JMRI Project made its code available for free, there was "evidence in the record attributing a monetary value for the actual work performed by the contributors to the JMRI project," thus laying the basis for monetary damages.

3.  The removal of the copyright and authorship data contained in the pirated code was a violation of the Digital Millennium Copyright Act, thus providing a basis for suit over that action, which also violated the JMRI license.

Point 2 is particularly important: even though F/OSS code is distributed at no charge, the court accepted that the contributors' work has a monetary value, so violating a F/OSS license can expose the violator to claims for real damages.

Predicting Human Mobility through phone tracking

A range of applications, from predicting the spread of human and electronic viruses to city planning and resource management in mobile communications, depend on our ability to foresee the whereabouts and mobility of individuals, raising a fundamental question: To what degree is human behavior predictable? Here we explore the limits of predictability in human dynamics by studying the mobility patterns of anonymized mobile phone users. By measuring the entropy of each individual’s trajectory, we find a 93% potential predictability in user mobility across the whole user base. Despite the significant differences in the travel patterns, we find a remarkable lack of variability in predictability, which is largely independent of the distance users cover on a regular basis.

Source: Chaoming Song, Zehui Qu, Nicholas Blumm, and Albert-László Barabási, “Limits of Predictability in Human Mobility”, Science, 19 February 2010, Vol. 327, No. 5968, pp. 1018-1021
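Worth noting: the 93% figure is not the accuracy of an actual predictor; it is an upper bound derived from the entropy of each user's location sequence via Fano's inequality. The Python sketch below mimics that pipeline on toy data: a Lempel-Ziv-style entropy estimate of a symbolic trajectory, then a numeric inversion of Fano's inequality. The toy trajectory, the simplified estimator, and the bisection step count are my assumptions, not the paper's actual code.

```python
import math

def _contains(prefix, pattern):
    """Return True if `pattern` occurs as a contiguous run in `prefix`."""
    p = len(pattern)
    return any(prefix[j:j + p] == pattern for j in range(len(prefix) - p + 1))

def lz_entropy(seq):
    """Lempel-Ziv-style estimate of the entropy rate (bits per step) of a
    symbolic trajectory: for each position i, find the length of the
    shortest substring starting at i that never appeared earlier."""
    n = len(seq)
    lambdas = []
    for i in range(n):
        k = 1
        while i + k <= n and _contains(seq[:i], seq[i:i + k]):
            k += 1
        lambdas.append(k)
    return n * math.log2(n) / sum(lambdas)

def max_predictability(S, N):
    """Invert Fano's inequality, S = H(p) + (1 - p) * log2(N - 1), for the
    upper bound p on predictability. The right-hand side is monotonically
    decreasing on [1/N, 1), so bisection suffices. Assumes N >= 2."""
    def fano(p):
        h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
        return h + (1 - p) * math.log2(N - 1)
    lo, hi = 1.0 / N, 1.0 - 1e-12
    for _ in range(80):
        mid = (lo + hi) / 2
        if fano(mid) > S:
            lo = mid  # entropy still above S: predictability must be higher
        else:
            hi = mid
    return (lo + hi) / 2

# Toy data: hourly cell-tower IDs for one (very regular) user.
trajectory = list("AABABAACAABAABAB")
S = lz_entropy(trajectory)
N = len(set(trajectory))
print(f"entropy ~ {S:.2f} bits/step, "
      f"max predictability ~ {max_predictability(S, N):.1%}")
```

The striking result of the paper is that this bound hovers around 93% for almost everyone in the sample, regardless of how far they travel.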

Copyright Reform Act: New definition of fair use

Ars Technica writes:

The CRA is a new project from Public Knowledge, with much of the heavy lifting being done by the Cyberlaw Clinic at Stanford and the Technology & Public Policy Clinic at UC-Berkeley. While Berkeley's noted copyright scholar Pam Samuelson works up a new "model statute" for copyright law in the digital age, Public Knowledge hopes to make smaller interim fixes to copyright law that won't require the same dramatic reworking.

This week, it released the first of these reform ideas (PDF) focusing on the principle of fair use. In addition to "criticism" and "news reporting" and the rest of the items in the fair use preamble, the CRA proposes the addition of three more: "incidental uses, non-consumptive uses, and personal, non-commercial uses." They might sound minor, but these suggestions are bound to provoke controversy.

US Justice Dept. defends warrantless cell phone tracking

The FBI and other police agencies don't need to obtain a search warrant to learn the locations of Americans' cell phones, the U.S. Department of Justice told a federal appeals court in Philadelphia on Friday.

A Justice Department attorney told the Third Circuit Court of Appeals that there is no constitutional problem with obtaining records from cellular providers that can reveal the approximate locations of handheld and mobile devices.

Source: CNET

The Second Index. Search Engines, Personalization and Surveillance (Deep Search)

Introduction

Google’s well-known mission is “to organize the world’s information”. It is, however, impossible to organize the world’s information without an operating model of the world. Melvil(le) Dewey (1851-1931), working at the height of Western colonial power, could simply take the Victorian world view as the basis for a universal classification system, which, for example, put all “religions other than Christianity” into a single category (no. 290). Such a biased classification scheme, for all its ongoing usefulness in libraries, cannot work in the irreducibly multi-cultural world of global communication. In fact, no uniform classification scheme can work, given the impossibility of agreeing on a single cultural framework through which to define the categories. This, in addition to the scaling issues, is the reason why internet directories, as pioneered by Yahoo! and the Open Directory Project (dmoz), broke down after a short-lived period of success.

Search engines side-step this problem by flexibly reorganizing the index in relation to each query and using the self-referential method of link analysis to construct the ranking of the query list (see Katja Mayer in this volume). This ranking is said to be objective, reflecting the actual topology of the network that emerges unplanned through collective action. Knowing this topology, search engines favor link-rich nodes over link-poor outliers. This objectivity is one of the core elements of search engines, since it both scales well and increases the users’ trust in the system.
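The canonical instance of such self-referential link analysis is PageRank, where a page's score is recursively defined by the scores of the pages linking to it. A minimal power-iteration sketch (the damping factor and the four-node toy graph are illustrative assumptions; production ranking systems differ greatly):

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each node to the list of nodes it links to.
    A node's score depends on the scores of the nodes linking to it,
    which is what makes the ranking "self-referential"."""
    nodes = list(links)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for n, outs in links.items():
            if not outs:  # dangling node: spread its score evenly
                for m in nodes:
                    new[m] += damping * rank[n] / len(nodes)
            else:
                for m in outs:
                    new[m] += damping * rank[n] / len(outs)
        rank = new
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
for page, score in sorted(pagerank(web).items(), key=lambda x: -x[1]):
    print(page, round(score, 3))
```

On this toy graph the link-rich node "c" comes out on top, while "d", which nobody links to, lands at the bottom: exactly the preference for link-rich nodes over link-poor outliers described above.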

Deep Search: Introduction

Introduction

It’s hard to avoid search engines these days. They have established themselves as essential tools for navigating the dynamic, expanding informational landscape that is the Internet. The Internet’s distributed structure doesn’t allow for any centralized index or catalog; consequently, search engines fulfill a central task of making digital information accessible and thus usable. It’s hard to imagine our lives without their ever-expanding digital dimension; and so it’s increasingly hard to imagine life without search engines. This book looks at the long history of the struggle to impose order on the always-fragile information universe, addresses key social and political issues raised by today’s search engines, and envisions approaches that would break with the current paradigms they impose.

Nachahmung, Transformation und Autorfunktion (Imitation, Transformation, and the Author Function)

This text was published in: Kroeger, Odin, Günther Friesinger, Paul Lohberger, Eberhard Ortland, and Thomas Ballhausen, eds. Geistiges Eigentum und Originalität: Zur Politik der Wissens- und Kulturproduktion. Vienna: Turia + Kant, 2011, pp. 19-32.

Abstract: Employing concepts developed by Gabriel Tarde and Bruno Latour, this article investigates how a new function of the author is being defined in digital media. What emerges is a practical alternative to the dichotomy between notions of possessive individualism (underlying copyright law) and simplified notions of the death of the author. Here, authorship functions less as a means to establish rigid ownership and control, and more as a dynamic system of accountability and reputation-building capable of structuring open, collaborative processes.

Introduction

Forty years ago now, Roland Barthes proclaimed the “death of the author” (cf. Barthes, 1969/2000). What was at the time a necessary departure from a form of author-centered literary criticism that had already been rendered obsolete by Anglo-American “new criticism”, but was still dominant in France, soon degenerated into a cliché and thus a dead end. Faced with the choice between a conventional form of authorship (the Cartesian ego translated into bourgeois subjectivity and underpinned by copyright law) and a diffuse authorship (dissolved into the “murmur of discourse”, as Michel Foucault put it (cf. 1972/1991)), practice stubbornly favored the former. The culture industry and the art market contributed to this as well: their ideological core is an inflated notion of authorship, which, out of understandable self-interest, they continue to propagate with great insistence to this day. Theoretical deconstruction, on the other hand, rarely went beyond noting the decentering, distribution, or dispersal of authorship. Moreover, the practice of cultural production that brings forth works, and with them authors, was for a long time altered only at the margins. Collage, assemblage, and appropriation as artistic methods were each quickly assimilated. Explicitly anti-authorial practices remained politically and culturally marginal (cf. Cramer, 2008), and often remained caught up in the contradictory romantic idea that the creator's withdrawal heightens the sublimity of the work (and with it, indirectly but all the more effectively, the symbolic capital of the author).

Deep Search: The Politics of Search Beyond Google

I'm extremely proud that our book on the politics of search engines is out now. I know I'm totally biased, but I really think this is a great collection of essays. And it's available in English as well as in German. Two separate books, each containing original and translated texts.

Konrad Becker / Felix Stalder (eds.), Deep Search: The Politics of Search Beyond Google. Studienverlag, distributed by Transaction Publishers, New Jersey, 2009. 214 pp., ISBN 978-3706547956

“This collection gets to the heart of the most important issues concerning our global information ecosystem: Will the ‘soft power’ of one, two, or three corporations exert inordinate yet undetectable influence over what we consider important, beautiful, or true? What are the possibilities for resistance? What are the proper avenues for law, policy, and personal choices? This book walks us through these challenges like no other before it.”

Siva Vaidhyanathan, University of Virginia, author of The Googlization of Everything

Google vs Mozilla

The Register has an interesting article on the growing tensions between Google and Mozilla. It highlights the dangers of monopoly and the fundamental differences between non-profit and for-profit corporations and their outlook on the world.

"I look at Google and I don't see a lot of alignment with the big picture of the internet," says Asa Dotzler, the ten-year Mozilla vet who was among the team of three or four who founded the Firefox project back in 2002.

"Google is essentially an advertising company. That's where they make their money. They provide a wonderful service - primarily their search service - but it serves their advertising goals. It serves their revenue goals. The more they can know about their users, the more effective they believe they can advertise, the more money they believe they can make. That is most fundamental."

End of the Pirate Bay tracker, but not of the Pirate Bay

Ars Technica writes:

The Pirate Bay announced today that it was killing its popular BitTorrent tracker, relying instead on the truly decentralized DHT protocol to handle file location and downloading. For TPB, it's all about removing centralized points of control.

See also TorrentFreak's take on the story and its explanation of the decentralized technology (DHT, magnet links) replacing the central tracker.
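For the curious, the two pieces doing the tracker's old job can be sketched in a few lines: a magnet link carries little more than the torrent's infohash, and a Kademlia-style DHT locates peers by repeatedly asking the nodes whose IDs are closest to that infohash under the XOR metric. The magnet URI, node IDs, and single-hop "lookup" below are made-up illustrations of the idea, not a working BitTorrent client:

```python
from urllib.parse import urlparse, parse_qs

def infohash_from_magnet(magnet):
    """Extract the 40-hex-digit BitTorrent infohash ("btih") that a
    magnet link uses in place of a tracker-hosted .torrent file."""
    params = parse_qs(urlparse(magnet).query)
    for xt in params.get("xt", []):
        if xt.startswith("urn:btih:"):
            return xt[len("urn:btih:"):]
    raise ValueError("no btih infohash in magnet link")

def xor_distance(a_hex, b_hex):
    """Kademlia distance between two 160-bit IDs: DHT nodes store peer
    lists for infohashes 'close' to their own ID in this metric."""
    return int(a_hex, 16) ^ int(b_hex, 16)

magnet = ("magnet:?xt=urn:btih:"
          "c12fe1c06bba254a9dc9f519b335aa7c1367a88a"
          "&dn=example")
target = infohash_from_magnet(magnet)

# With no tracker, a client asks whichever DHT nodes it already knows,
# each hop moving to nodes whose IDs are closer to the infohash.
nodes = {"node-1": "a94a8fe5ccb19ba61c4c0873d391e987982fbbd3",
         "node-2": "c12fe1c06bba254a9dc9f519b335aa7c1367a880"}
closest = min(nodes, key=lambda n: xor_distance(nodes[n], target))
print("query", closest, "for peers on", target)
```

The point of the XOR metric is that "closeness" is consistent across the whole network, so any client can converge on the small set of nodes responsible for a given infohash without anyone coordinating centrally, which is precisely the "removing centralized points of control" TPB is after.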