For nearly 150 years, modern copyright law has been meant to govern the treatment of "works of literature, science and art" (§ 1 of the current German Copyright Act, the Urheberrechtsgesetz). The beginnings of the avant-garde subversion of the bourgeois-romantic conceptions of "artist" and "work" that also underlie copyright law reach back roughly 100 years. Through the introduction of the principle of chance, the use of existing cultural artifacts, the emphasis on the role of the subconscious, or direct intervention in social processes, the notion of the autonomous subject creating out of itself was undermined from all sides. The expanding art market nevertheless promoted precisely this image of the artist. Against this background, it is almost astonishing that copyright itself became directly relevant in the field of art only very late. The first major court case took place in 1990-92. The photographer Art Rogers sued the artist Jeff Koons because the latter had a sculpture (A String of Puppies, 1988) fabricated after a postcard by Rogers (Puppies, 1986) and presented it in the exhibition Ushering in Banality. At its core, the question was whether this was a legitimate artistic strategy (one that could invoke the US doctrine of fair use) or an unauthorized adaptation of the work. Koons lost the case in both the first and the second instance and soon found himself confronted with a raft of lawsuits concerning conceptually comparable works.
"Most of the existing research on Wikipedia is at the aggregate level, looking at total number of edits for an article, for example, or how many unique contributors participated in its creation," said Ram, who is a McClelland Professor of MIS in the Eller College.
"What was missing was an explanation for why some articles are of high quality and others are not," she said. "We investigated the relationship between collaboration and data quality."
Wikipedia has an internal quality rating system for entries, with featured articles at the top, followed by A, B, and C-level entries. Ram and Liu randomly collected 400 articles at each quality level and applied a data provenance model they developed in an earlier paper.
"We used data mining techniques and identified various patterns of collaboration based on the provenance or, more specifically, who does what to Wikipedia articles," Ram says. "These collaboration patterns either help increase quality or are detrimental to data quality."
Ram and Liu identified seven specific roles that Wikipedia contributors play.
Starters, for example, create sentences but seldom engage in other actions. Content justifiers create sentences and justify them with resources and links. Copy editors contribute primarily through modifying existing sentences. Some users – the all-round contributors – perform many different functions.
"We then clustered the articles based on these roles and examined the collaboration patterns within each cluster to see what kind of quality resulted," Ram said. "We found that all-round contributors dominated the best-quality entries. In the entries with the lowest quality, starters and casual contributors dominated."
I'm reading: "The Ecstasy of Influence: A Plagiarism" by Jonathan Lethem (Harper's Magazine, Feb. 2007). It is about how influences flow freely in art, and how most artists have no problem admitting this (why should they?). Today's strong claims of copyright are based on what he calls "source hypocrisy" (denial of one's sources, refusal to allow one's work to become the source for someone else). In most cases, the artists themselves are less hypocritical than the corporations, trusts, and foundations administering their work. Lethem also mentions that Bob Dylan never refused permission for a sample.
A large, diverse society cannot survive without property; a large, diverse, and modern society cannot flourish without some form of intellectual property. But it takes little reflection to grasp that there is ample value that the term “property” doesn't capture. And works of art exist simultaneously in two economies, a market economy and a gift economy. The cardinal difference between gift and commodity exchange is that a gift establishes a feeling-bond between two people, whereas the sale of a commodity leaves no necessary connection. (...) But a gift makes a connection. There are many examples: the candy or cigarette offered to a stranger who shares a seat on the plane, the few words that indicate goodwill between passengers on the late-night bus. These tokens establish the simplest bonds of social life, but the model they offer may be extended to the most complicated of unions—marriage, parenthood, mentorship. If a value is placed on these (often essentially unequal) exchanges, they degenerate into something else.
- X-axis: licensing permissiveness (freedom score)
- Y-axis: volume (number of CC-licensed items) per capita
- Bubble size: absolute volume
- Region colors: Orange -> Europe, Blue -> Asia, Green -> South America

More information: http://wiki.creativecommons.org/Metrics
The US is missing.
Source: Andy Updegrove, "A Big Victory for F/OSS: Jacobsen v. Katzer is Settled", Friday, February 19, 2010
A long running case of great significance to the legal underpinnings of free and open source/open source software (F/OSS) has just settled on terms favorable to the F/OSS developer. The settlement follows a recent ruling by a U.S. Federal District Court judge that affirmed several key rights of F/OSS developers under existing law.
Significantly for the F/OSS cause, the court found in favor of Jacobsen (the F/OSS developer) on three key points:
1. The code in question was sufficiently original to be entitled to copyright protection. While not unique to F/OSS code, this was a legal issue on which Jacobsen had to prevail in order to assert claims under copyright law.
2. While the JMRI Project made its code available for free, there was "evidence in the record attributing a monetary value for the actual work performed by the contributors to the JMRI project," thus laying the basis for monetary damages.
3. The removal of the copyright and authorship data contained in the pirated code was a violation of the Digital Millennium Copyright Act, thus providing a basis for suit for that action in violation of the JMRI license.
Point 2 is particularly important.
A range of applications, from predicting the spread of human and electronic viruses to city planning and resource management in mobile communications, depend on our ability to foresee the whereabouts and mobility of individuals, raising a fundamental question: To what degree is human behavior predictable? Here we explore the limits of predictability in human dynamics by studying the mobility patterns of anonymized mobile phone users. By measuring the entropy of each individual’s trajectory, we find a 93% potential predictability in user mobility across the whole user base. Despite the significant differences in the travel patterns, we find a remarkable lack of variability in predictability, which is largely independent of the distance users cover on a regular basis.
Source: Chaoming Song, Zehui Qu, Nicholas Blumm, Albert-László Barabási: Limits of Predictability in Human Mobility, Science, 19 February 2010, Vol. 327, no. 5968, pp. 1018-1021
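The core calculation behind the abstract – measure the entropy of a location sequence, then bound predictability via Fano's inequality – can be sketched as follows. This is a simplified frequency-based entropy, not the temporal-correlation entropy behind the paper's 93% figure; function names and the toy trajectory are my own.

```python
import math

def shannon_entropy(sequence):
    """Frequency-based Shannon entropy (bits) of a location sequence.
    (The paper's headline result uses an entropy that also accounts for
    the order of visits; this is the simpler variant it compares against.)"""
    counts = {}
    for loc in sequence:
        counts[loc] = counts.get(loc, 0) + 1
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def max_predictability(entropy, num_locations):
    """Upper bound on predictability from Fano's inequality:
    solve S = H(p) + (1 - p) * log2(N - 1) for p by bisection
    (the bound is monotone decreasing in p on [1/N, 1])."""
    if num_locations < 2:
        return 1.0
    def fano(p):
        h = -sum(q * math.log2(q) for q in (p, 1 - p) if q > 0)
        return h + (1 - p) * math.log2(num_locations - 1)
    lo, hi = 1.0 / num_locations, 1.0 - 1e-12
    for _ in range(200):
        mid = (lo + hi) / 2
        if fano(mid) > entropy:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# A strictly alternating home/work routine: 1 bit of entropy over
# 2 locations caps predictability at 50% for a frequency-only model,
# even though the order of visits makes it perfectly regular.
trajectory = ["home", "work"] * 12
pi_max = max_predictability(shannon_entropy(trajectory), 2)
```

The gap in this toy example is exactly the paper's point: ignoring temporal order wastes most of the regularity in human movement, which is why the order-aware entropy yields the much higher predictability bound.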
The CRA is a new project from Public Knowledge, with much of the heavy lifting being done by the Cyberlaw Clinic at Stanford and the Technology & Public Policy Clinic at UC-Berkeley. While Berkeley's noted copyright scholar Pam Samuelson works up a new "model statute" for copyright law in the digital age, Public Knowledge hopes to make smaller interim fixes to copyright law that won't require the same dramatic reworking.
This week, it released the first of these reform ideas (PDF) focusing on the principle of fair use. In addition to "criticism" and "news reporting" and the rest of the items in the fair use preamble, the CRA proposes the addition of three more: "incidental uses, non-consumptive uses, and personal, non-commercial uses." They might sound minor, but these suggestions are bound to provoke controversy.
The FBI and other police agencies don't need to obtain a search warrant to learn the locations of Americans' cell phones, the U.S. Department of Justice told a federal appeals court in Philadelphia on Friday.
A Justice Department attorney told the Third Circuit Court of Appeals that there is no constitutional problem with obtaining records from cellular providers that can reveal the approximate locations of handheld and mobile devices. (See CNET's previous article.)
Google’s well-known mission is “to organize the world’s information”. It is, however, impossible to organize the world’s information without an operating model of the world. Melvil(le) Dewey (1851-1931), working at the height of Western colonial power, could simply take the Victorian world view as the basis for a universal classification system, which, for example, put all “religions other than Christianity” into a single category (no. 290). Such a biased classification scheme, for all its ongoing usefulness in libraries, cannot work in the irreducibly multi-cultural world of global communication. In fact, no uniform classification scheme can work, given the impossibility of agreeing on a single cultural framework through which to define the categories. This, in addition to the scaling issues, is the reason why internet directories, as pioneered by Yahoo! and the Open Directory Project (dmoz), broke down after a short-lived period of success.
Search engines side-step this problem by flexibly reorganizing the index in relation to each query and using the self-referential method of link analysis to construct the ranking of the query list (see Katja Mayer in this volume). This ranking is said to be objective, reflecting the actual topology of the network that emerges unplanned through collective action. Knowing this topology, search engines favor link-rich nodes over link-poor outliers. This objectivity is one of the core elements of search engines, since it both scales well and increases the users’ trust in the system.
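The “self-referential method of link analysis” is best known in its PageRank form: a page's rank is derived from the ranks of the pages linking to it. A minimal power-iteration sketch over a toy link graph (not Google's actual implementation; the graph and damping value are illustrative):

```python
def pagerank(links, damping=0.85, iters=100):
    """Minimal PageRank-style power iteration.
    `links` maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # every page gets a base "teleport" share...
        new = {p: (1.0 - damping) / n for p in pages}
        # ...plus a share of the rank of each page linking to it
        for p, outgoing in links.items():
            if outgoing:
                share = damping * rank[p] / len(outgoing)
                for q in outgoing:
                    new[q] += share
            else:  # dangling node: spread its rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# A link-rich node ("hub") ends up ranked above link-poor outliers.
web = {
    "hub": ["a", "b"],
    "a": ["hub"],
    "b": ["hub"],
    "outlier": ["hub"],
}
ranks = pagerank(web)
```

Running this, the hub accumulates by far the highest rank while the outlier, which nothing links to, keeps only its teleport share – exactly the favoring of link-rich nodes over link-poor outliers described above.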
It’s hard to avoid search engines these days. They have established themselves as essential tools for navigating the dynamic, expanding informational landscape that is the Internet. The Internet’s distributed structure doesn’t allow for any centralized index or catalog; consequently, search engines fulﬁll a central task of making digital information accessible and thus usable. It’s hard to imagine our lives without their ever-expanding digital dimension; and so it’s increasingly hard to imagine life without search engines. This book looks at the long history of the struggle to impose order on the always-fragile information universe, addresses key social and political issues raised by today’s search engines, and envisions approaches that would break with the current paradigms they impose.