21 September – 12 October 2019
This exhibition brings together 16 practices through which artists articulate their own forms of (digital) commons. From online archives to digital tools/infrastructure and educational formats, the projects envision a (post-)digital culture in which notions of collaboration, free access to knowledge, sustainable use of shared resources and data privacy are central.
For the exhibition, each artist has developed a SCORE relating to their practice. A SCORE can take different forms: a general instruction, a working instruction, a performance instruction or an operating instruction. In any case, it is meant to lead to the realization of an intended action and as such serves as an interface between a human actor and an object, material or machine. A SCORE can also be read as a technical HOWTO document, in that it contains information on how to perform a specific task.
This is my contribution to the catalogue for the exhibition "Entangled Realities – Living with Artificial Intelligence", on view at HEK, Basel, 09.05.2019 - 11.08.2019.
In day-to-day life, most technologies are black boxes to me.1 I don’t really know how they work, yet I have a reliable sense of the relationship between the input, say pressing a button, and the output, the elevator arriving. What happens in between, whether simple local circuitry or a far-away data centre is involved, I don’t know and I don’t care. Treating complex systems as black boxes is a way of reducing complexity, and this is often a very sensible thing to do. However, not all black boxes are equally black, and the depth of the blackness matters quite significantly, not least in terms of the power relations produced through the technology. The application of artificial intelligence has a tendency to produce particularly dark shades of black. In order to find ways of dealing with these applications so that they do not undermine democracy, it is important to differentiate between technical and social shades of blackness, and to prevent these applications from contributing further to an already high concentration of power in the hands of a few technology firms. Art, with its unique ability to create new aesthetics, languages and imaginations, can play an important role in this battle.
Abstract: One of the consequences of digitization is a deepening crisis of epistemology, caused by the proliferation of social, biological and machinic actors that overwhelm established methods of generating and organizing knowledge. Machine-driven analysis of large data sets is introducing a new way of doing science. In this, it answers this crisis while, at the same time, deepening it. Claims to ‘scientific objectivity’ are becoming ever harder to sustain and, in practice, are likely to serve as a way of abdicating responsibility for the actual research and its consequences. Rather, we should seek to highlight the positionality and partiality of any claim, also and in particular in data science, thus rendering more obvious the need to combine competing claims into an understanding of the world that is not so much inter- as multi-subjective.
Keywords: epistemology, digitality, data science, reproducibility crisis, multi-subjectivity
One of the consequences of digitization is a deepening crisis of epistemology, caused by the proliferation of social, biological and machinic actors that overwhelm established methods of generating and organizing knowledge (Stalder 2018). And, since there is a close relationship between epistemology and politics, between ways of knowing and ways of managing the world, we are also in a deep political crisis. This manifests itself not least in a populist rejection of ‘science’ and ‘facts’ (Manjoo 2008). This crisis of the established – let’s call it modern-liberal – epistemic-political order has created a space for the establishment of a new one, which doesn’t yet have a name, even if its outlines are already visible.
15 years ago, Mark Zuckerberg launched Facebook, then still called thefacebook, as a network for students at Harvard University. Today, almost 2.7 billion people use its services. And for 15 years he has been insisting, mantra-like, that "connecting" and "sharing" make the world a better place and that Facebook stands for the epochal transition from oppressive hierarchical bureaucracies to liberating horizontal networks.
Today, he is pretty much alone with that claim. On the one hand, Facebook Inc. has itself grown into an overpowering, opaque corporation that has acquired 72 companies to date, including Instagram (2012), WhatsApp (2014), and the virtual reality developer Oculus VR (2014). Moreover, the ownership structure is such that Zuckerberg can exercise almost unlimited power. On the other hand, Facebook is accused of facilitating the dissemination of false or manipulative information and thus of contributing to the division of societies and the intensification of conflicts, for example in Great Britain, Sri Lanka, the USA, and Myanmar.
How could a harmless idea - people should be able to communicate easily and quickly with their friends and acquaintances - unfold such destructive force? The answer lies less in the idea of horizontal communication itself, or in digital media in general, than in the specific way Facebook implements this idea.
In this talk, I try to make sense of our current techno-cultural moment, in which reality is disappearing and reappearing in confusing ways. I point out two major developments relevant here. First, as capitalism has expanded, it has created a system of communication that cannot, and does not want to, deal with meaning. And, second, as the complexity of the socio-techno-biological connections of society increases, the old ways of organizing knowledge have become increasingly unable to provide an accurate understanding of the world we are living in. In their place, a new type of knowledge regime is emerging that produces its own form of meaninglessness.
I use the analysis of the crisis of meaning as a starting point to also think about some of the "counter-measures" that might be appropriate to overcome this crisis.
Following this was a great presentation by Kate Crawford and Trevor Paglen on how machines are trained to "see" and how this creates what they call "predator vision". You can see their talk (starting at 00:33:00) and the discussion (starting at 01:05:00) on HKW's Facebook page.
This text was written as a contribution to the workshop Public in the Making, 18-20 October 2018, İstanbul, part of the project Trans making: art, culture and economy to democratize society. A nicely laid-out PDF is also available. Thanks to Fatih Aydoğdu for the invitation.
Democracy, even in its most rudimentary definition, contains two elements.1 The first is that of public deliberation and contestation of the issues affecting the “demos” (the people) as a collectivity. The precondition here is the availability of an easily accessible, shared space in which different opinions and attitudes can be expressed, compared and peacefully fought over, as a way for the members of the collectivity (the “citizens”) to form their opinions and plot out diverging futures. The second element allows those opinions and interests to be expressed in a way that leads to a decision regarding the future that is binding for, and accepted by, all. Usually, this is done by voting either “yes” or “no” on a specified proposal for action, or by selecting representatives from a group of pre-selected candidates, often, but not necessarily, organized as political parties that stand for competing visions of the future. If the first element is weakened, voting is transformed into a ritual of submission and propaganda in the machinery of dictatorship; if the second is reduced, democracy turns into post-democracy, where issues are debated but decision-making is outsourced to “experts” or “the market” (Crouch 2004).
Katja Mayer and Felix Stalder at PrivacyWeek #3
Data sovereignty is often equated with data protection, yet the concept opens up manifold possibilities for reorganizing how data is handled. Particularly interesting are approaches that create new institutions for using data collectively and thus strengthen a communal form of sovereignty. The sociologist Katja Mayer and the culture and media scholar Felix Stalder discuss the topic using examples from the fields of open science and health as well as urban technology projects.