From inter-subjectivity to multi-subjectivity: Knowledge claims and the digital condition

Abstract: One of the consequences of digitization is a deepening crisis of epistemology, caused by the proliferation of social, biological and machinic actors that overwhelm established methods of generating and organizing knowledge. Machine-driven analysis of large data sets is introducing a new way of doing science. In doing so, it responds to this crisis while, at the same time, deepening it. Continuing to claim ‘scientific objectivity’ is becoming ever less tenable and, in practice, is likely to serve as a way to abdicate responsibility for the actual research and its consequences. Rather, we should seek to highlight the positionality and partiality of any claim, also and in particular in data science, thus rendering more obvious the need to combine competing claims into an understanding of the world that is not so much inter-subjective as multi-subjective.

Keywords: epistemology, digitality, data science, reproducibility crisis, multi-subjectivity

One of the consequences of digitization is a deepening crisis of epistemology, caused by the proliferation of social, biological and machinic actors that overwhelm established methods of generating and organizing knowledge (Stalder 2018). And, since there is a close relationship between epistemology and politics, between ways of knowing and ways of managing the world, we are also in a deep political crisis. This manifests itself not least in a populist rejection of ‘science’ and ‘facts’ (Manjoo 2008). This crisis of the established – let’s call it modern-liberal – epistemic-political order has opened a space for a new one to emerge, one which doesn’t yet have a name, even if its outlines are already visible.

The epistemology of the modern-liberal era

The basic structure of the epistemic-political order that created the modern era in the West was established in the mid-17th century. Not only did the Peace of Westphalia in 1648 define the secular nation-state as the pinnacle of power and ultimate sovereign, but the Royal Society in London, founded in 1660, established a new mode of asserting matters of fact. Basically, matters of fact were henceforth to be asserted through the observation of independent individuals, organized as communities of peers. These communities were bounded in two respects. First, the domain of knowledge in which the peers could assert facts with authority was limited to what would later be called a scientific discipline; over time, these boundaries grew ever narrower as the number of disciplines increased. Second, the community was bounded by an agreement on the methods of knowing; these methods would define the other dimension of ‘discipline’ (Shapin and Schaffer 1985).

The first boundary not only led to the establishment of different scientific disciplines but also to a separation of powers, so to speak, between science, politics, and religion, each with its own internal segmentation but, above all, separated from the others. The second boundary, the agreement on methods rather than on outcomes, constructed science not only as an open-ended enterprise capable of revising its own paradigms (Kuhn 1962), but also demanded of its practitioners that they have no interest in specific outcomes and instead accept whatever the method yielded. And the results were to be accepted if, and only if, other members of the community shared the same observation. This was made easier, or perhaps even possible at all, by the aforementioned separation of domains. The knowledge thus produced concerned the ‘other’, that is, nature, and it was possible to be disinterested towards this ‘other’. Thus, it became possible that, say, a Jewish Marxist chemist could easily reach consensus with a Christian conservative chemist, as far as chemistry was concerned.

Along with the methods, a new place for the observation of nature was created: the lab. The main advantage of the lab was that it was a controllable environment, that is, one in which it was possible to reduce complexity and isolate a limited number of relationships to be manipulated and observed in a reproducible manner. The fact that the natural environment outside the lab was far more complex was acknowledged through the formula of ceteris paribus, the assumption that, while a set of elements was being manipulated, all other things remained equal.

Thus, modern scientific practice has been based on the principles of ‘inter-subjectivity’ (the position of the observer played no role in the observation), ‘distance’ (in the double sense that the observer was disconnected from the observation and that the object of the observation, nature, was the ‘other’), ‘disinterestedness’ (the results of the observation did not directly affect the observer) and ‘reduction’ (theory-driven separation of important from unimportant variables).

Limits of the modern epistemology

For a long time now, all of these principles have come under sustained critique (Lyotard 1984). Second-order cybernetics showed that the observer is part of the system that he/she/it is observing (Von Foerster 2003). The increased scope and complexity of the problems made the idea of the ‘view from nowhere’ highly questionable, if not impossible. But if the observer is inside the problem, then the position of the observer becomes crucial, and the resulting observation cannot be complete but is necessarily partial and needs to be complemented by other partial observations. In cultural studies, this is called ‘positionality’, meaning that any statement, even a statement of fact, is related to a position from which this statement is made (Hall 1990, 18). If the observer is inside the problem, then the problem domain can no longer be constructed as the ‘other’. This means that there can be no disinterested description; matters of fact become, as Bruno Latour put it, “matters of concern” (Latour 2004). If we take climate science as an example, then every statement about the climate is also a statement about the society that is now understood as producing this climate. Hence every description becomes a prescription. Thus, the most urgent question turns from how an external object really ‘is’ to how we can, have to, or want to relate to it, and, in this relation, how we transform the very thing we are observing. Thus, the principle of knowledge moves from independent truth to dependent utility, which immediately raises the question: useful for whom?

Last, but not least, with rising dynamism and complexity, that is, a sharp increase in the number of actors interacting with one another and in the ways this interaction can unfold, the question becomes ever more critical: which variables are the important ones, and which are the ones that can be ignored? The effects of this increasing difficulty of distinguishing between variables to include and variables to exclude are, on the one hand, a crisis of replicability that seems to be plaguing the sciences and, on the other hand, the mounting costs of ignored variables reasserting themselves in things like climate change.

None of this is new. Second-order cybernetics dates from the early 1970s, Lyotard’s observation of the transformation of science from seeking truth to creating utility from 1980, and Latour’s fundamental critique of the modern epistemology that separates society from nature from 1988.

Deepening the crisis and going beyond it

But what is new, then? Machine-driven analysis of large data sets is introducing a new way of doing science. In doing so, it responds to this crisis while, at the same time, deepening it. This is the case even when it works according to its own program (I will ignore here practical issues such as the quality of data, issues of modeling and so on). For example, the claim to be able to process large quantities of unstructured data can be seen as a way of avoiding the problem of reductionism. Rather than relying on a sample of questionable representativeness, or on a controlled laboratory environment, or on theory-driven hypotheses, the approach (at least in its ideal form) is to take in all data without any prior separation of important from unimportant aspects of the problem. This separation is now done through machine learning, and the fewer assumptions that go into the process, the higher the chance of finding something new. Yet the opacity and complexity of the tools of analysis re-introduce problems of replicability with a vengeance, because the problem of reductionism has turned into a more fundamental problem of method, the very core of science itself. By focusing on ‘relations that work’ (while continuously adapting the question until it yields a statistically significant answer) and on utilitarian effects (accurately predicting the short term), rather than on fundamental causation, machine-driven analysis dispenses with the notion of a disinterested search for an external truth and fully concentrates on relationships that can be manipulated for pre-determined ends. But since the actor who does the analysis – most clearly in the case of social media companies – is a core element of the situation he/she/it is analyzing, and is thus inside the problem rather than outside of it, the results of the analysis can immediately be fed back into the situation, changing its composition or dynamics. From the point of view of the company paying for the research, this is not a bug but a feature.

In some ways, this is an old problem of the social sciences, now on steroids. Max Weber already argued that what distinguishes social science from other forms of research is that the ideas people have about society, in part derived from social science, affect the dynamics of society. Noortje Marres (2017) sees this kind of ‘interactivity’ as one of the core elements of the new field of digital sociology. The problem seems to plague ever more sciences because of the aforementioned breakdown of the separation between the scientific process and the object of analysis. Machine-driven analysis takes this as a starting point, accelerating the process by feeding its results back into the ‘object’, and claims to overcome it by reducing the temporal scope of analysis, making it, in effect, a continuous process rather than a one-time event.
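To make this feedback structure concrete, here is a minimal, purely illustrative sketch in Python. All names (Population, observe, predict, intervene) are hypothetical and stand in for no particular company's system; the sketch only shows, under these toy assumptions, how a continuous cycle of observation, prediction and intervention folds the analysis back into the very situation it analyzes.

```python
# A minimal, illustrative sketch of a continuous analyse-and-intervene loop.
# All names (Population, observe, predict, intervene) are hypothetical; the
# point is only to show how predictions fed back into the "object" change
# what is subsequently observed.

import random


class Population:
    """A toy 'object' of analysis: actors with a preference between 0 and 1."""

    def __init__(self, size=1000):
        self.preferences = [random.random() for _ in range(size)]

    def observe(self):
        # The analyst's data: a snapshot of current behaviour.
        return list(self.preferences)

    def intervene(self, recommendation, strength=0.1):
        # Feeding the result back: recommendations nudge behaviour
        # towards what the model predicted, altering the object itself.
        self.preferences = [
            p + strength * (recommendation - p) for p in self.preferences
        ]


def predict(data):
    # A deliberately simple "model": predict the mean preference.
    return sum(data) / len(data)


population = Population()
for step in range(5):
    data = population.observe()        # analysis of the current state
    prediction = predict(data)         # short-term, utilitarian prediction
    population.intervene(prediction)   # result fed back into the situation
    spread = max(data) - min(data)
    print(f"step {step}: prediction={prediction:.3f}, spread={spread:.3f}")

# The spread shrinks each round: the analysis does not merely describe the
# population, it progressively produces the regularity it predicts.
```

The point of the sketch is not the toy model but the loop itself: each round of ‘analysis’ narrows the behaviour it observes, so the prediction appears ever more accurate precisely because it has been fed back into the object.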

Acknowledging utility, positionality and partiality

This suggests to me that it might be more productive to think of machine-driven ‘data science’ as a new mode of knowing, one that breaks with the fundamentals of the scientific method that defined the modern-liberal era. This need not be a bad thing, because modern science produced not just knowledge but also, as Ulrich Beck (1992) observed, a lot of risk. Thus we need new methods that can deal with the dynamism and complexity of problems that we are not just facing but in which we are also, in ever more complex ways, implicated. There is a need to find new ways to make scientific facts transparent and democratically accountable. Rather than trying to defend traditional ideals of science – disinterestedness, distance, inter-subjectivity – we should acknowledge that science is ever more interested. This is not to advocate an ‘anything goes’ attitude, a superficial relativism or a post-modern claim about the constructedness of science, but it might be a first step towards developing tools and methods to account for the necessary positionality of any knowledge claim that concerns complex, dynamic systems in which the observer is directly implicated.

This is all the more urgent for political reasons. The number of actors who have access to very large data sets is sharply limited. In effect, nobody can do research on social media data the way Facebook can. And here it is obvious that this research is interested and a source of social power. In such a context, claims of ‘scientific objectivity’ are likely to serve as a way to abdicate responsibility for the research and its consequences. To highlight the positionality and partiality of any claim, also and in particular in data science, would render more obvious the need to combine competing claims into an understanding of the world that is not so much inter-subjective as multi-subjective. Each of these claims, in order to be understood as science, needs to be rigorous, fact-based and transparent to others, but none of them can claim to be disinterested or separated from outcomes.

References:

Beck, Ulrich. 1992. Risk Society: Towards a New Modernity. Theory, Culture & Society. London; Newbury Park, CA: Sage Publications.

Hall, Stuart. 1990. “Cultural Identity and Diaspora.” In Identity: Community, Culture, Difference, edited by Jonathan Rutherford. London: Lawrence & Wishart.

Kuhn, Thomas. 1962. The Structure of Scientific Revolutions. Chicago: University of Chicago Press.

Latour, Bruno. 2004. “Why Has Critique Run out of Steam? From Matters of Fact to Matters of Concern.” Critical Inquiry 30 (2): 225–48.

Lyotard, Jean-François. 1984. The Postmodern Condition: A Report on Knowledge. Minneapolis: University of Minnesota Press.

Manjoo, Farhad. 2008. True Enough: Learning to Live in a Post-Fact Society. Hoboken, NJ: Wiley.

Marres, Noortje. 2017. Digital Sociology: The Reinvention of Social Research. Malden, MA: Polity.

Shapin, Steven, and Simon Schaffer. 1985. Leviathan and the Air-Pump: Hobbes, Boyle and the Experimental Life. Princeton, NJ: Princeton University Press.

Stalder, Felix. 2018. The Digital Condition. Cambridge; Medford, MA: Polity Press.

Von Foerster, Heinz. 2003. “Cybernetics of Cybernetics.” In Understanding Understanding: Essays on Cybernetics and Cognition, 283–86. New York: Springer.

Citation: Stalder, F. (2018). From inter-subjectivity to multi-subjectivity: Knowledge claims and the digital condition. In E. Bayamlioglu, I. Baraliuc, L. Janssens, & M. Hildebrandt (Eds.), BEING PROFILED: COGITAS ERGO SUM (pp. 98–101). Amsterdam University Press. https://doi.org/10.5117/9789463722124

License: This text is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND 4.0). To view a copy of this license, visit https://creativecommons.org/licenses/by-nc-nd/4.0/