Presentation @ Digital Existence II: Precarious Media Life

Later this month I will be presenting at Digital Existence II: Precarious Media Life, at the Sigtuna Foundation, Sweden, organised by Amanda Lagerkvist via the DIGMEX Research Network (of which I am a part) and the Nordic Network for the Study of Media and Religion. The abstract for my part of the show:


On the terror of becoming known

Today, we keenly feel the terror of becoming known: of being predicted and determined by data-driven surveillance systems. The webs of significance which sustain us also produce persistent vulnerability to becoming known by things other than ourselves. From the efforts to predict ‘Lone Wolf’ terrorists through comprehensive personal communications surveillance, to pilot programs for calculating insurance premiums by monitoring daily behaviour, the expressed fear is often of misidentification and misunderstanding. Yet the deeper root of this anxiety is not error or falsehood, but a highly entrenched moralisation of knowing. Digital technologies are the newest frontier for the reprisal of old Enlightenment dreams, wherein the subject has a duty to know and technological inventions are an ineluctable force for better knowledge. This nexus demands subjects’ constant vulnerability to producing data and being socially determined by it. In response, subjects turn to what Foucault called illegalisms[1]: forms of complaint, compromise, obfuscation, and other everyday efforts to mitigate the violence of becoming known. The presentation threads this normative argument with two kinds of grounding material: (1) episodes in becoming-known drawn from original research into American state- and self-surveillance, and (2) select works in moral philosophy and technology criticism.[2]


[1] Foucault, M., 2015. The Punitive Society: Lectures at the Collège de France, 1972–1973. B.E. Harcourt, ed. New York: Palgrave Macmillan.

[2] E.g. Jasanoff, S., 2016. The Ethics of Invention: Technology and the Human Future. New York: W.W. Norton & Co.; Vallor, S., 2016. Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting. Oxford: Oxford University Press; Winner, L., 1986. The Whale and the Reactor: A Search for Limits in an Age of High Technology. Chicago: University of Chicago Press.


Lecture @ Copenhagen Business School

I was recently at Copenhagen Business School, courtesy of Mikkel Flyverbom, to discuss the book in progress. An earlier version of the talk, given at MIT, is online in podcast form here.

An excerpt from my notes:

“So we have two episodes, two ongoing episodes. On one hand, you have the state and its technological system, designed for bulk collection at massive scale, and energised by the moral and political injunction towards ‘national security’ – all of this leaked through Edward Snowden. On the other hand, you have the popularisation of self-tracking devices, a fresh addition to the growing forms of constant care and management required of the employable and productive subject, with Silicon Valley as its epicentre.

These are part of a wider penumbra of practices: algorithmic predictions, powered by Bayesian inference and artificial neural nets, corporate data-mining under the moniker of ‘big’ data… now, by no means are they the same thing, or governed by some central force that perpetuates them. But as they pop up around every street corner, there are certain tendencies that start to characterise ‘data-driven’ as a mode of thinking and decision-making.

The tendency I focus on here is the effort to render things known, predictable, calculable – and how pursuing that hunger entails, in fact, many close encounters with uncertainty and the unknown.

Here surveillance is not reducible to questions of security and privacy. It is a scene for ongoing conflicts over what counts as knowledge, who or what gets the authority to declare what you are, what we consider ‘good enough’ evidence to watch people, to change our diet, to arrest them. What we’re seeing is a renewed effort at valorising a certain project of objective knowledge, of factual certainty, of capturing the viscera of life into bits, of producing the right number that tells us what to do.”


Data Epistemologies – Dissertation online

Data Epistemologies: Surveillance and Uncertainty, my dissertation at the University of Pennsylvania, is now online for public access here. It is, in many ways, a working draft for my current book project.

Abstract:

Data Epistemologies studies the changing ways in which ‘knowledge’ is defined, promised, problematised, and legitimated vis-à-vis the advent of digital, ‘big’ data surveillance technologies in early twenty-first century America. As part of the period’s fascination with ‘new’ media and ‘big’ data, such technologies intersect ambitious claims to better knowledge with a problematisation of uncertainty. This entanglement, I argue, results in contextual reconfigurations of what ‘counts’ as knowledge and who (or what) is granted authority to produce it – whether that means proving that indiscriminate domestic surveillance prevents terrorist attacks, or arguing that machinic sensors can know us better than we can ever know ourselves.

The present work focuses on two empirical cases. The first is the ‘Snowden Affair’ (2013-Present): the public controversy unleashed through the leakage of vast quantities of secret material on the electronic surveillance practices of the U.S. government. The second is the ‘Quantified Self’ (2007-Present), a name which describes both an international community of experimenters and the wider industry built up around the use of data-driven surveillance technology for self-tracking every possible aspect of the individual ‘self’. By triangulating media coverage, connoisseur communities, advertising discourse and leaked material, I examine how surveillance technologies were presented for public debate and speculation.

This dissertation is thus a critical diagnosis of the contemporary faith in ‘raw’ data, sensing machines and algorithmic decision-making, and of their public promotion as the next great leap towards objective knowledge. Surveillance is not only a means of totalitarian control or a technology for objective knowledge, but a collective fantasy that seeks to mobilise public support for new epistemic systems. Surveillance, as part of a broader enthusiasm for ‘data-driven’ societies, extends the old modern project whereby the human subject – its habits, its affects, its actions – becomes the ingredient, the raw material, the object, the target, for the production of truths and judgments about it by things other than itself.

[Talk] U. Milano-Bicocca

I will be at the University of Milano-Bicocca next week to give a talk on surveillance, self-tracking and the data-driven life. It will overlap significantly with my presentation last week at the Affect Theory conference; I’ll be posting the full text and slides afterwards. Abstract below.

The Data-Driven Life: The Parameters of Knowing in the Online Surveillance Society

‘Information overload’ is an old cliché, but when it was still fresh, it conveyed a broad and fundamental liquidity in the parameters of our experience. What it meant – and felt like – to remember, to know, was changing. Surveillance today is not simply a question of privacy or governmental power, but a practical extension of such liquidity. Surveillance’s fevered dream of total prediction hinges on its ability to subtend human sensibility – with its forgetfulness, bias, and other problems – to reach ‘raw’ and comprehensive data. This data-hunger, shared by states, corporations and individuals alike, betrays a ‘honeymoon objectivity’. The rise of new technologies for knowledge production is being misconstrued as a discovery of pure and unmediated information. The result is a profound shift in what qualifies as knowledge; who, or what, does the knowing; what decisions and actions are legitimated through that knowledge. Surveillance practices and controversies today host a reparametrisation of what ‘knowing’ entails.

In this talk, I will address two specific cases: the state surveillance of the Snowden Affair, and the self-surveillance of the Quantified Self (QS) movement. I draw on interviews, ethnographic observation and archival research that is part of a larger, ongoing project.

  1. I know we are being watched, Snowden told us so – but I don’t see it, and I don’t feel it. A vast surveillance program withdraws into the recesses of technological systems, denying our capacity to know and experience it. Conventional forms of proof or risk probabilities elude both arguments for and against it. This situation provokes two major patterns of ‘knowing’. First, subjunctivity leverages the recessive unknowability surrounding surveillance as if it were in some way true and certain, producing hypothetical, provisional bases for real, enduring actions and beliefs. Statistical measures of danger become mathematically negligible, yet affectively overwhelming. Second, interpassivity projects others (human and nonhuman) who believe and experience in our stead what we cannot ourselves. Even if the world of surveillance and terror is not real in my back yard, these interpellated others help make it ‘real enough’. Technology’s recession thus provokes an existential dilemma: how do I ‘know’? What is it supposed to feel like to ‘know’?
  2. We cannot stand guard over our judgments without machines to keep us steady. If our knowledge of our own bodies, habits and affects were previously left to unreliable memories and gut feeling, QS promises a data-driven existence where you truly “come into contact with yourself” – through persistent, often wearable, self-surveillance. The Delphic maxim ‘know thyself’ is applied to a very different existential condition, where my lived relationship with technology becomes the authoritative site for an abstracted relationship with my own body. Yet QSers also acknowledge that data is always incomplete, raising new uncertainties and requiring the intervention of subjective judgment. Here, it is technology’s protrusion which forces the question: how will you ‘know’ yourself through a digitality that subtends your memory and intention?