Presentation @ 4S

I will be at 4S [Society for Social Studies of Science] in Boston this week, in the ‘Surveillance and Security’ panel: Thursday 31 August, 4.00pm, Sheraton Boston Flr 3 Kent.

Recessive Objects: Surveillance and the (Dis)appearance of Fact

Recessive objects are things which promise to extend our knowledge, but thereby publicise the very uncertainty threatening that knowing. These archives, statistical figures and black boxes mobilise our enduring faith in nonhuman objectivity and technological progress, imposing a sense of calculability and predictability. Yet far from extinguishing uncertainty, they give material presence to the absent, the secret, the unknowable – especially the widening gap between human and machinic sensibility. Recessive objects address longstanding questions about the social production of what counts as objective fact, and about what kinds of virtues are invested in ‘new’ technologies. They emphasise the practical junctures where public imagination, material artefacts and the operational logic of new technologies intersect.

I discuss two recessive objects featuring centrally in America’s recent encounters with surveillance technologies: (1) the Snowden Files, an indefinite archive of state secrets leaking profusely since 2013; (2) the latest generation of self-tracking devices. What does it mean to know about a vast state surveillance system, even as it operates almost entirely removed from individuals’ sensory experience? How can the public make its judgment when proof of surveillance’s efficacy is itself classified? What kind of ‘self-knowledge’ is it when we learn about our bodies through machines that track us in ways our senses cannot follow – and claim to ‘know you better than you know yourself’?

Criticising Surveillance and Surveillance Critique

New article now available on open access @ Surveillance & Society.

Abstract:

The current debate on surveillance, both academic and public, is constantly tempted towards a ‘negative’ criticism of present surveillance systems. In contrast, a ‘positive’ critique would be one which seeks to present alternative ways of thinking, evaluating, and even undertaking surveillance. Surveillance discourse today propagates a host of normative claims about what is admissible as true, probable, efficient – based upon which it cannot fail to justify its own expansion. A positive critique questions and subverts this epistemological foundation. It argues that surveillance must be held accountable in terms other than those of its own making. The objective is an open debate not only about ‘surveillance or not’, but also about the possibility of ‘another surveillance’.

To demonstrate the necessity of this shift, I first examine two existing frames of criticism. Privacy and humanism (appeals to human rights, freedoms and decency) are necessary but insufficient tools for positive critique. They implicitly accept surveillance’s bargain of trade-offs: the benefit of security measured against the cost of rights. To demonstrate paths towards positive critique, I analyse risk and security: two load-bearing concepts that hold up existing rationalisations of surveillance. They are the ‘openings’ for reforming those evaluative paradigms and rigged bargains on offer today.

Data Epistemologies – Dissertation online

Data Epistemologies: Surveillance and Uncertainty, my dissertation at the University of Pennsylvania, is now online for public access here. It is, in many ways, a working draft for my current book project.

Abstract:

Data Epistemologies studies the changing ways in which ‘knowledge’ is defined, promised, problematised, legitimated vis-à-vis the advent of digital, ‘big’ data surveillance technologies in early twenty-first-century America. As part of the period’s fascination with ‘new’ media and ‘big’ data, such technologies intersect ambitious claims to better knowledge with a problematisation of uncertainty. This entanglement, I argue, results in contextual reconfigurations of what ‘counts’ as knowledge and who (or what) is granted authority to produce it – whether that means proving that indiscriminate domestic surveillance prevents terrorist attacks, or arguing that machinic sensors can know us better than we can ever know ourselves.

The present work focuses on two empirical cases. The first is the ‘Snowden Affair’ (2013-Present): the public controversy unleashed through the leakage of vast quantities of secret material on the electronic surveillance practices of the U.S. government. The second is the ‘Quantified Self’ (2007-Present), a name which describes both an international community of experimenters and the wider industry built up around the use of data-driven surveillance technology for self-tracking every possible aspect of the individual ‘self’. By triangulating media coverage, connoisseur communities, advertising discourse and leaked material, I examine how surveillance technologies were presented for public debate and speculation.

This dissertation is thus a critical diagnosis of the contemporary faith in ‘raw’ data, sensing machines and algorithmic decision-making, and of their public promotion as the next great leap towards objective knowledge. Surveillance is not only a means of totalitarian control or a technology for objective knowledge, but a collective fantasy that seeks to mobilise public support for new epistemic systems. Surveillance, as part of a broader enthusiasm for ‘data-driven’ societies, extends the old modern project whereby the human subject – its habits, its affects, its actions – becomes the ingredient, the raw material, the object, the target, for the production of truths and judgments about it by things other than itself.

[Talk] U. Milano-Bicocca

I will be at the University of Milano-Bicocca next week to give a talk on surveillance, self-tracking and the data-driven life. It will overlap significantly with my presentation last week at the Affect Theory conference; I’ll be posting the full text and slides afterwards. Abstract below.

The Data-Driven Life: The Parameters of Knowing in the Online Surveillance Society

‘Information overload’ is an old cliché, but when it was still fresh, it conveyed a broad and fundamental liquidity in the parameters of our experience. What it meant – and felt like – to remember, to know, was changing. Surveillance today is not simply a question of privacy or governmental power, but a practical extension of such liquidity. Surveillance’s fevered dream of total prediction hinges on its ability to subtend human sensibility – with its forgetfulness, bias, and other problems – to reach ‘raw’ and comprehensive data. This data-hunger, shared by states, corporations and individuals alike, betrays a ‘honeymoon objectivity’. The rise of new technologies for knowledge production is being misconstrued as a discovery of pure and unmediated information. The result is a profound shift in what qualifies as knowledge; who, or what, does the knowing; what decisions and actions are legitimated through that knowledge. Surveillance practices and controversies today host a reparametrisation of what ‘knowing’ entails.

In this talk, I will address two specific cases: the state surveillance of the Snowden Affair, and the self-surveillance of the Quantified Self (QS) movement. I draw on interviews, ethnographic observation and archival research that is part of a larger, ongoing project.

  1. I know we are being watched, Snowden told us so – but I don’t see it, and I don’t feel it. A vast surveillance program withdraws into the recesses of technological systems, denying our capacity to know and experience it. Conventional forms of proof or risk probabilities elude both arguments for and against it. This situation provokes two major patterns of ‘knowing’. First, subjunctivity leverages the recessive unknowability surrounding surveillance as if it were in some way true and certain, producing hypothetical, provisional bases for real, enduring actions and beliefs. Statistical measures of danger become mathematically negligible, yet affectively overwhelming. Second, interpassivity projects others (human and nonhuman) who believe and experience in our stead what we cannot ourselves. Even if the world of surveillance and terror is not real in my back yard, these interpellated others help make it ‘real enough’. Technology’s recession thus provokes an existential dilemma: how do I ‘know’? What is it supposed to feel like to ‘know’?
  2. We cannot stand guard over our judgments without machines to keep us steady. If our knowledge of our own bodies, habits and affects was previously left to unreliable memories and gut feeling, QS promises a data-driven existence where you truly “come into contact with yourself” – through persistent, often wearable, self-surveillance. The Delphic maxim ‘know thyself’ is applied to a very different existential condition, where my lived relationship with technology becomes the authoritative site for an abstracted relationship with my own body. Yet QSers also acknowledge that data is always incomplete, raising new uncertainties and requiring the intervention of subjective judgment. Here, it is technology’s protrusion which forces the question: how will you ‘know’ yourself through a digitality that subtends your memory and intention?

[ARTICLES] “Subjunctive and Interpassive” and “Presence”

Presence, or the sense of being-there and being-with in the new media society

Open access @ First Monday.

This essay argues that the ways in which we come to feel connectivity and intimacy are often inconsistent with, and irreducible to, traditional markers like physical proximity, the human face or the synchronicity of message transmission. It identifies this non-objective and affective property as presence: conventionalised ways of intuiting sociability and publicness. The new media society is a specific situation where such habits of being affected are socially and historically parametrised. The essay provides two case studies. First: how do we derive a diffuse, indirect, intuitive sense of communicative participation – and yet also manage to convince ourselves of anonymity online? The second describes surveillance and data-mining as a kind of alienation: I am told my personal data is being exploited, but I do not quite ‘feel’ it. Surveillance practices increasingly withdraw from everyday experience, yet this withdrawal actually contributes to their strong presence.

Subjunctive and Interpassive ‘Knowing’ in the Surveillance Society

Open access @ Media and Communication.

The Snowden affair marked not a switch from ignorance to informed enlightenment, but a problematisation of knowing as a condition. What does it mean to know of a surveillance apparatus that recedes from your sensory experience at every turn? How do we mobilise that knowledge for opinion and action when its benefits and harms are only articulable in terms of future-forwarded “as if”s? If the extent, legality and efficacy of surveillance are allegedly proven in secrecy, what kind of knowledge can we be said to “possess”? This essay characterises such knowing as “world-building”. We cobble together facts, claims, hypotheticals into a set of often speculative and deferred foundations for thought, opinion, feeling, action. Surveillance technology’s recession from everyday life accentuates this process. Based on close analysis of the public mediated discourse on the Snowden affair, I offer two common patterns of such world-building or knowing. They are (1) subjunctivity, the conceit of “I cannot know, but I must act as if it is true”; and (2) interpassivity, which says “I don’t believe it/I am not affected, but someone else is (in my stead)”.