Presentation @ Digital Existence II: Precarious Media Life

Later this month I will be presenting at Digital Existence II: Precarious Media Life, at the Sigtuna Foundation, Sweden, organised by Amanda Lagerkvist via the DIGMEX Research Network (of which I am a part) and the Nordic Network for the Study of Media and Religion. The abstract for my part of the show:

 

On the terror of becoming known

Today, we keenly feel the terror of becoming known: of being predicted and determined by data-driven surveillance systems. The webs of significance which sustain us also produce a persistent vulnerability to becoming known by things other than ourselves. From efforts to predict ‘Lone Wolf’ terrorists through comprehensive surveillance of personal communications, to pilot programs that calculate insurance premiums by monitoring daily behaviour, the expressed fear is often of misidentification and misunderstanding. Yet the deeper root of this anxiety is not error or falsehood, but a highly entrenched moralisation of knowing. Digital technologies are the newest frontier for the reprisal of old Enlightenment dreams, wherein the subject has a duty to know and technological inventions are an ineluctable force for better knowledge. This nexus demands subjects’ constant vulnerability to producing data and being socially determined by it. In response, subjects turn to what Foucault called illegalisms[1]: forms of complaint, compromise, obfuscation, and other everyday efforts to mitigate the violence of becoming known. The presentation threads this normative argument through two kinds of grounding material: (1) episodes in becoming-known drawn from original research into American state- and self-surveillance, and (2) select works in moral philosophy and technology criticism.[2]

 

[1] Foucault, M., 2015. The Punitive Society: Lectures at the Collège de France 1972–1973, B. E. Harcourt, ed. New York: Palgrave Macmillan.

[2] E.g. Jasanoff, S., 2016. The Ethics of Invention: Technology and the Human Future, New York: W.W. Norton & Co; Vallor, S., 2016. Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting, Oxford: Oxford University Press; Winner, L., 1986. The Whale and the Reactor: A Search for Limits in an Age of High Technology, Chicago: University of Chicago Press.


Presentation @ 4S

I will be at 4S [Society for Social Studies of Science], Boston this week in the ‘Surveillance and Security’ panel: Thursday 31 August, 4.00pm, Sheraton Boston Flr 3 Kent.

Recessive Objects: Surveillance and the (Dis)appearance of Fact

Recessive objects are things which promise to extend our knowledge, but in doing so publicise the very uncertainty threatening that knowing. These archives, statistical figures, and black boxes mobilise our enduring faith in nonhuman objectivity and technological progress, imposing a sense of calculability and predictability. Yet far from extinguishing uncertainty, they give material presence to the absent, the secret, the unknowable – especially the widening gap between human and machinic sensibility. Recessive objects speak to longstanding questions about the social production of what counts as objective fact, and about what kinds of virtues are invested in ‘new’ technologies. They emphasise the practical junctures where public imagination, material artefacts and the operational logic of new technologies intersect.

I discuss two recessive objects featuring centrally in America’s recent encounters with surveillance technologies: (1) the Snowden Files, an indefinite archive of state secrets leaking profusely since 2013; (2) the latest generation of self-tracking devices. What does it mean to know about a vast state surveillance system, even as it operates almost entirely removed from individuals’ sensory experience? How can the public make its judgment when proof of surveillance’s efficacy is itself classified? What kind of ‘self-knowledge’ is it when we learn about our bodies through machines that track us in ways our senses cannot follow – and claim to ‘know you better than you know yourself’?

Presentation @ ICA 2017

Next week, I’ll be presenting at the International Communication Association conference, San Diego, on the future as a trope by which potentiality and speculation may be folded into the domain of reasoned judgment. Saturday 27 May, 9.30am, Aqua Salon AB.

On Futures, and Epistemic Black Markets

The future does not exist: and this simple fact gives it a special epistemic function.

The future is where truths too uncertain, fears too politically incorrect, and ideas too unprovable receive unofficial license to roam… The future is a liminal zone, a margin of tolerated unorthodoxy that provides essential compensation for the rigidity of modern epistemic systems. This ‘flexibility’ is central to the perceived rupture of traditional authorities in the contemporary moment. What we call post-fact politics (David Roberts), the age of skepticism (Siebers, Leiter), and the rise of pre-emption (Massumi, Amoore) all describe situations where apparently well-established infrastructures of belief and proof are disrupted by transgressive associations of words and things. The future is here conceptualised as a mode for such interventions.

This view helps us understand the present-day intersection of two contradictory fantasies: first, the quest to know and predict exhaustively, especially through new technologies and algorithms; second, heightened anxiety over uncertainties that both necessitate and elude those efforts. In the talk, I trace these contradictory fantasies across several interconnected scenes. We find voracious data hunger and apophenia (Steyerl) accompanied by visions of uncontrolled futures in the Snowden affair and the narrative of radical terrorism; in Donald Trump’s depiction of eroded borders, and the use of the latest surveillance technologies in tracking Muslims; in the revival (endurance?) of the paranoid style (Richard Hofstadter); and even in the apocalyptic warnings over climate change, tempered by deep confusion over the reliability of such estimates.

The trading of ‘futures’ in stock markets originated from the need to align uncertainties inherent in agricultural timescales with the epistemic demands of human markets. The futures I speak of are currency in epistemic black markets: spaces where things other than the presently real, proven, and accounted for may nevertheless be traded for sentiment and opinion.

 

Criticising Surveillance and Surveillance Critique

New article now available on open access @ Surveillance & Society.

Abstract:

The current debate on surveillance, both academic and public, is constantly tempted towards a ‘negative’ criticism of present surveillance systems. In contrast, a ‘positive’ critique would be one which seeks to present alternative ways of thinking, evaluating, and even undertaking surveillance. Surveillance discourse today propagates a host of normative claims about what is admissible as true, probable, efficient – based upon which it cannot fail to justify its own expansion. A positive critique questions and subverts this epistemological foundation. It argues that surveillance must be held accountable by terms other than those of its own making. The objective is an open debate not only about ‘surveillance or not’, but about the possibility of ‘another surveillance’.

To demonstrate the necessity of this shift, I first examine two existing frames of criticism. Privacy and humanism (appeals to human rights, freedoms and decency) are necessary but insufficient tools for positive critique. They implicitly accept surveillance’s bargain of trade-offs: the benefit of security measured against the cost of rights. To demonstrate paths towards positive critique, I analyse risk and security: two load-bearing concepts that hold up existing rationalisations of surveillance. They are the ‘openings’ for reforming the evaluative paradigms and rigged bargains on offer today.

Lecture @ MIT List Center

I will be speaking at MIT’s List Visual Arts Center on May 19, as part of the exhibition An Inventory of Shimmers: Objects of Intimacy in Contemporary Art. The talk / discussion is titled Intimacy with Technology. See more info here, and RSVP here.

“Join List Visual Arts Center and Sun-ha Hong, Mellon Postdoctoral Fellow, Comparative Media Studies/Writing, MIT in a discussion about the affective charge and the promise of objectivity surrounding the idea of raw data. As we move into an era of ambient and ubiquitous computing, Hong will discuss the intimate aspects of data to explore how we as humans engage in relationships with technologies—relationships which lead us to ask, ‘what is intimacy?’. In response to the List Center’s exhibition An Inventory of Shimmers: Objects of Intimacy in Contemporary Art, Hong will delve into the mediated and aesthetic elements that objects assume.”

 

 

Lecture @ Copenhagen Business School

I was recently at Copenhagen Business School, courtesy of Mikkel Flyverbom, to discuss the book-in-progress. An earlier version of the talk, given at MIT, is online in podcast form here.

An excerpt from my notes:

“So we have two episodes, two ongoing episodes. On one hand, you have the state and its technological system, designed for bulk collection at massive scales, and energised by the moral and political injunction towards ‘national security’ – all of this leaked through Edward Snowden. On the other hand, you have the popularisation of self-tracking devices, a fresh addition to the growing forms of constant care and management required of the employable and productive subject, with Silicon Valley as its epicentre.

These are part of a wider penumbra of practices: algorithmic predictions powered by Bayesian inference and artificial neural nets, corporate data-mining under the moniker of ‘big’ data… Now, by no means are they the same thing, or governed by some central force that perpetuates them. But as they pop up around every street corner, certain tendencies start to characterise ‘data-driven’ as a mode of thinking and decision-making.

The tendency I focus on here is the effort to render things known, predictable, calculable – and how pursuing that hunger entails, in fact, many close encounters with uncertainty and the unknown.

Here surveillance is not reducible to questions of security and privacy. It is a scene for ongoing conflicts over what counts as knowledge, who or what gets the authority to declare what you are, what we consider ‘good enough’ evidence to watch people, to change our diet, to arrest them. What we’re seeing is a renewed effort at valorising a certain project of objective knowledge, of factual certainty, of capturing the viscera of life into bits, of producing the right number that tells us what to do.”

 

 

Data Epistemologies – Dissertation online

Data Epistemologies: Surveillance and Uncertainty, my dissertation at the University of Pennsylvania, is now online for public access here. It is, in many ways, a working draft for my current book project.

Abstract:

Data Epistemologies studies the changing ways in which ‘knowledge’ is defined, promised, problematised, and legitimated vis-à-vis the advent of digital, ‘big’ data surveillance technologies in early twenty-first century America. As part of the period’s fascination with ‘new’ media and ‘big’ data, such technologies intersect ambitious claims to better knowledge with a problematisation of uncertainty. This entanglement, I argue, results in contextual reconfigurations of what ‘counts’ as knowledge and who (or what) is granted authority to produce it – whether that means proving that indiscriminate domestic surveillance prevents terrorist attacks, or arguing that machinic sensors can know us better than we can ever know ourselves.

The present work focuses on two empirical cases. The first is the ‘Snowden Affair’ (2013-Present): the public controversy unleashed through the leakage of vast quantities of secret material on the electronic surveillance practices of the U.S. government. The second is the ‘Quantified Self’ (2007-Present), a name which describes both an international community of experimenters and the wider industry built up around the use of data-driven surveillance technology for self-tracking every possible aspect of the individual ‘self’. By triangulating media coverage, connoisseur communities, advertising discourse and leaked material, I examine how surveillance technologies were presented for public debate and speculation.

This dissertation is thus a critical diagnosis of the contemporary faith in ‘raw’ data, sensing machines and algorithmic decision-making, and of their public promotion as the next great leap towards objective knowledge. Surveillance is not only a means of totalitarian control or a technology for objective knowledge, but a collective fantasy that seeks to mobilise public support for new epistemic systems. Surveillance, as part of a broader enthusiasm for ‘data-driven’ societies, extends the old modern project whereby the human subject – its habits, its affects, its actions – becomes the ingredient, the raw material, the object, the target, for the production of truths and judgments about it by things other than itself.