The Futures of Anticipatory Reason

Coming out very soon at Security Dialogue is a piece I worked on together with Piotr Szpunar, whose book Homegrown: Identity and Difference in the American War on Terror came out last year with NYU Press – so both of us have been looking closely at current developments in surveillance, counter-terrorism and the demand to predict. In the article, we argue that anticipatory security practices (just one part of the even broader current obsession with prediction) invoke the future to open up wiggle room for unorthodox, uncertain and otherwise problematic claims about people. This gap, which we call an ‘epistemic black market’, is very useful for the flexibility it affords security practices – flexibility that is typically used to reinforce longstanding biases and power relations, exemplified by the continuing insistence on the figure of the brown, Muslim terrorist.

You can find the pre-proof version on this site here.

 

Abstract:

This article examines invocations of the future in contemporary security discourse and practice. This future constitutes not a temporal zone of events to come, or a horizon of concrete visions for tomorrow, but an indefinite source of contingency and speculation. The ongoing proliferation of predictive, pre-emptive and otherwise anticipatory security practices strategically utilises the future to circulate the kinds of truths, beliefs and claims that might otherwise be difficult to legitimise. The article synthesises critical security studies with broader humanistic thought on the future, with a focus on sting operations in recent US counter-terrorism practice. It argues that the future today functions as an ‘epistemic black market’: a zone of tolerated unorthodoxy where the boundaries defining proper truth-claims become porous and flexible. Importantly, this epistemic flexibility is often leveraged towards a certain conservatism, in which familiar relations of state control are reconfirmed and expanded upon. This conceptualisation of the future has important implications for standards of truth and justice, as well as for public imaginations of security practices, in a period of increasingly pre-emptive and anticipatory securitisation.


Presentation @ Digital Existence II: Precarious Media Life

Later this month I will be presenting at Digital Existence II: Precarious Media Life, at the Sigtuna Foundation, Sweden, organised by Amanda Lagerkvist via the DIGMEX Research Network (of which I am a part) and the Nordic Network for the Study of Media and Religion. The abstract for my part of the show:

 

On the terror of becoming known

Today, we keenly feel the terror of becoming known: of being predicted and determined by data-driven surveillance systems. The webs of significance which sustain us also produce a persistent vulnerability to becoming known by things other than ourselves. From efforts to predict ‘Lone Wolf’ terrorists through comprehensive surveillance of personal communications, to pilot programs for calculating insurance premiums by monitoring daily behaviour, the expressed fear is often one of misidentification and misunderstanding. Yet the more general root of this anxiety is not error or falsehood, but a deeply entrenched moralisation of knowing. Digital technologies are the newest frontier for the reprisal of old Enlightenment dreams, wherein the subject has a duty to know and technological inventions are an ineluctable force for better knowledge. This nexus demands subjects’ constant vulnerability to producing data and being socially determined by it. In response, subjects turn to what Foucault called illegalisms[1]: forms of complaint, compromise, obfuscation, and other everyday efforts to mitigate the violence of becoming known. The presentation threads this normative argument through two kinds of grounding material: (1) episodes in becoming-known drawn from original research into American state- and self-surveillance, and (2) select works in moral philosophy and technology criticism.[2]

 

[1] Foucault, M., 2015. The Punitive Society: Lectures at the Collège de France 1972–1973, B. E. Harcourt, ed., New York: Palgrave Macmillan.

[2] E.g. Jasanoff, S., 2016. The Ethics of Invention: Technology and the Human Future, New York: W.W. Norton & Co.; Vallor, S., 2016. Technology and the Virtues: A Philosophical Guide to a Future Worth Wanting, Oxford: Oxford University Press; Winner, L., 1986. The Whale and the Reactor: A Search for Limits in an Age of High Technology, Chicago: University of Chicago Press.

Data Epistemologies – Dissertation online

Data Epistemologies: Surveillance and Uncertainty, my dissertation at the University of Pennsylvania, is now online for public access here. It is, in many ways, a working draft for my current book project.

Abstract:

Data Epistemologies studies the changing ways in which ‘knowledge’ is defined, promised, problematised and legitimated vis-à-vis the advent of digital, ‘big’ data surveillance technologies in early twenty-first century America. As part of the period’s fascination with ‘new’ media and ‘big’ data, such technologies intersect ambitious claims to better knowledge with a problematisation of uncertainty. This entanglement, I argue, results in contextual reconfigurations of what ‘counts’ as knowledge and who (or what) is granted authority to produce it – from proving that indiscriminate domestic surveillance prevents terrorist attacks to arguing that machinic sensors can know us better than we can ever know ourselves.

The present work focuses on two empirical cases. The first is the ‘Snowden Affair’ (2013-Present): the public controversy unleashed through the leakage of vast quantities of secret material on the electronic surveillance practices of the U.S. government. The second is the ‘Quantified Self’ (2007-Present), a name which describes both an international community of experimenters and the wider industry built up around the use of data-driven surveillance technology for self-tracking every possible aspect of the individual ‘self’. By triangulating media coverage, connoisseur communities, advertising discourse and leaked material, I examine how surveillance technologies were presented for public debate and speculation.

This dissertation is thus a critical diagnosis of the contemporary faith in ‘raw’ data, sensing machines and algorithmic decision-making, and of their public promotion as the next great leap towards objective knowledge. Surveillance is not only a means of totalitarian control or a technology for objective knowledge, but a collective fantasy that seeks to mobilise public support for new epistemic systems. Surveillance, as part of a broader enthusiasm for ‘data-driven’ societies, extends the old modern project whereby the human subject – its habits, its affects, its actions – becomes the ingredient, the raw material, the object, the target, for the production of truths and judgments about it by things other than itself.

About

I am a doctoral student at the Annenberg School for Communication, University of Pennsylvania.

My current research involves the ‘infra-phenomenal’ in digital life. The way we experience digital interfaces and platforms is distinct from developers’ or analysts’ specifications of them, and also from society’s own discourse about how such media reportedly work. This phenomenological dimension is strongly modulated by the affordances and obligations digital objects levy upon our perception, attention and habit. All this feeds into internalised impressions and conceptualisations of the digital world – folk understandings of trust and risk, transparency and privacy, connectivity and creativity.

Specifically, current projects include: (1) everyday discourse and how communication relies on ‘uncertain’ and semantically negative techniques; (2) the ability of objects to levy ‘obligations’ on their environments and subjects; and (3) how crowdfunding participants navigate conceptualisations of trust and risk. I draw on writers including Michel Foucault, Gilles Deleuze, Maurice Merleau-Ponty, Ludwig Wittgenstein, Pierre Bourdieu and Slavoj Žižek.