


The Age of Surveillance Capitalism – The Fight for a Human Future at the New Frontier of Power

By Shoshana Zuboff, 2019

Profile Books, 691 pages

Siri Beerends*

Current debates around big data and the power of Big Tech are often framed in terms of privacy and data ownership: give users ownership over their data and the imbalance of power between tech companies and users will dissolve. The Age of Surveillance Capitalism makes clear how these debates fail to understand the exploitative business model that underpins our digital world and the immense techno-social challenges we are facing today.

I. Exiles from Our Own Behavior

We have entered a new era of capitalism in which human experience is claimed as free raw material for translation into behavioral prediction products. Our real-world activity is continuously rendered from our phones, cars, streets, maps, homes, shops, bodies, trees, buildings, airports and cities. ‘Although some of these data are applied to service improvement’, Zuboff writes, ‘the rest are declared as a proprietary behavioral surplus, fed into advanced manufacturing processes known as “machine intelligence”, and fabricated into prediction products that anticipate what you will do now, soon, and later. Finally, these prediction products are traded in a new kind of marketplace that I call behavioral futures markets. Surveillance capitalists have grown immensely wealthy from these trading operations, for many companies are willing to lay bets on our future behavior.’1 It is easy to get overwhelmed by the many issues Zuboff addresses, but the main issue is crystal clear: these prediction products create extensive asymmetries in knowledge and power, making us exiles from our own behavior. Giving users ownership of their data will not give them power over the predictions derived from it. Zuboff’s prediction imperative teaches us that there is no point in owning data that should not exist in the first place. The basic principle of the prediction imperative is simple: the surest way to predict behavior is to intervene at its source and shape it. This is what tech companies like Google and Facebook are doing: designing interventions to eliminate uncertainty by ‘nudging’, ‘tuning’ and ‘herding’ our behavior in specific directions. Examples include inserting a specific phrase into your Facebook news feed, timing the appearance of a buy button on your smartphone, or shutting down your car engine when an insurance payment is late. With the rise of connected smart sensors, any kind of behavior can be registered, analyzed and changed to encourage choices that benefit the architect, not the individual.2 But even without sensors, companies are perfectly capable of this, as Zuboff shows in her analysis of Pokémon Go. The game has been praised for getting youngsters outdoors, but Zuboff recognizes it as a tool to direct individual actions toward local market opportunities ‘where high bidders enjoy an ever-closer approximation of guaranteed outcomes’.3

Zuboff convincingly explains why and how these behavioral interventions can happen outside of our awareness by introducing the problem of two texts. The first text is composed of what we inscribe on its pages: our posts, blogs, videos, photos, conversations, music, stories, observations, likes and tweets. The first text functions as the supply operation for the second text: the shadow text. Everything that we contribute to the first text becomes a target for behavioral surplus extraction. That surplus fills the pages of the second text, which is readable only to surveillance capitalists. As a result, we become ‘exiles of our own behavior’: we have no access to the knowledge that is derived from our real-world activity, and no control over how this knowledge is transformed into means to others’ market ends.4 Another reason we have no control is that monitoring is often presented as fun, competitive and gratifying, rewarding us for behavior improvements with points. We accept monitoring because we think of it as an innocent form of gamification, or as a fair price for a personalized free service. But that framing misses the logic of surveillance capitalism. Tech companies ‘poach our behavior for surplus and leave behind all the meaning lodged in our bodies, brains and beating hearts. Not unlike the monstrous slaughter of elephants for ivory. Forget the cliché that if it’s free you are the product. You are not the product; you are the abandoned carcass. The “Product” derives from the surplus that is ripped from your life’.5 The comparison may seem dramatic, but the image of the carcass captures the fact that we are a natural resource being harvested.

II. The False Promise of Computational Certainty

It is evident that Zuboff doesn’t want the world to underestimate the consequences of surveillance capitalism: ‘If industrial civilization flourished at the expense of nature and now threatens to cost us the Earth’, she writes, ‘an information civilization shaped by surveillance capitalism will thrive at the expense of human nature and threatens to cost us humanity’.6 Philosophers, ethicists and social scientists have often discussed these threats to humanity, not always with convincing arguments. Zuboff makes a convincing case because her central objects of analysis are not the technologies themselves but the business model, behaviorist worldview and ideological principles that underlie them. She describes, for example, how leading figures in Silicon Valley are inspired by behavioral theorists like Skinner, Meyer and Planck, who imagined that with the ‘correct technology of behavior’ knowledge could eliminate anomalies, driving our behavior toward pre-established parameters that align with collective norms. In the spirit of these behavioral theorists, surveillance capitalists work towards a hive-mind: a human simulation of machine learning systems in which each element learns and operates in contact with every other element. Their aim is a computer-mediated society where our mutual visibility becomes the habitat in which we attune to one another, producing social patterns based on imitation that can easily be manipulated for guaranteed outcomes. In this computer-mediated hive, tech companies tune society to achieve social confluence, replacing social trust and politics with collective pressure and computational certainty.7 Instead of futuristic speculations, Zuboff provides examples and research sources that show how the hive-mind is put into practice through business patents, research groups and influential figures in Silicon Valley. One of those influential figures is Alex Pentland, director of the Human Dynamics Lab at the MIT Media Lab. Together with his doctoral students he develops instruments and methods that promise to transform human behavior into highly predictive math. With his ‘social physics’ he aims to code human beings not by race, income, occupation or gender, but rather by subgroups based on patterns of behavior.8 To achieve this, our data needs to be collected outside of our awareness through ‘unobtrusive behavioral monitors’ and a ‘seamless nervous system that covers the planet’. By transforming the collected data into behavior demographics, Pentland promises to predict disease, financial risk, consumer preferences and political views ‘with between 5 and 10 times the accuracy of the standard measures’.9

Where tech-optimists promote the hive-mind as a means to prevent problems, Zuboff demystifies it as a tool to make our behavior more predictable and tradable on behavioral futures markets. What Zuboff does not discuss very thoroughly in her book are the scientific criticisms of the predictive powers of data analysis. Law professor Andrew Ferguson has published extensively on unproven predictive software, predictive analytics as a self-fulfilling prophecy, and the non-scientific basis of predictive policing.10 The mathematician Cathy O’Neil has likewise provided empirical insights into algorithmic bias, negative feedback loops and correlation-causation mix-ups in her bestseller Weapons of Math Destruction. Predictive risk modeling has come under pressure as more people have become victims of bias and false positives. Many aspects of our complex and ambiguous reality do not fit into simplified data models. Yet even when predictive models are wrong, they become reality once we blindly act upon the datafied simplifications that these models provide. This is also reflected in Zuboff’s prediction imperative: data models do not capture reality; they shape reality and are therefore able to make predictions. This is how computational certainty can extinguish our felt reality.11 The computational certainty of the hive-mind is not only put into practice through self-fulfilling prediction models and Pentland-minded computer programmers. Zuboff describes how we have already welcomed the machine hive into our lives through our smartphone and social media use. Since we are permanently glued to our screens, our smartphones have become ‘unobtrusive behavioral monitors’. And since almost everybody is on social media, our social patterns are already manipulated for social confluence and guaranteed outcomes. Psychological research shows that digital natives experience themselves from the outside looking in, constantly comparing themselves with others, attuning their behavior to imagined audiences: ‘The magnetic pull that social media exerts on young people’, Zuboff writes, ‘drives them toward more automatic and less voluntary behavior’.12

III. Claiming Our Right to the Future Tense

A common criticism of The Age of Surveillance Capitalism is that it is ‘just another book about capitalism’. Yes, it is about capitalism, but the book is mostly about the replacement of society with machine action and automated behavior. Unlike many other writers and scientists, Zuboff succeeds in explaining what is at stake because she clearly articulates what she means by ‘human’. The threat that could cost us humanity is that machine processes not only automate information flows about us, but now also aim to automate us.13 In modifying our behavior for profit without our awareness, surveillance capitalism shifts the locus of control over the future tense from ‘I will’ to ‘You will’, supplanting autonomous action with heteronomous action and resulting in more automatic and less voluntary behavior. At stake, as Zuboff clearly phrases it, is our inward experience, from which we form our ‘will to will’, and ‘the public space to act on that will’. This is why we should have the right to sanctuary: the right to opt out and have inward experiences from which we can form our own thoughts, values, intentions and actions. Inspired by Hannah Arendt, Zuboff understands will as an organ for the future: ‘the act of our will is our claim to the future tense’. This is why we should also have the right to the future tense: the right to act free from the influence of illegitimate forces that operate outside our awareness to influence, modify and condition our behavior.14

The Age of Surveillance Capitalism is an important book that provides us with a vocabulary to arm ourselves against contemporary asymmetries of power and knowledge. The book should alarm us for an important reason: we (citizens, young people, journalists, scientists, scholars, policy makers) might still be able to fix this. The lawless post-9/11 realm in which surveillance capitalism was born and able to flourish does not have to be irreversible. Building on the anthropologist Laura Nader, Zuboff wants us to understand laws as ‘possibilities of democratic empowerment’. Collective action is needed to develop laws that can defend our right to sanctuary and our right to the future tense. According to Zuboff, the greatest danger is that we come to feel at home in a glass life without privacy, or that we consider hiding from it the new normal:

‘When I speak to an audience of young people, I try to alert them by calling attention to ordinary values and expectations before surveillance capitalism began its campaign of psychic numbing. I tell them that the word “search” has meant a daring existential journey, not a finger tap to already existing answers; that “friend” is an embodied mystery that can be forged only face-to-face and heart-to-heart; and that “recognition” is the glimmer of homecoming we experience in our beloved’s face, not “facial recognition”. I say that it is not OK to have our best instincts for connection, empathy, and information exploited by a draconian quid pro quo that holds these goods hostage to the pervasive strip search of our lives. It is not OK for every move, emotion, utterance, and desire to be catalogued, manipulated, and then used to herd us through the future tense for the sake of someone else’s profit. These things are brand-new, you should not take them for granted because they are not OK.’15

Zuboff has been criticized for missing the point that many thinkers have already highlighted the threat of the rising influence of companies.16 However, the threat Zuboff describes lies not in the rising influence of companies; it lies in the fact that these companies specialize in giving us exactly what we want for no money (apps, clicks, likes, credit scores and swipes) at the hidden cost of automated behavior. If the threat to human autonomy begins with an assault on our awareness, then let this book be our key to awareness.

Notes

[1] Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (Profile Books 2019) 8

[2] ibid 202

[3] ibid 317

[4] ibid 100-186

[5] ibid 377

[6] ibid 12

[7] ibid 409

[8] For examples, see <https://www.endor.com/social-physics>

[9] Zuboff (n 1) 426-428

[10] Andrew G Ferguson, The Rise of Big Data Policing (NYU Press 2017)

[11] ibid 21

[12] Zuboff (n 1) 447-449

[13] ibid 8

[14] ibid 521-523

[15] ibid 521

[16] See, for example, Blayne Haggart, ‘Why I Won’t be Teaching The Age of Surveillance Capitalism’ <www.blaynehaggart.wordpress.com/2019/02/15>
