
Reflections on the priesthood of surveillance capitalism

Shoshana Zuboff's "The Age of Surveillance Capitalism" is a stirring read. The work is substantial, composed of over 500 pages of narrative, with an extra ~150 pages of notes and references. It is divided into three sections covering

1) the relationship between surveillance and industrial capitalism, focusing mostly on the industrial revolution and the social implications of Ford's assembly line
2) surveillance capitalism's components and sources of power, and a correction of the "you are the product" metaphor which is frequently used to describe surveillance tech
3) the social implications and psychological transformations that are currently occurring (or will occur) under surveillance capitalism

Despite the literal weight of the book and the technical content on which it is based, Zuboff's writing is poetic and engaging, and it makes for an easy (but long) read. I found the third section of the book to be most interesting, in which she discusses the colonial mindset of the "priests" of surveillance capitalism -- the data scientists. I wanted to write a little about that here.

As a data scientist, I never saw myself as a "priest", or even a data evangelist, a label that some in the tech community claim for themselves. But I'm writing about the idea of this priesthood because it generated the most internal struggle for me. Priesthood suggests an influential and paternalistic power, one that can genuinely help people, or merely pose as a helper, yet can also be abusive and predatory. Data scientists who call themselves data evangelists and proclaim data's saving powers may also consider themselves priests, but likely do not consider their position of power or their potential to abuse it. Regardless, if you are a data scientist, whether you liken yourself to the priestly position or not, it's very possible that someone looking for answers in their data does. And I would encourage you to read this book.

While the other sections introduced some new perspectives on the role of surveillance capitalism in shifting the balance of knowledge and power in society, section 3 forced me to re-examine my place as a scientist and data scientist in staging and supporting a system designed to skew the balance of knowledge and power between social classes. There was so much content and so much to reflect on that I'm not sure I can adequately and accurately express my discomfort here. But I'm going to give it a shot.

What is surveillance capitalism?

Before I get into it, however, let me first share Zuboff's definition of surveillance capitalism and why it should concern us:

"Surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data... (which are) fabricated into prediction products... (and are) traded in a new kind of marketplace for behavioral predictions that I call 'behavioral futures markets'."

Basically, surveillance capitalism is the gathering of behavioral data in massive quantities, which is then used to power business decisions that hinge on predictions of your future behavior. When businesses can gain an edge in guessing what you'll do next, they can claim to offer more efficient solutions, and thus make more money. For example, based on the physical activity recorded by your Fitbit, an insurer may place a bet on the state of your health over the next couple of years and decide how much it should cost to insure you, then claim savings by denying you expensive procedures or medicine. Or based on a video interview and the number of times you pause or look away from the interviewer, an HR department may place a bet on your productivity as an employee and decide whether to hire you, claiming savings by avoiding a suboptimal hire or a prolonged candidate search. Or based on your age, what neighborhood you live in, and the temperature outside, a company may provide a police department with your likelihood of being arrested that day, offering police an opportunity to make quotas with less effort and less accountability.

Under surveillance capitalism, the ultimate goal isn't to provide you with better products that make your life easier; it's to flood the environment with products that gather information. The improved user experience that comes from upgrading cameras and increasing storage capacity on phones is secondary to the purpose of glamorizing recording instruments and increasing the yield of the mining tools that extract the raw material of surveillance capitalism: your experiences. Improvements to products are not meant primarily to make your life better or easier. They are meant to lure you into engaging with the surveillance instruments.

This is probably unsurprising to almost everyone -- the power of "artificial intelligence" has been marketed so widely as an inevitable force for progress, as an improvement to our society, that many of us have come to accept it (or enthusiastically welcome it) into our homes and as part of our bodies, even with its known flaws. We sign privacy agreements without reading them. We assume our Social Security numbers, pictures of our families, and DNA are floating out there somewhere, and we accept that we are powerless to protect them, hoping by chance that they won't be abused. We accept defeat as the cost of progress. The progressive potential of artificial intelligence and surveillance is constantly shown to us in commercials as the way to something better: an intelligent path forward, a necessary change to keep up with the competition. Yet when the community raises concerns about this power, surveillance is repackaged as a minor tweak to old marketing methods, the same old same old, nothing to see here. We are told that trying to understand customers and provide them with better products has always been part of doing good business. So if surveillance and artificial intelligence are just ways to do the same business more efficiently, why should surveillance capitalism be feared?

Why should surveillance capitalism concern us?

In short, the rise of surveillance capitalism should concern us because in order for it to survive:

1) People must be willing to lose the privacy of their experiences. Zuboff refers to behavioral data as the raw material of surveillance capitalism. Your behaviors fuel the engine of surveillance capitalism, and without the ability to track and record them, the engine would stop running. You know those suggestions to connect with friends on LinkedIn, Facebook, and Twitter? The primary objective is not for you to enrich your friendships; it's to enrich your data set so that your networking and "like" behaviors can be refined into a higher-quality personality profile. The movement reminder on your Fitbit does not necessarily beep to get you to move and improve your health, but to gather more data on your daily habits. The funny Instagram and TikTok filters are not there to make you laugh more, but to capture more angles on your face and posture to improve facial recognition technology. Sure, these small improvements make the products more fun and maybe more useful, but to use them you must be willing to be watched, recorded, torn down, quantified, and ultimately, judged.
2) A necessary gap of knowledge must exist between those who surveil and the surveilled. Zuboff refers to this knowledge asymmetry as an inability to access the shadow text written by surveillance capitalists. If people were given access to the inner workings of the prediction machines, it would reduce efficiency and invite unwanted questioning and increased accountability. For example, if you have ever wondered why your credit score isn't what you think it should be, or why you didn't get a call after applying for a job that seemed like a perfect fit, or why your insurance rates are inexplicably high -- don't expect an answer. Providing one is not efficient, and it's likely a legal danger for companies. Many of these decisions are driven by the algorithms written in the "shadow text", and explaining that text is not worth their time or the consequences they might face were it deciphered.
3) The surveilled must accept a degree of psychosocial damage. That is, people must accept the loss of privacy as inevitable and necessary, which is psychologically very difficult. How well do you function knowing you're constantly being watched and dismantled for data? Do you think you can ever rest at ease when every movement and sound you make is used to profile you, to determine your potential, to quantify your worth? Do you think living under the microscope might keep you from saying what you really want to say, going where you really want to go, and being who you really are or want to be? As Zuboff puts it, surveillance capitalism leaves us "trapped in a condition of 'no-exit', where the only walls are made of glass." And in order for surveillance capitalism to survive, "the natural human yearning for refuge must be extinguished and the ancient institution of sanctuary deleted."

Basically, just as industrial capitalism forces an unequal distribution of money and power, surveillance capitalism forces an unequal distribution of knowledge. And just as industrial capitalism destroys our physical refuges and natural environments in the name of economic progress, surveillance capitalism destroys our personal refuges and experiences -- our psychological environment. It is a violation of space. And while I assume most people feel some sense of that violation each time they engage with their phones and computers, and write off their privacy as a fair trade for progress, as a scientist and data scientist trained to preach the usefulness of data-driven solutions, I'm especially conflicted when thinking about my role in propping up this system.

Another characteristic of surveillance capitalism is that it poses as a force for some greater good, some utopian efficiency from which we can all learn through shared experience, different streams of ideas, and swarm intelligence. On the surface this can be attractive, and in fact it is one of the reasons I became interested in data science. In my academic experience, large data sets were mouthwatering and rare, and my colleagues and I would often fantasize about the giant data sets accessible only to businesses, dreaming "if only we had that data..." So I was drawn to the idea of using the intelligence of the swarm to try to understand hard problems. And I still feel it doesn't have to be scary -- if it's held within a trustworthy system. Zuboff shows, however, that surveillance capitalism is not to be trusted, and that its definition of the greater good is probably not the same as yours.

The priests of surveillance capitalism

Now, what concerned me most in the last section of Zuboff's book is how she illustrated seemingly normal scientific behavior (mainly the strong motivation to discover new things) that helps build the surveillance capitalism infrastructure as somewhat eccentric and cultish. Of course, scientists are a weird bunch. I've made life choices for science that friends and family don't understand, and have devoted more hours to perfecting parts of my discipline than to any other pursuit in my life. But the desire to record, extract, and quantify signals that some may consider unquantifiable is presumably what drives most students into the sciences and graduate studies (and beyond), and it is certainly not uncommon -- the latest US census info indicates there are 4.5 million people with PhDs. I don't think Zuboff is attacking the scientific mindset, but clearly there is something to be said about taking on scientific pursuits without considering the potential social ramifications. Plenty has been written on unethical practices in science and data science, so I won't get into that here. My main point is that I think Zuboff is attempting to show that when you mix 1) the motivations of a scientist who may be overconfident in their ability to understand the landscape of an unknown territory with 2) the social status they are given as leaders in that territory, you get something of a priestly figure, a guide through the unknown. Hence, data scientists are the priests of surveillance capitalism. Further, if you mix in 3) the fact that the unknown territory is rich in resources and 4) a failure to consider the social implications of trekking through that territory, extracting all its resources, and changing the landscape, you get something more like a colonizer. So in a sense, data scientists, with their priestly influence, are strongly positioned to contribute to the colonial actions of surveillance capitalism. Zuboff also discusses this in the first section of the book.

Admittedly, I didn't see this right away. Her metaphors seemed a little extreme to me at first. What Zuboff labeled the instrumentation of capitalism, I once saw simply as data collection. When she questioned viewing people and society as an organism, to me that was just modeling. I still struggle with her suggestion that applying to humans the methods used to track and study wild animals dehumanizes us. Certainly we have learned a lot about humanity by studying non-human animals and systems -- not everything, but a lot. And many of those studies have led to technologies that benefit people. However, I do agree that depending too much on one method or one source of information can lead to harmful oversimplifications of human problems. It can brainwash the science and tech community into believing that all the intricacies and dimensions of being human can be reduced to a few orthogonal components. I have seen that firsthand, and you don't have to look any further than today's Twitter feed, where technically minded people with little to no experience in health studies or epidemiology offer mostly unhelpful (and in the worst case, harmful) analyses and solutions to the COVID-19 pandemic.

Again, I think Zuboff's concern is not so much what we can learn from studying simpler organisms and systems, but how that success can go to a data scientist's head when mixed with the colonial aspects of capitalism. Much as a priest may convince the congregation that they have exclusive access to the Word of God because of their deep understanding of the Bible, a scientist who has demonstrated they can solve a few complex problems may convince others, intentionally or not, of their singular ability to solve other complex problems. This might be true sometimes, but it often depends on the validity of the data. And I would argue that the oceans of proxy data available to surveillance capitalists (movement, voice, face, heart rate, spending habits, location, genetics) are not necessarily valid for determining things like your likelihood to buy, to steal, to come in on time, to agree, to foreclose, to live another 10 years, to do your job well. The data is simply... too simple. And it's often noisy, incomplete, or just plain incorrect and out of place. It's not the scientific behavior alone that's concerning. It's that behavior combined with the resulting invalid extrapolations and interpretations, the scale on which it takes place, and the underlying motivations of capitalism that makes for a dangerous mixture. I wondered, and still do, whether and how much my scientific motivations contribute to a system that is dangerously transforming our norms of privacy and autonomy.

What I fear

This was transformational reading for me, and I highly recommend Zuboff's book to anyone, especially those whom the book labels the "priests" of surveillance capitalism. While Zuboff's description of the problem can at times feel a little mystical, the issues she raises concerning a growing imbalance of power, and the forces feeding that growth, are quite real and plain to see. If we continue to trade privacy for so-called progress, and to accept the inevitability of surveillance and the prediction engines it fuels further into our lives, our homes, and our bodies, I fear each of us risks losing the right to refuge. I fear we may lose the right to be ourselves. As Zuboff puts it:

"... there can be no doors, no locks, no friction, no opposition between intimacy and distance, house and universe... Seek not the petal-soft iridescent apex of the shell. There is no purpose to curling up in its dark spire. The shell is just another connected node, and your daydream is already finding an audience in the pulsating net of this clamorous glass life."

I'm no priest, but I consider that a mortal sin against the self.
