
Sam Biddle, The Intercept:

An unavoidable takeaway of “The Age of Surveillance Capitalism” is, essentially, that everything is even worse than you thought. Even if you’ve followed the news items and historical trends that gird Zuboff’s analysis, her telling takes what look like privacy overreaches and data blunders, and recasts them as the intentional movements of a global system designed to violate you as a revenue stream. “The result is that both the world and our lives are pervasively rendered as information,” Zuboff writes.

Tech’s privacy scandals, which seem to appear with increasing frequency both in private industry and in government, aren’t isolated incidents, but rather brief glimpses at an economic and social logic that’s overtaken the planet while we were enjoying Gmail and Instagram. The cliched refrain that if you’re “not paying for a product, you are the product”? Too weak, says Zuboff. You’re not technically the product, she explains over the course of several hundred tense pages, because you’re something even more degrading: an input for the real product, predictions about your future sold to the highest bidder so that this future can be altered. “Digital connection is now a means to others’ commercial ends,” writes Zuboff. “At its core, surveillance capitalism is parasitic and self-referential. It revives Karl Marx’s old image of capitalism as a vampire that feeds on labor, but with an unexpected turn. Instead of labor, surveillance capitalism feeds on every aspect of every human’s experience.”

Shoshana Zuboff:

I remember sitting at my desk in my study early in 2012, and I was listening to a speech that [Google’s then-Executive Chair] Eric Schmidt gave somewhere. He was bragging about how privacy conscious Google is, and he said, “We don’t sell your data.” I got on the phone and started calling these various data scientists that I know and saying, “How can Eric Schmidt say we don’t sell your data, in public, knowing that it’s recorded? How does he get away with that?” It’s exactly the question I was trying to answer at the beginning of all this.

Now we have markets of business customers that are selling and buying predictions of human futures. I believe in the values of human freedom and human autonomy as the necessary elements of a democratic society. As the competition of these prediction products heats up, it’s clear that surveillance capitalists have discovered that the most predictive sources of data are when they come in and intervene in our lives, in our real-time actions, to shape our action in a certain direction that aligns with the kind of outcomes they want to guarantee to their customers. That’s where they’re making their money. These are bald-faced interventions in the exercise of human autonomy, what I call the “right to the future tense.” The very idea that I can decide what I want my future to be and design the actions that get me from here to there, that’s the very material essence of the idea of free will.

I write about the Senate committee back in the ’70s that reviewed behavioral modification from the point of view of federal funding, and found behavioral mod a reprehensible threat to the values of human autonomy and democracy. And here we are, these years later, like, La-di-da, please pass the salt. This thing is growing all around us, this new means of behavioral modification, under the auspices of private capital, without constitutional protections, done in secret, specifically designed to keep us ignorant of its operations.

A long time ago, I think it was 2007, I was already researching this topic and I was at a conference with a bunch of Google people. Over lunch I was sitting with some other Google executives and I asked the question, “How do I opt out of Google Earth?” All of a sudden, the whole room goes silent. Marissa Mayer, [a Google vice president at the time], was sitting at a different table, but she turned around and looked at me and said “Shoshana, do you really want to get in the way of organizing and making accessible the world’s information?” It took me a few minutes to realize she was reciting the Google mission statement.

Surveillance capitalism in general has been so successful because most of us feel so beleaguered, so unsupported by our real-world institutions, whether it’s health care, the educational system, the bank … It’s just a tale of woe wherever you go. The economic and political institutions right now leave us feeling so frustrated. We’ve all been driven in this way toward the internet, toward these services, because we need help. And no one else is helping us. That’s how we got hooked.

For some people, the sort of caricature of “We just want convenience, we’re so lazy” — for some people that caricature holds. But I feel much more forgiving of these needs than the caricature would lead us to believe. We do need help. We shouldn’t need so much help because our institutions in the real world need to be fixed. But to the extent that we do need help and we do look to the internet, it is a fundamentally illegitimate choice that we are now forced to make as 21st century citizens. In order to get the help I need, I’ve got to march through surveillance capitalism supply chains. Because Alexa and Google Home and every other gewgaw that has the word “smart” in front of it, every service that has “personalized” in front of it is nothing but supply chain interfaces for the flow of raw material to be translated into data, to be fashioned into prediction products, to be sold in behavioral futures markets so that we end up funding our own domination. If we’re gonna fix this, no matter how much we feel like we need this stuff, we’ve got to get to a place where we are willing to say no.

Yup (1, 2, 3, 4).

Paul Ciano
