Shoshana Zuboff’s discussion of surveillance capitalism leaves the tech user feeling stripped of dignity. As it turns out, the “lack of knowledge” that has led to a casual approach to issues of privacy online has immense consequences that many of us don’t understand the half of. Zuboff says that this lack of knowledge is the continued hope of big tech companies, which are profiting off our ignorance. By expanding their own rights to privacy while stripping away ours, they create a self-sustaining economy that no longer needs the classic reciprocal capitalist relationship to make money; when they can take what we put out there for free and sell it for a profit, why bother asking us to be part of the process? This new economy of profiting off data, lack of checks and balances, and unevenly distributed access to privacy is what Zuboff is referring to when she speaks of the new “logic of accumulation” that she calls “surveillance capitalism”.
There is a lot to break down in Zuboff’s theory, and I am not the one to do a graceful job of it. I grasped about 75% of what she was talking about, but I am left confused about terms like “logic of accumulation”, “non-market”, and even good ole “democracy”. Despite this confusion, I was able to gather that this model of making money is really bad. Not for those making the money, of course, but very bad for those of us who are passive participants and victims in a global-scale rip-off. This rip-off not only takes what we are putting out for free and makes money off it, but it also draws us in to depend on the very things that mine our data, so that we will continue to make more free product for them to sell. I never wanted to know what it was like to be a pig being fattened for the slaughter, but now that I do, I can’t help but feel the weight of the injustice of it all.
I think what gets me the most is exactly what the interviewer asks Zuboff – “Don’t we let them?”. In other words, don’t we all know how much we are giving up, and yet choose to be okay with this incredible violation of privacy because we love personalization too much? Zuboff seems to think that if we knew more, we wouldn’t. She says the issue is truly ignorance, and that getting the word out about how horrible this all is will create the “collective action” needed to move forward with laws and regulations that will rein in these tech giants… but I question that. Maybe this is me playing the devil’s advocate, which at the end of the day is a useless exercise in being difficult for difficulty’s sake, but there really is a part of me that questions the power we give to education and its ability to change people’s minds.
I know that could come off as an incredibly pessimistic thing to say, but it plays into the larger question that needs answering within this issue of surveillance capitalism – what motivates people to change? In the mental health and addiction world there is a five-stage model of change: precontemplation, contemplation, preparation, action, and maintenance (with the option to relapse). Considering that our technology has become embedded into our lives in a way that causes us to depend on it for everyday functioning – not unlike an individual with their substance of choice – an addiction model of change isn’t a bad lens through which to consider this question of whether education leads to change.
“One group of scholars behind a major study of youth online behavior concludes that a ‘lack of knowledge’ rather than a ‘cavalier attitude toward privacy,’ as tech leaders have alleged, is an important reason why large numbers of youth ‘engage with the digital world in a seemingly unconcerned manner’” (Zuboff).
Precontemplation: When we are in the precontemplation stage of change, we don’t think we really have a problem, but we are often being told by outside influences that we do. As Zuboff points out, not many of us really see the true issue behind the challenges of privacy. I don’t think it is the case that people don’t worry about privacy online; I think it is more the case that we don’t understand the full extent to which we lack privacy, or the implications of that lack. It is a safe bet to say that many of us, after reading Zuboff’s article, have moved forward from precontemplation into contemplation. Maybe education does work…
Contemplation: When we begin to understand what Zuboff is saying, the thought enters our minds: “wow, this might just be a problem, how can I fix this?” This is the point at which most people get stuck; it is here that education begins to look less like a knight in shining armor that will save us and more like a nerd wrapped in tin foil that no one wants to listen to. And this isn’t the case for just this topic. Pretty much every issue and problem you can possibly think of faces this sudden halt in reason and logic. Why? Because our brains don’t want to think with their higher-order, logical sections.
The reason it takes so much effort to move past contemplation is that this stage has us bogged down in worrying about what it will be like to give up something that has been serving a purpose. We aren’t using reason; we are using our reptilian and emotional brains, which don’t like to feel uncomfortable. And I can tell you right now, most of us can’t give something up without the hope that something better will fill its place and stop our discomfort. When we consider that “The new tools, networks, apps, platforms, and media” continue to be “requirements for social participation,” as Zuboff writes, the idea of finding alternatives feels hopeless. This is exactly why the “inevitability” of surveillance capitalism that big tech preaches to keep us in contemplation is so powerful. It appeals not to our logic, but to our emotions and our reptilian desire to simply be made to feel good; as much as we might think we are rational creatures, we aren’t ever far from being the lab rat tapping the lever for one more delicious shock to its pleasure centers.
Preparation: So how in the world do we get ourselves past the point of just thinking about change, and plan for real change? How can we overcome not only a social culture of tech we are adapting to, but our own biological impulses and needs that don’t want to give up the tech? There are individuals discussing this very thing and teaching better practices for our relationship with digital tools, but will this be enough?
Though I think education is vital to change, I don’t believe it is enough to push us past preparation into the more sustainable “action” and “maintenance” stages of change. This dynamic relies on the efforts of the few to motivate the masses to change their practices, and without the buy-in and actual work of collective action that Zuboff says is needed to create laws and regulations, we won’t reach the point of holding these companies accountable. There needs to be something more to give us hope that what we give up will be worth what we get in return. Sadly, human nature is more about rewards and punishments than we would like to admit. Or at least that is the outlook of this contemplator.