
Finding a Path through Surveillance Capitalism

This week, I’ll be attempting to lead us through Shoshana Zuboff’s article “Big Other: Surveillance Capitalism and the Prospects of an Information Civilization” and her video interview with Channel 4 News “on ‘surveillance capitalism’ and how tech companies are always watching us.” For both my blog and my pathfinding activities, I’ll be focusing mostly on the article, but I’ll occasionally make references to the video, as well. 

I’d like to start by deciphering exactly what Zuboff means by the term “surveillance capitalism.” As we discussed last week in our conversation about Zeynep Tufekci’s TED Talk, big tech companies like Facebook and Google are constantly collecting our data and selling it to advertisers. Zuboff explains that this new form of commerce is fundamentally different from the capitalism of the 20th century because we are no longer the consumers—advertisers are. Instead of being consumers, then, users of Google, Facebook, Twitter, etc., are the product. Our data is the commodity that’s being bought and sold. 

This worldwide shift to surveillance capitalism has far-reaching effects. For example, Zuboff describes how the nature of contracts is changing. There is no longer any trust between the parties or room for “human fallibility” in contracts because employers are aware of every action their employees take (81). While reading this section, I couldn’t help but think of the horrendous working conditions in Amazon warehouses. Workers claim that Amazon keeps track of their movements and fires anyone who can’t keep up with the strenuous workload; the company has fired workers for taking too many seconds to cross the length of the warehouse and pregnant women for taking too many bathroom breaks. Amazon treats its employees’ labor as “machine processes,” which, as Zuboff describes, costs the workers their freedom (81). 

Another facet of surveillance capitalism is the change in the relationship between workers and consumers. Zuboff compares Silicon Valley to the automotive industry to demonstrate that, unlike in 20th century capitalism (in which middle class workers were also consumers of the goods they produced), under surveillance capitalism, tech companies “have little interest in [their] users as employees” because so many of their processes are automated that they’re able to make as much (if not more) money as traditional companies but with fewer employees and lower costs (80). 

The idealist in me wants to see this type of automation as a good thing—less need for labor-intensive jobs could mean more time for people to pursue their passions and find work that provides a higher quality of life—but our current society is severely lacking the safety nets and social support that would be necessary to accommodate the job loss that accompanies automation. As it stands now, all this automation means is that under surveillance capitalism, the small number of employees at tech companies will accumulate a much higher concentration of wealth than the users who populate their platforms.  

These users are not only missing out on the capital generated by surveillance capitalism, but they’re also lacking the knowledge necessary to gain wealth in this system. In her interview with Channel 4 News, Zuboff explains that “knowledge equals control.” Big tech companies accumulate almost limitless knowledge about us by collecting our data, but “we know nothing about them.” She provides examples of how “[s]urveillance capitalism thrives on the public’s ignorance” (83), such as when Google’s Street View cars secretly scraped data from WiFi networks and only stopped the practice after being forced to pay a settlement (78). Google has repeated this strategy of conducting secret, unethical data collection numerous times, and the company only stops after new laws are enacted. 

After reading these stories about repeated ethical violations, I can’t help but feel resigned to losing my privacy. In the video interview, Zuboff suggests the solution is to enact laws that will regulate tech companies, but, as I’ve mentioned in previous blogs, our government doesn’t comprehend the magnitude and scope of the surveillance capitalism problem. How can we regulate these tech companies when we don’t even know about the unethical practices they’re planning next? As Zuboff says, these companies “outrun public understanding” (83), so how can a population that barely understands the problem hope to solve it? 

Stopping surveillance capitalism isn’t as simple as ditching our devices or boycotting Google. In the video, Zuboff clearly explains that, because more people are working longer hours for less money (something I’ve also touched on in previous blogs), we need the convenience that these online services provide. It’s also impossible to go off the grid, because these companies even have access to security footage from cameras in public places. We can’t escape the “ubiquitous networked institutional regime that records, modifies, and commodifies everyday experience” that Zuboff calls “Big Other” (81), so we’re always being watched.

This constant surveillance has the power to change our behavior. As Zuboff describes this “anticipatory conformity,” I couldn’t help but think of the “Zoom Gaze” and how the knowledge that we’re being watched affects our actions, words, and emotions. Zoom shattered our illusions of privacy by revealing intimate details of our home lives, but tech companies were stealing our private information and using it to affect our behavior long before the pandemic. 

This power to change our behavior concerns me greatly. We touched on this topic in last week’s class discussion, but it’s worth mentioning again because it has so many sinister implications. Zuboff’s example of using Pokémon GO to drive foot traffic to specific businesses doesn’t sound too dystopian, but there’s a real danger that the principles of surveillance capitalism have invaded our political system and influenced elections. As I did in last week’s blog, I must once again question how we can be sure our beliefs and behaviors are ours rather than a result of the information the algorithms choose to feed us. 

Despite the length of this post, I realize I haven’t touched much on the post-pandemic university. Truthfully, I’m a little reluctant to narrow my focus to academia because the effects of surveillance capitalism reach much wider than the university setting and have more significant impacts on nearly every facet of our lives. However, it is still worth questioning what data universities are collecting on their students and whether schools will use that information to engage in the “behavioral modification” Zuboff describes (82). If universities are using platforms like Blackboard or Canvas to collect our data, how might they use that data to influence our behavior and increase their own profits?
