Ch 10: Race & Technology Field Notes

Diversity, Equity, and Humanity

I’d like to start this week’s blog by sharing the gif I made of a scene from the Screening Surveillance film “Frames.” This short film shows a dystopian society in which surveillance cameras and facial recognition technology track your every move. It presents an uncomfortably plausible future (in fact, the social credit system the video hints at already exists in China), and it strengthens the points made in this week’s videos about the need for equitably designed facial recognition software (but more on that later). I know we were supposed to add text to the gif, but part of the reason I found this scene so powerful is because there’s no dialogue; the cinematography speaks for itself. I didn’t want to cheapen the power of this frame by adding a needless caption.

And now I’m going to contradict myself by spending a whole paragraph explicating this scene that “speaks for itself.” The wide, overhead view; the barren trees and cold, snowy field; and the single, slow-moving figure in this frame all speak to a sense of intense loneliness. This particular shot evokes those feelings of disconnect we discussed during class last week: the fact that we’re constantly being surveilled is making it harder for us to form genuine relationships (ones that aren’t based on the facade we paste all over social media).

The feelings of isolation present in “Frames” also connect to the two videos we watched for class this week: Joy Buolamwini’s TED Talk “How I’m fighting bias in algorithms” and Ruha Benjamin’s “Viral Justice: Pandemics, Policing, And Portals With Ruha Benjamin.” Both videos explore how algorithms, artificial intelligence, and facial recognition software replicate and perpetuate existing racial biases. Buolamwini shares her own experience with facial recognition software that couldn’t recognize her dark skin but instantly recognized her white colleagues’ faces. Although I can’t speak to this issue from personal experience, Buolamwini’s description makes the entire situation seem isolating and dehumanizing.

These unfeeling AIs and algorithms deem an entire race’s faces unrecognizable as human because the developers didn’t have people with darker skin in mind when making this technology. To think that these programmers simply forgot such a large portion of the population while coding this tech is disturbing for a lot of reasons. The videos cover many of these frightening implications, such as the growing prevalence of flawed facial recognition technology in police departments, but what disturbs me most is the obvious lack of empathy in the people who developed these programs.

Not only, as the videos suggest, do we need more diverse training sets to ensure facial recognition technology can recognize all faces, but we also need more diverse programmers and coders to develop this tech. In the post-pandemic university, I’d like to see more recruitment of diverse students into STEM majors, but I think that’s just one piece of the solution. As our awareness of bias in technology grows, universities should be recruiting more liberal arts and humanities majors as well. Currently, I think our society pushes STEM majors as the only useful degrees because they’ll train you to work in tech, or be an engineer, or learn to code, or find some other lucrative career, while those “worthless” humanities majors get stuck working at Starbucks and regretting their life decisions.

Obviously, I disagree with this mindset, not only because I’m currently studying English Writing, but also because a society full of only STEM majors would be dangerous. Both of this week’s videos show how easily unempathetic programmers’ biases can seep into the software they’re developing. We therefore need more humanities majors—people who have studied the long histories of oppression of different minority groups, who are cognizant of society’s biases and their own, and who can empathize with diverse peoples and cultures—to join the tech industry. If the developers of this flawed facial recognition software had hired even one historian, sociologist, or artist, that liberal arts major could have easily spotted the glaring omission of a diverse testing set or pointed out the development team’s lack of diversity. My hope for the post-pandemic university is that academia will recruit diverse students into fields that focus on equity and empathy to prevent these algorithmic biases from further infecting our society.
