I’m going to get baking ads now, aren’t I?

When I encounter a new topic that I have no foundational information about, I start to feel the sensation of panic that arises from *gasp* not being knowledgeable about something. This is a deep insecurity that I mask with baking and with researching in a chaotic fashion that usually leads to more questions than answers. Yet out of the swirl of chaos and carbs, a technique of learning has emerged that seems to be mellowing with age – whether it can be deemed good or bad is yet to be determined. This ‘technique’ is to simply expose myself to as many different explanations of the same topic as I can until I start to grasp its essence. Revolutionary, I know.

As we progress through this class and are exposed to the digital realm and all its dark corners, I have retreated to this learning method to try to grasp some of the topics I have never had any real exposure to. When I listened to Tufekci’s TED Talk on how the algorithms that target us with ads are also being used in more sinister ways, I found that for once I was familiar with some of what she was discussing.

I mentioned at the beginning of the semester that I decided to switch over to #netnarr partially because of a New York Times podcast called Rabbit Hole. This podcast does a deep dive into the way our internet use impacts us, with a very focused segment on the role of YouTube and its algorithms. It was no surprise to me, then, when Tufekci discussed the way YouTube will take one thing we are looking at and throw us into the depths of the more extreme if we let the autoplay run for a little too long. Her approach to the conversation helped me understand the relationship between the ads, the videos that get chosen, and the algorithm at play behind both. As she explains it, “The same algorithms set loose upon us to make us more pliable for ads are also organizing our political, personal and social information flows…”
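Since my learning technique is all about seeing the same idea explained in as many ways as possible, here is that autoplay drift one more way: as a toy script. To be clear, this is a made-up sketch of the dynamic Tufekci describes, not YouTube’s actual system; the videos, the “intensity” scores, and the predicted watch times below are all invented for illustration.

```python
# Toy sketch of an engagement-maximizing autoplay loop -- NOT YouTube's
# real recommender. Every video, "intensity" score, and predicted watch
# time below is invented for illustration.

# Assume (for the sake of the toy) that more intense content tends to
# hold attention longer, so its predicted watch time is higher.
CANDIDATES = [
    {"title": "Jogging tips",               "intensity": 1, "watch_time": 3.0},
    {"title": "Marathon training",          "intensity": 2, "watch_time": 4.0},
    {"title": "Ultramarathon obsession",    "intensity": 3, "watch_time": 5.0},
    {"title": "Extreme fasting protocols",  "intensity": 4, "watch_time": 6.5},
    {"title": "Fringe health conspiracies", "intensity": 5, "watch_time": 8.0},
]

def autoplay_next(history, last_intensity):
    """Recommend only content 'adjacent' to what was just watched (within
    one intensity step), then greedily pick the longest predicted watch."""
    pool = [v for v in CANDIDATES
            if v["title"] not in history
            and v["intensity"] <= last_intensity + 1]
    return max(pool, key=lambda v: v["watch_time"])

history, intensity = ["Jogging tips"], 1   # the innocuous starting search
for _ in range(4):
    nxt = autoplay_next(history, intensity)
    history.append(nxt["title"])
    intensity = nxt["intensity"]
    print(f"autoplay -> {nxt['title']} (intensity {intensity})")

# Each pass lands one notch further out: the drift toward the extreme is
# a side effect of optimizing watch time, never an explicit goal.
```

Run it and the queue walks from jogging tips to fringe conspiracies in four clicks – not because anything in the loop “wants” extremism, but because each greedy pick for watch time happens to sit one notch past the last one.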

The whole thing left me thinking that the messages of my youth about certain things being “gateways” into worse things weren’t as far off as I’d come to believe. Now, I’m not trying to fall into the Orwellian trap that Tufekci discusses; I agree with her 100% that building AI boogeymen doesn’t help with identifying and preventing the real threats that are at play. But the “basic cheap manipulation” that we have become used to in our online lives is scary once you realize that more is going on within the mechanisms of these annoying ads than we think. Especially when we consider that these algorithms are targeting our perception of reality.

Most of us already know that what the internet shows us is a curated version of reality, but knowing about something and not being impacted by it are two different things. Tufekci captures this fine distinction in a frightening way when discussing how information shows up in our feeds: “As a public and as citizens, we no longer know if we’re seeing the same information or what anybody else is seeing…” We may all know that we are being targeted in different ways and shown different information, but do we really consider how that impacts us as individuals and as a society? Are we paying close enough attention to where that idea we had came from, or to who we are being led to see ourselves and others as?
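The same point, in toy-script form one more time (again, everything here – the articles, the interest profiles, the scoring – is made up to illustrate the idea, not how any real feed works): two people drawing from an identical pool of stories still end up with different front pages, with no way to see each other’s.

```python
# Toy sketch of why we "no longer know if we're seeing the same
# information": identical article pool, different interest profiles,
# different feeds. All data and scoring here are invented.

ARTICLES = [
    {"headline": "City budget passes",        "topics": {"local": 1.0, "politics": 0.5}},
    {"headline": "New border policy debated", "topics": {"politics": 1.0, "national": 0.8}},
    {"headline": "Bread prices rising",       "topics": {"economy": 1.0, "local": 0.4}},
    {"headline": "Sourdough starter basics",  "topics": {"baking": 1.0}},
]

def rank_feed(profile):
    """Score each article by its overlap with the user's interest weights."""
    def score(article):
        return sum(profile.get(t, 0.0) * w for t, w in article["topics"].items())
    return sorted(ARTICLES, key=score, reverse=True)

alice = {"politics": 1.0, "national": 0.6}   # the news junkie
bob   = {"baking": 1.0, "local": 0.3}        # me, apparently

for name, profile in [("alice", alice), ("bob", bob)]:
    top = rank_feed(profile)[0]
    print(f"{name}'s top story: {top['headline']}")

# Same pool, same moment, different "front page" -- and neither user
# has any way of knowing what the other one is being shown.
```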

Though Tufekci presents this as a distinctly artificial intelligence issue, I once again turn to the notion that none of this is new. The scale at which it can be conducted is different – which is Tufekci’s point – but the idea that reality is being fed to us in different ways by different sources? That’s old news. The history of the education system is a great example. Education has been conducted in such a way that certain scholars and viewpoints are specifically picked for institutions to adhere to, and those viewpoints are then planted in the fertile minds of the students who come to those institutions. Institutions of higher education may strive for individuals who think critically for themselves and come to their own conclusions about issues, but there will always be the heavy lean of the institution’s or scholar’s perspective guiding how those issues are approached. When you consider that education has always been part of crafting a version of reality, and combine that with the new approach of using technology to craft reality in ways that are far more insidious and covert, you get a lot to consider for the post-pandemic university.

Though the crafting of reality in the learning process is ripe for discussion, I find a more mundane example to be somehow more insidious: the reality of the university culture and student body. This came up last week when danah boyd brought up the question of admissions and who is allowed to attend a university. We already know that universities can turn away people whose affiliations or ideologies (i.e., realities) don’t align with their missions (or the missions of those funding them); but what if the ability to pick and choose people went beyond looking up applicants’ social media profiles and became part of the data mining and selling that Tufekci is talking about? The fact that this is already being done to recruit students to universities shows that it really isn’t a matter of “if” but “how” in the post-pandemic university. How are we going to see data on students applied to the reality the university wants to craft? On a more practical level, how will the laws around student information and FERPA be impacted by what seems to be a whole new set of private information on students? At the end of the year, can I request a transcript of all the information that was collected on me and ask that it be erased from the college’s records so they can’t sell it? How would I feel knowing I fit the algorithm for a university, and what would that say about me?

Again, the more I look into things, the more questions I seem to find. I have a great deal of baking and researching ahead of me as I continue to explore.
