On a surface level, I was always aware that algorithms shape our online experience, that we produce impressive amounts of data every day, and that big companies use this data for profit. However, I never thought carefully about what all of this actually means. The We Are Data readings pushed me to think more deeply about what goes on behind the scenes of web surfing, social media usage, and my overall online presence.
We have been forced into a system where companies like Google, Microsoft, and Yahoo have taken our privacy rights and can observe and control us, but because it was all “voluntary” on our part, they technically can’t be blamed. We knowingly log onto their servers every day, accept terms & conditions without a second thought, and surf the web constantly (Cheney-Lippold, 2019). However, calling these actions “voluntary” is a huge stretch. You can’t check your email (a necessity for most jobs) without logging onto a Google, Microsoft, or Yahoo server; you can’t own an iPhone without agreeing to terms & conditions; and you can’t perform everyday activities without surfing the web (learning new information, connecting with friends, shopping, etc.). We involuntarily voluntarily give away our privacy rights.
Also from the readings, the notion that we mainly exist in this world as what our data says we are is an eye-opening perspective. Our data makes up an algorithmic identity for us, one we have no control over. Google assigns us an identity based on our internet activity, and our experience on the internet is subsequently constructed around it. If my activity on Google resembles what Google considers typical of a man’s internet activity, Google will decide I am a man and shape my search results and advertisements accordingly. “Google’s gender is a gender of profitable convenience. It’s a category for marketing that cares little whether you really are a certain gender, so long as you surf/purchase/act like that gender” (Cheney-Lippold, 2019, p. 7). Our online identities are decided by marketers, governments, and advertisers; they are fueled by money and power and unconcerned with who we really are.
The idea that we don’t exist in this world as humans, but rather as compilations of data, is an eerie one. Our value to the world lies largely in our data, and our data decides much of our status in the world. Our gender, ethnicity, age, likelihood of criminal activity, terrorist status, and more are determined by algorithms based on the data we produce. We essentially have no say in who we are. The term datafication, or “the transformation of part, if not most, of our lives into computable data,” describes this phenomenon (Cheney-Lippold, 2019).
Datafication can be problematic because it ignores the discriminatory social and economic structures that have been present throughout history. The classifications algorithms make of people are built atop these problematic systems, and they can perpetuate stereotypes or cause certain groups to be treated unfairly. This can be seen in HP’s facial recognition software, which could not detect a Black man’s face; the software carried an unintentional presumption that whiteness was the norm, exemplifying structural white privilege (Cheney-Lippold, 2019). Another example is the Chicago PD’s use of data to create a “heat map” of people deemed at risk of engaging in criminal activity. It is impossible to separate the police department’s history of racist practices from the development of this heat map. Consequently, certain groups are more likely to face harsher treatment from law enforcement than others (Cheney-Lippold, 2019).
Even though I learned a lot from these readings, I don’t think my personal or professional use of social media will change. This is the way the world is: big companies know and use my information, my online identity is shaped by my everyday actions, and I have no control over who the internet thinks I am or assigns me to be. The ethics of this can be debated heavily, and I agree that many aspects of how data is used are unethical. For example, as previously discussed, we have no choice but to give up our privacy rights, and discriminatory practices are perpetuated through the use of data to define people. However, things will not change any time soon. We rely heavily on technology, and its necessity in society is only increasing.
Cheney-Lippold, J. (2019). Introduction. In We are data: Algorithms and the making of our digital selves (pp. 3–36). New York University Press.
Commented on: Brittney Sposito, Julia Wisk, Mia Eifrid, Emmanuel Amula