Racialized Ways of Machine Seeing

Porn (Corpus: The Humans), Adversarially Evolved Hallucination
by Trevor Paglen

I’ve recently been using FaceOSC, a face-tracking application that detects a face through the webcam and sends the tracking data out as OSC messages to other programs. FaceOSC doesn’t work reliably when I have my glasses on, which makes programming with it hard: I have to take my glasses off and keep my face in front of the camera, so I can’t read the screen at the same time. This experience of unreadability came to mind when I read this week’s texts on machine seeing.
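
For anyone unfamiliar with how FaceOSC “outputs to various programs”: it broadcasts OSC messages that any OSC-aware program can listen to. Below is a minimal sketch of such a listener in Python using the python-osc package; the addresses and the default port 8338 are what I recall from FaceOSC’s documentation, so treat them as assumptions rather than a spec.

```python
# Minimal sketch: listening to FaceOSC over OSC with python-osc.
# The addresses and port below are assumptions based on FaceOSC's docs.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_found(address, found):
    # FaceOSC reports /found 1 when it locks onto a face, 0 when it loses it.
    print("face found" if found else "face lost")

def on_eye(address, openness):
    # Eye-openness gestures arrive as a single float per eye.
    print(f"{address}: {openness:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/found", on_found)
dispatcher.map("/gesture/eye/left", on_eye)
dispatcher.map("/gesture/eye/right", on_eye)

server = BlockingOSCUDPServer(("127.0.0.1", 8338), dispatcher)
server.serve_forever()  # Ctrl+C to stop
```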

Ten years ago, I remember the chagrin among my Asian friends when news came out that cameras’ blink-detection features sometimes couldn’t recognize our eyes as open and would ask whether someone had blinked. I found a TIME article from back then that gave a technical explanation:

“The constant flow of images is usually too much for the software to handle, so it downsamples them, or reduces the level of detail, before analyzing them. … An eye might only be a few pixels wide, and a camera that’s downsampling the images can’t see the necessary level of detail.”
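
To get a feel for the arithmetic behind that explanation, here is a toy sketch with Pillow. The resolutions and eye width are illustrative numbers I picked, not figures from the article.

```python
# Toy illustration of why downsampling erases small features like eyes.
# All sizes here are made-up placeholders.
from PIL import Image

frame = Image.new("RGB", (4000, 3000))   # stand-in for a full-resolution photo
analysis = frame.resize((320, 240))      # the detector works on a small copy

scale = 320 / 4000                       # 0.08
eye_width_px = 100                       # how wide an eye might be in the original
print(f"eye after downsampling: ~{eye_width_px * scale:.0f} px wide")
# ~8 px: barely enough detail to tell an open eye from a closed one
```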

In the TIME article, the founder of 3VR, a company that makes face recognition for security cameras, claimed that “A racially inclusive training set won’t help if the larger platform is not capable of seeing those details.” This struck me as a convenient excuse: how likely was it that they had a racially inclusive data set in the first place? Had they included Asian people in the development process who could have spotted this problem and pushed to fix the technology?

In his essay “Ways of Machine Seeing,” Geoff Cox writes that “‘machine learning’ techniques are employed on data to produce forms of knowledge that are inextricably bound to hegemonic systems of power and prejudice.” Ramon Amaro explains how this applies to face detection in his essay “As If”:

“… common facial detection libraries are often trained on normalized spectrums of data that are prone to false negatives without proper light conditions. In other words, they are trained on image data that includes primarily white subjects. The white phenotype then becomes the pre-existing condition and the prototypical assemblage from which all future human characteristics are measured.”
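
One way to make that claim about false negatives empirically checkable is to break a detector’s miss rate down by group on a demographically annotated evaluation set. Here is a minimal sketch, with made-up placeholder results standing in for real detector output.

```python
# Sketch: per-group false-negative rates for a face detector.
# The tuples below are placeholder data; in practice they would come from
# running a detector over an annotated evaluation set.
from collections import defaultdict

# (group, face_present, face_detected) for each evaluation image
results = [
    ("group_a", True, True),
    ("group_a", True, False),
    ("group_b", True, True),
    ("group_b", True, True),
]

faces = defaultdict(int)
misses = defaultdict(int)
for group, present, detected in results:
    if present:
        faces[group] += 1
        if not detected:
            misses[group] += 1

for group, total in faces.items():
    print(f"{group}: false-negative rate = {misses[group] / total:.0%}")
```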

Furthermore, machine seeing reduces individuals to a norm. Amaro writes that “the basis of simulation here characterizes the living as an emanation of pre-existing conditions, reducing the operation of individuation, and primarily the differences amongst the living, to no more than an assemblage of contradictions that are negated and subsumed into a higher, more homogenous, unity of existence.” Two eyes, two eyebrows, a nose, and a mouth.

For my Machine Learning module this week, we had to pick an artwork that uses machine learning. I chose Trevor Paglen’s Adversarially Evolved Hallucinations, a series in which the artist trained an AI to recognize images from a corpus and then used it to generate new images. The image above was generated by an AI trained on the corpus “The Humans.” When I saw these works in a gallery, I was taken by their glitchy beauty, but after reading this week’s texts I realized that all of the figures in the images generated from “The Humans” appear white.
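
For a sense of what “train on a corpus, then generate new images” can mean mechanically, here is a very loose sketch of one adversarial training step in PyTorch. This is a generic illustration of the GAN idea, not Paglen’s actual pipeline, and the architecture and hyperparameters are arbitrary placeholders.

```python
# Loose sketch of a GAN training step: a generator learns to produce images
# that a discriminator, trained on a corpus, can no longer tell from real ones.
# Shapes and hyperparameters are arbitrary placeholders.
import torch
import torch.nn as nn

latent_dim, img_dim = 64, 28 * 28

generator = nn.Sequential(
    nn.Linear(latent_dim, 256), nn.ReLU(),
    nn.Linear(256, img_dim), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(img_dim, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1),
)

opt_g = torch.optim.Adam(generator.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCEWithLogitsLoss()

def train_step(real_images):
    """One adversarial update; real_images has shape (batch, img_dim)."""
    batch = real_images.size(0)
    fake_images = generator(torch.randn(batch, latent_dim))

    # Discriminator: push real images toward 1, generated images toward 0.
    opt_d.zero_grad()
    d_loss = (loss_fn(discriminator(real_images), torch.ones(batch, 1)) +
              loss_fn(discriminator(fake_images.detach()), torch.zeros(batch, 1)))
    d_loss.backward()
    opt_d.step()

    # Generator: try to make the discriminator score its images as real.
    opt_g.zero_grad()
    g_loss = loss_fn(discriminator(fake_images), torch.ones(batch, 1))
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```

Whatever corpus the discriminator sees bounds what the generator can imagine; if the images gathered under “The Humans” skew white, so will the hallucinations.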

In “Porn,” we see the object of desire abstracted but still recognizable as a white woman. Since the training data is pulled from porn, this is expected, yet it is still symbolic of the ways that white women are seen as sexualized bodies. For the kind of intersectional analysis Safiya Umoja Noble describes in “A Future for Intersectional Black Feminist Technology Studies,” we must pay attention to the ways in which machine seeing encodes gender and racial identities together. The computer can’t see me as both sexualized and asleep.

References

Amaro, Ramon. “As If.” e-flux architecture vol. 97, Feb. 2019. https://www.e-flux.com/architecture/becoming-digital/248073/as-if/.

Cox, Geoff. “Ways of Machine Seeing.” Unthinking Photography, Nov. 2016. https://unthinking.photography/articles/ways-of-machine-seeing.

Noble, Safiya Umoja. “A Future for Intersectional Black Feminist Technology Studies.” Scholar & Feminist Online, Issue 13.3 – 14.1, 2016. http://sfonline.barnard.edu/traversing-technologies/safiya-umoja-noble-a-future-for-intersectional-black-feminist-technology-studies/.

Rose, Adam. “Are Face-Detection Cameras Racist?” TIME, 22 Jan. 2010. http://content.time.com/time/business/article/0,8599,1954643-1,00.html.