Racism and Technology are co-produced: a post-humanist perspective

Sonal
9 min read · Oct 15, 2021


Introduction

Technological innovation does not always imply social progress. History is a testament to socio-technical projects with far-reaching ramifications in the political and social milieu. Technology design can engender new issues and propagate existing ones if not examined with a critical lens. Racial discrimination is deeply seated in society; when it enters the design of technology, it not only hinders people of color from reaping the benefits of technological progress but also creates situations of intimidation, humiliation, and menace for them. Historical and contemporary examples of racial bias in technology empower us by making us aware of prevalent social inequity and provide an intellectual grounding to resist the creation and reproduction of such technology. This paper argues that designing racially inclusive technology requires a shift from a techno-determinist approach to a post-humanist one, in order to understand how race and technology are co-produced. A hopeful future of racial inclusivity in technology requires increased representation of people of color in the design and development of technology, and racial literacy not just among tech workers but in society at large.

Understanding Racism

Racism comprises attitudes, actions, and institutions that contribute to relative disadvantage for members of racial groups with comparatively less power. It ranges from overt acts such as hate speech and violence, to systemic exclusions, prejudices, and biases, to subtle, even unconscious acts such as aversive racism and microaggressions [1]. The Pyramid of Hate [2] illustrates the escalating levels of biased, hateful, and oppressive behaviors and attitudes in society. When bias goes unchecked at the lower levels of the pyramid, discrimination becomes normalized. Though not all bias at the lower levels leads to genocide, every genocide has been built on such unchecked biases. We need to challenge biased attitudes and behaviors in ourselves, in others, and in institutions to keep society from being plagued by discrimination and hatred (Figure 1).

Figure 1: Pyramid of Hate — Anti-Defamation League, 2019.

Racial Discrimination in the United States

Racial discrimination against Black people in the United States has its historical roots in slavery. The killing of George Floyd on May 25, 2020 is a stark contemporary example of police brutality toward African Americans. The episode evokes the atrocities that enslaved people endured for centuries as they worked the cotton fields of the South in dehumanizing conditions, whipped, raped, and mentally tortured by their white masters. While Floyd's death happened in the public light, behind closed doors, North Miami police were found in 2015 to be using mug shots of Black men for target practice [3] (Figure 2).

Figure 2: Mug shots of Black men used for target practice by North Miami police
Figure 3: Low-clearance overpasses designed by Robert Moses

When the racism of a society enters the design of socio-technical systems, those systems perpetuate it further. For instance, between the 1920s and 1950s, Robert Moses, the master builder of New York's roads, parks, and bridges, designed around 200 overpasses on Long Island with clearance heights too low for the 12-foot-tall buses used by Black and other poor people, obstructing them from reaching Jones Beach, his well-acclaimed public park. According to evidence compiled by Moses' biographer, Robert A. Caro, the reasons reflect Moses' social-class bias and racial prejudice [4]. Even generations after Moses' death, his projects continue to embed a social divide in present-day New York (Figure 3).

Moving from history to the present day, many examples in healthcare and criminal justice bring to light biases encoded in technology. In December 2020, the New England Journal of Medicine published a research letter showing that pulse oximeters used in Covid-19 care carried an encoded racial bias [5]. In May 2016, ProPublica, a non-profit news organization, analyzed COMPAS, a recidivism-prediction tool used across the United States, and found that it was biased against Black defendants [6]. In the entertainment industry, video games have garnered criticism for intensifying racist stigmas: games like Resident Evil 5, Uncharted, and God of War portray strong white men shooting non-whites [7]. Even speech and facial recognition algorithms have been found to be biased. In March 2020, Stanford researchers found that automated speech recognition systems are more likely to misinterpret Black speakers [8] (Figure 4). In December 2019, researchers at the National Institute of Standards and Technology found that facial recognition algorithms falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces [9].
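The Stanford study quantified this disparity using word error rate (WER): the word-level edit distance between a system's transcript and a human reference transcript, divided by the length of the reference. A minimal sketch of the metric follows — an illustration of how WER is computed in general, not the researchers' own code:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = word-level edit distance / number of words in the reference."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table: d[i][j] = edit distance between
    # the first i reference words and the first j hypothesis words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # deleting all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # inserting all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution / match
    return d[len(ref)][len(hyp)] / len(ref)
```

A WER of 0.0 means a perfect transcript; the Stanford researchers reported roughly double the WER for Black speakers compared with white speakers across the commercial systems they tested.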

Racial bias in technology frequently makes the news, with big tech companies apologizing for racial biases in AI systems that demeaned the Black community. In September 2021, Facebook apologized after its AI system showed users watching a British tabloid's video featuring Black men an automated prompt to 'keep watching videos about primates' [10]. In 2015, Google apologized when its Photos app tagged two Black people as gorillas [11] (Figure 5).

Figure 4: Word error rate (WER) is higher for Black speakers than for white speakers
Figure 5: Google's Photos app tagged Black people as gorillas

Moving forward from the present day, we face an uncertain future of technology. Derrick Bell, an American lawyer and civil rights activist, urges a radical assessment of reality through creative methods and racial reversals, insisting that "to see things as they really are, you must imagine them for what they might be" [12]. A relevant example is Sleep Dealer (Figure 6), a science-fiction film by Alex Rivera that projects a dystopian side of future technology through the story of a young man from Mexico who migrates to the United States not physically but over the internet: with his body connected to the network, he controls a machine that performs his labor in America [13]. The film presents the slavery of the past remodeled for a future mediated by technology.

Figure 6: Sleep Dealer, a dystopian science-fiction film by Alex Rivera

Understanding Agency

The past and present-day examples of racial bias in technology, and these projections about the future, make it imperative to understand 'agency'. A techno-determinist approach, which considers technology the prime actor shaping social relations and causing social change, can be traced in the way the above-mentioned science-fiction film, like many Hollywood movies, portrays a dystopian image of technology in which robots, automation, and emerging technologies lead humanity to its own detriment. A similar techno-determinism can be read even when Silicon Valley portrays a utopian image of technology, selling the idea that technology will make us more efficient and save our lives. Such a view of the vast landscape of technology and society is limited because it obscures the human agents behind technology's development. Since racism and technology are co-produced, we need to view this issue through a post-humanist lens, one that transgresses the technology-society binary and stresses co-agency.

With racial bias found in so many present-day technological applications, the term techno-racism has surfaced. According to Mutale Nkonde, founder of AI For the People, techno-racism describes a phenomenon in which the racism experienced by people of color is encoded in the technical systems used in our everyday lives [14]. The term, first used in 2019 by a member of the Detroit civilian police commission to describe a faulty facial recognition system that could not distinguish one Black man or woman from another [15], gained attention in 2020 as the title of a webinar with Tendayi Achiume, the UN Special Rapporteur on racism. Achiume argues that digital technologies can implicitly or explicitly exacerbate existing biases about race and ethnicity [16]. Here again, a shift in the narrative that moves agency away from technology alone and distributes it between humans and technology is helpful. Perceiving technology and humans as co-agents shines a light on the role of biases in the design of technology and on the need to critically analyze and challenge the racist attitudes and behaviors that create them.

Towards Racially Inclusive Technology

The human-centered design process should reflect upon which humans are prioritized. More people of color need to be involved in user research and usability testing for racially inclusive design. According to Sasha Costanza-Chock, a professor at MIT, we have an ethical imperative to systematically advance the participation of marginalized communities in all stages of the technology design process; through this process, resources and power can be more equitably distributed [17].

The design process also needs to be critically interrogative to check unconscious biases. Both tech workers, who design and develop technology, and society at large need to develop an informed interrogative attitude to prevent the acceptance of discriminatory practices and behaviors. For instance, the top-ranked questions on Slido, an audience engagement tool that allows people to pose questions in real time — 'Why is #DesignSoWhite?' and 'To do design for good, don't we have to talk about the oppressive systems of white supremacy, heteropatriarchy, and capitalism?' — exemplify the interrogative intellect required for racially inclusive technology [18].

Another way of exposing and weeding out discrimination is through deliberate and inventive subversion of the technological status quo [19]. For instance, Hyphen-Labs, an international team of women of color working at the intersection of technology, art, science, and futurism, experiments with a wide array of subversive designs, including earrings for recording police altercations and visors and other clothing that prevent facial recognition [20].

Racism seems to be an inextricable component of social behaviour, but we need to remember that if inequity is woven into the very fabric of society, then each twist, coil, and code is a chance for us to weave new patterns, practices, and politics; its vastness will be its undoing once we accept we are the pattern makers [19].

References:

[1] Ihudiya Finda Ogbonnaya-Ogburu, Angela D. R. Smith, Alexandra To, Kentaro Toyama. 2020. Critical Race Theory for HCI. https://dl.acm.org/doi/10.1145/3313831.3376392

[2] Anti-Defamation League. 2018. Pyramid of Hate. https://www.adl.org/sites/default/files/documents/pyramid-of-hate.pdf

[3] Willard Shepard and McNelly Torres. Jan. 16, 2015. MSNBC. Retrieved from https://www.msnbc.com/msnbc/family-outraged-after-finding-police-using-mug-shots-target-practice-msna506881

[4] Langdon Winner. 1986. Do Artifacts Have Politics? In The Whale and the Reactor. University of Chicago Press.

[5] Michael W. Sjoding, Robert P. Dickson, Theodore J. Iwashyna, Steven E. Gay, Thomas S. Valley. 2020. Racial Bias in Pulse Oximetry Measurement. The New England Journal of Medicine. https://www.nejm.org/doi/full/10.1056/nejmc2029240

[6] Julia Angwin, Jeff Larson, Surya Mattu, Lauren Kirchner. May 23, 2016. Machine Bias. ProPublica. Retrieved from https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

[7] David Hankerson, Andrea R. Marshall, Jennifer Booker, Houda El Mimouni, Imani Walker.2016. Does Technology Have Race? ACM Digital Library. https://dl.acm.org/doi/10.1145/2851581.2892578

[8] Edmund L. Andrews. March 23, 2020. Stanford researchers find that automated speech recognition is more likely to misinterpret black speakers. Stanford News. Retrieved from https://news.stanford.edu/2020/03/23/automated-speech-recognition-less-accurate-blacks/

[9] Natasha Singer and Cade Metz. Dec. 19, 2019. Many Facial-Recognition Systems Are Biased, Says U.S. Study. The New York Times. Retrieved from https://www.nytimes.com/2019/12/19/technology/facial-recognition-bias.html

[10] Dustin Jones. Sept. 4, 2021. Facebook Apologizes After Its AI Labels Black Men As 'Primates'. NPR. Retrieved from https://www.npr.org/2021/09/04/1034368231/facebook-apologizes-ai-labels-black-men-primates-racial-bias

[11] Loren Grush. July 1, 2015. Google engineer apologizes after Photos app tags two black people as gorillas. The Verge. Retrieved from https://www.theverge.com/2015/7/1/8880363/google-apologizes-photos-app-tags-two-black-people-gorillas

[12] Bell 1995, p. 898.

[13] Sleep Dealer. 2008. Directed by Alex Rivera, produced by Anthony Bregman. United States and Mexico.

[14] Faith Karimi. May 9, 2021. People of colour have a new enemy: techno-racism. CNN. Retrieved from https://www.cnn.com/2021/05/09/us/techno-racism-explainer-trnd/index.html

[15] Tom Perkins. 17 Aug 2019. It's techno-racism: Detroit is quietly using facial recognition to make arrests. The Guardian. Retrieved from https://www.theguardian.com/us-news/2019/aug/16/its-techno-racism-detroit-is-quietly-using-facial-recognition-to-make-arrests

[16] Tendayi Achiume. 23 July 2020. Techno-Racism and Human Rights: A Conversation with the UN Special Rapporteur on Racism. Retrieved from https://chrgj.org/event/techno-racism-and-human-rights/

[17] Sasha Costanza-Chock. 2018. Design Justice: Towards an Intersectional Feminist Framework for Design Theory and Practice.

[18] @thisisDRS. June 26, 2018, 5:25 a.m. Twitter.

[19] Ruha Benjamin. 2019. Race After Technology: Abolitionist Tools for the New Jim Code.

[20] Hyphen-Labs. http://www.hyphen-labs.com
