Tech&Innovation

How Racism is Coded into Technology

The keynote speaker at this year’s annual conference of the International Neuroethics Society gave a riveting presentation on how racism is deeply entrenched in numerous technologies, including popular applications, sophisticated algorithms, and services believed to be neutral or even helpful. Ruha Benjamin, a sociologist and professor in Princeton University’s Department of African American Studies, has dubbed this the “New Jim Code” to signify that it is, as she puts it, today’s iteration of the Jim Crow laws that persisted in the South for almost a century after the Civil War.

Benjamin gave the annual Fred Kavli lecture on Friday, Oct. 23, the culmination of the two-day conference. To ensure participants’ safety, the meeting was conducted online this year; attendance was twice that of prior meetings, with participants joining from 30 different countries. Benjamin brings a fresh perspective to the topic because of her unique background. According to Princeton’s public affairs office, her mother’s family is of Iranian descent, while her father’s is African American. She was born in India and moved to the United States with her family as a child, and she was raised in the Baha’i faith. She earned a bachelor’s degree from Spelman College in Atlanta and a doctorate from the University of California, Berkeley.

Benjamin provided many startling illustrations of how “anti-Blackness” infiltrates all major sectors of American culture. The concepts she presented were almost certainly familiar to people who have read her books or her previous commentary, but I doubt that everyone who registered for the meeting was acquainted with all of them.

One of the first examples she gave was an extremely tragic episode revealed in a newspaper article in 2015: the North Miami police department had been using mug shots of Black men as target practice, a previously concealed instance of the anti-Black attitudes that still shape enforcement practices. While it was distressing that their faces were being used as targets, there was a silver lining: a group of clergy volunteered their own faces as targets, saying “use me instead.” That resistance was a major step in demonstrating the senselessness of what the North Miami police were doing.

One of my personal favorites among her examples is a clever response to predictive policing systems, which use police data to identify where crime is supposedly likely to occur (usually in lower-income Black neighborhoods, which are more heavily policed to start with). In 2017, a group of technologists built a rebuttal: using data from a financial industry regulator, they mapped the areas where white-collar crime is most likely to occur. Users of their app could receive alerts before entering a white-collar-crime risk zone, and the team generated composite pictures of the most probable offenders from photos of business leaders, who are predominantly white men. The project lead raised the point in an interview at the time: “What if police went to financial districts and stopped and frisked white men in business suits?”

Benjamin also cited two academic studies showing how hard entrenched biases are to eradicate. In one, teachers were asked to watch video clips of preschoolers in a classroom and look for behaviors that might lead to expulsion from the classroom. Eye-tracking equipment showed that the teachers spent more time gazing at Black boys than at white children, even though the children were actors and the videos showed no misbehavior at all. The lead researcher told a reporter at the time: “Implicit prejudices don’t begin with Black males and the police. They begin with Black preschoolers and their teachers.”

The second study found that when white participants were exposed to information about the grossly disproportionate numbers of Black people in prison, they became more supportive of the very policies that contribute to those disparities, including California’s Three Strikes Law and New York City’s stop-and-frisk policy. Exposure to the statistics also heightened white participants’ fear of Black offenders and their sense that those offenders “deserve to be incarcerated.”

Benjamin was cautious about the idea that racism would be simple to fix technologically. She referred to the book Race on the Brain: Bridging the Gap Between Neuroscience and Society by Jonathan Kahn, which argues that implicit-bias research has been joined with neuroscience studies that use imaging technologies to locate racially coded reactions in distinct brain regions. This “pills for racism” approach treats racism as something that can be medicated away, without taking into consideration the cultural, historical, and social factors that are its main causes.

She also mentioned a widely used health care algorithm. The algorithm tried to predict which patients needed extra attention to prevent hospitalization, but it failed because it mistakenly used health care costs as a proxy for illness. Because Black patients on average incur lower medical costs than white patients, the software ended up giving precedence to white patients over Black patients who were equally or more ill. Millions of individuals were negatively impacted.
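The failure mode described above can be made concrete with a minimal sketch. The patients, numbers, and scoring rules below are all hypothetical, invented purely to illustrate how ranking on a proxy (recorded spending) rather than the quantity of interest (illness severity) can systematically disadvantage one group:

```python
# Hypothetical patients: (group, illness_severity, annual_cost).
# The two groups are equally sick, but unequal access to care means
# the Black patients generate lower recorded spending.
patients = [
    ("white", 7, 9000),
    ("white", 5, 7000),
    ("black", 7, 5000),   # same severity, lower recorded cost
    ("black", 5, 3500),
]

def rank_by_cost(patients):
    """Proxy-based ranking: treats higher spending as higher need."""
    return sorted(patients, key=lambda p: p[2], reverse=True)

def rank_by_severity(patients):
    """Ranking on the quantity the algorithm actually cares about."""
    return sorted(patients, key=lambda p: p[1], reverse=True)

# The cost proxy places every white patient ahead of every Black
# patient, even though severity is identical across the groups.
print([p[0] for p in rank_by_cost(patients)])

# Ranking directly on severity interleaves the two groups.
print([p[0] for p in rank_by_severity(patients)])
```

The bias here is invisible if you only inspect the code: neither function mentions race, yet the cost-based ranking reproduces the disparity baked into the data.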

To make the argument, Benjamin also cited The Protest Psychosis: How Schizophrenia Became a Black Disease, released in 2010. The book’s author, a physician and cultural critic, presented in interviews at the time his conclusions from a decade’s worth of records from a Michigan mental hospital. Schizophrenia used to be thought of as an illness mostly affecting white petty criminals, but in the wake of the riots and civil rights demonstrations of the 1960s, it came to be commonly associated with Black men perceived as hostile, aggressive, and violent. The diagnoses changed amid a social environment that had obviously altered.

The issue of algorithmic bias is rising to prominence as an exceptional challenge in healthcare and medicine, according to Steven Hyman, Ph.D., a distinguished service professor at Harvard and chairman of the Dana Foundation board. He complimented Benjamin for providing a sophisticated point of view on the problems. She makes a crucial point, he said, in that humans often see resource allocation by algorithm as objective and fair because they believe that computers are devoid of goals and free of emotion. “However, since these huge datasets used to train machine learning models are created by capturing our imperfect reality with its numerous hidden biases,” machine learning “may in fact be universalizing those biases and making them more difficult to detect.”

Is there anything we can do to rid our technology of concealed racism?

Benjamin suggested that neuroethics may have a key role to play in altering or challenging people’s beliefs, since it can reveal the “complicity” of science and neuroscience in contributing to these erroneous perspectives. She believes that humanists and social scientists, not just technical experts, should be part of the design process as well.

Teaching individuals both inside and outside of the tech industry how racism operates in society and in technology, and how to remove it, is one way to become part of the solution. Benjamin herself founded the innovative Just Data Lab at Princeton, which brings together a range of collaborators to rethink the data required for justice. In partnership with community groups, the lab is creating a “pandemic portal” to monitor and respond to the pandemic’s racial dimensions and to help alleviate existing disparities.

Many organizations and businesses have shown their support for social justice and racial equality by issuing public statements in favor of the Black Lives Matter movement. How to translate those statements into action, however, is what she called “the million-dollar question.”

Benjamin argued that America became a racist country by “insignificant degrees,” and that Jim Crow will not be erased by grand gestures. Instead, advocates need to look closely at the ways that daily practices and patterns, including the way job openings are advertised, perpetuate racism, along with the unconscious racist attitudes that are handed down from one generation to the next. Do not expect grand, flowery words to cure racism, she said; the work begins now, with counteracting the thinking patterns that affect us.

Reference: https://dana.org/article/baked-in-how-racism-is-coded-into-technology/

Categories: Tech&Innovation