Measuring bias: Course helps students better identify discrimination

"Signals, Data & Equity” challenges students to analyze commonly used technologies and systems in order to highlight bias found within them

Danielle Lacey 

“Signals, Data and Equity,” a course offered by the McKelvey School of Engineering, aims to help students better identify potential elements of oppression, such as sexism, racism and xenophobia, in technology and systems.

The course, taught by Neal Patwari, professor of electrical & systems engineering and of computer science & engineering, requires students not only to analyze current technologies for potential biases, but also to research ways to keep those same biases out of future innovations. Throughout the semester, students worked in groups to identify or design solutions for bias in commonly used technologies.

In the 2022 fall semester, the course was selected to receive a Rotating Undergraduate Studio grant from the Center for the Study of Race, Ethnicity & Equity, which allowed Patwari to bring more voices into the course’s discussion.

“I used the funding to bring in speakers such as Os Keyes from the University of Washington and Timnit Gebru from the Distributed AI Research (DAIR) Institute, two of the foundational authors in this area,” Patwari said. “The students loved the speakers and mentioned their work many times throughout the semester. It was an amazing opportunity to have them present.”

We spoke with three groups who took the course in fall 2022 to learn which problems they sought to solve and how the course impacted their understanding of more equitable engineering.

 

The Humans Behind the Numbers


Kyerra Norton, a junior majoring in computer science, researched predictive policing algorithms and their impact in cities such as Atlanta and Los Angeles after being introduced to the topic in “Citizen Scientist,” a College Writing Program course she took the previous semester.

According to Norton, predictive policing algorithms use historical crime data to help law enforcement agencies and governments determine where to deploy police or estimate where crime is more likely to happen.
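As a rough illustration of the mechanism Norton describes, the sketch below, written in Python, shows a simple count-based hotspot model: historical incident reports are binned into grid cells, and patrols are sent to the cells with the highest counts. The data and cell names are invented, and this is not any specific vendor's algorithm; it only shows how a feedback loop can form, since patrols tend to generate new reports in the cells they are sent to.

    from collections import Counter

    def rank_hotspots(incident_cells, top_k=3):
        """Rank grid cells by historical incident count and return the top_k."""
        counts = Counter(incident_cells)
        return [cell for cell, _ in counts.most_common(top_k)]

    # Invented historical reports, each tagged with the grid cell where it was filed.
    history = ["A1", "A1", "B2", "B2", "B2", "C3", "D4"]

    for week in range(3):
        patrols = rank_hotspots(history)
        print(f"week {week}: patrol cells {patrols}")
        # Feedback loop: heavier patrolling in a cell tends to produce more
        # recorded incidents there, which raises that cell's rank next week.
        history.extend(patrols)

Because the model only ever sees recorded incidents, not actual crime, the ranking can reinforce wherever policing was concentrated in the past.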

What problem did you seek to solve?

Norton: I researched and compared how we use predictive policing to shift responsibility for imposing biases on large-scale communities away from the police and the government. I found that these algorithms are not fixing the problem of discrimination but exacerbating it in minority communities.

How do these algorithms enforce bias?

Norton: People use these algorithms thinking that because they're created on a computer, they're not biased. They're not thinking about the person who's coding the algorithm. Our biases as engineers are imposed on what we're building, and that fact is left out of articles about predictive policing and its successes.

How has this course inspired you to be a more ethical engineer?

Norton: I have more knowledge of how I can make a change in the corporate world. You feel very small when you're in a company with thousands of other employees and not focusing on the whole code base. I found that I can help by joining a union or other groups that are already trying to make these changes.   

Were there any speakers or readings that stood out to you in particular?

Norton: Os Keyes, a doctoral student at the University of Washington. I was really interested in their paper on autism, specifically, on how autism has been turned into a capitalist gain for startup companies. I wanted to delve deeper into that because I hadn't thought about how these topics could affect people on a basis other than gender or sexuality.

I also thought it was interesting that it wasn’t the technology that was bad; it was the specific language used to talk about people who are neurodivergent or atypical.

 

Gender Bias Detection Tool


Cassie Jeng, a senior majoring in systems science & engineering and computer engineering with a minor in computer science; Cameron Kalik, a junior in computer science; and Olivia Schriber, a senior majoring in systems science & engineering with a minor in computer science, designed and developed a tool that can automatically detect bias in written documents.

Why did you decide to design this tool?

Schriber: There are certain terms that are biased in and of themselves. In electrical engineering, there are “male connectors” and “female connectors.” I don't know why these are electrical engineering terms, but that was where the discussion started. We could just say “prong” and “insert.”

Kalik: We wanted to build something. We're studying engineering because building things is cool. We thought about the different issues that we were discussing in class and thought that gender bias was a good topic to tackle. One surprising finding was how terms that didn't seem to have any gender bias on the surface could, in fact, be biased.

Can you give an example of that?

Schriber: The word “polite.” You might not think that polite has a gender bias, but a lot of the papers we read said that it’s feminine coded. By putting the word “polite” in a job description, a company could be saying this role requires what traditionally have been considered women’s skills.

Jeng: In a similar manner, there were words we looked for that were labeled as masculine-coded bias, like “adventurous” and “headstrong.” It doesn’t seem that they would have a gender impact, but they could hint that someone is looking for a male candidate.
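The kind of check the group describes can be sketched in a few lines of Python. The word lists below are illustrative, built only from the terms mentioned in this interview, and are not the team's actual lexicon or tool.

    import re

    # Illustrative coded-word lists drawn from the interview; the team's
    # actual lexicon, compiled from published research, is not shown here.
    FEMININE_CODED = {"polite"}
    MASCULINE_CODED = {"adventurous", "headstrong"}

    def flag_gender_coded_terms(text):
        """Return gender-coded words found in a document, keyed by category."""
        words = set(re.findall(r"[a-z']+", text.lower()))
        return {
            "feminine-coded": sorted(words & FEMININE_CODED),
            "masculine-coded": sorted(words & MASCULINE_CODED),
        }

    job_ad = "We want a headstrong, adventurous engineer who is polite with clients."
    print(flag_gender_coded_terms(job_ad))
    # {'feminine-coded': ['polite'], 'masculine-coded': ['adventurous', 'headstrong']}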

How did this course shape your understanding of engineers and what engineers do?

Jeng: It changed my way of looking at new projects. Originally, I thought, “That's really cool. Look at what they did.” Since taking the class, I now think, “Where did that come from? What was their motivation? Was there anything that maybe I would've thought about that they didn't?”

Kalik: It opened my eyes to how expansive the field of ethics is, especially toward model building and algorithms. Maybe there needs to be some restructuring in the technology sector. Have an ethicist work with an engineer to ensure that not only is the technology cool, but it's also ethically sound.

Schriber: I've started asking companies I interview with, “What are you doing to make sure the project you're creating is equitable?” I would not have thought to ask that before this class.

 

Unregulated Bias in TSA Screening

Illustration of TSA screening line

Gabby Day, a junior majoring in electrical engineering and computer science; Sarah Ellis, a junior majoring in electrical engineering; and Camille Giertl, a junior majoring in biomedical engineering, built a website to raise awareness about the potential discrimination transgender and nonbinary individuals can face during Transportation Security Administration (TSA) screenings.

According to the team’s findings, imaging technologies, such as those used in full-body scanners, rely on biased algorithms that can trigger false alarms when transgender and nonbinary individuals are screened, often forcing them to undergo additional screening.
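Public reporting on these scanners describes the officer selecting a binary “male” or “female” profile before the scan, with the software flagging any body region that deviates from that profile. The toy sketch below, in Python with invented numbers and thresholds, is not TSA's actual software; it only illustrates how a binary template comparison can flag a passenger no matter which profile the operator selects.

    # Hypothetical per-region reference profiles and an arbitrary threshold.
    TEMPLATES = {
        "male": {"chest": 0.2, "groin": 0.8},
        "female": {"chest": 0.8, "groin": 0.2},
    }
    THRESHOLD = 0.4

    def screen(scan, selected_template):
        """Return body regions flagged as anomalies against the chosen template."""
        reference = TEMPLATES[selected_template]
        return [
            region for region, value in scan.items()
            if abs(value - reference[region]) > THRESHOLD
        ]

    # A passenger whose body does not match either binary template cleanly
    # triggers an alarm regardless of which button the operator presses.
    passenger = {"chest": 0.8, "groin": 0.8}
    print(screen(passenger, "male"))    # ['chest']
    print(screen(passenger, "female"))  # ['groin']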

What problem did you aim to solve?

Day: Our project aims to research the unregulated bias in TSA advanced imaging technology screening algorithms. We offer methods to reduce false alarms in marginalized communities, including modifying the algorithm training process and instituting policy changes. Since travel is becoming more prevalent, it’s important to recognize how seemingly neutral technologies may have ingrained biases.

What drew you to this project?

Ellis: Camille brought up this project idea, and I think we were all drawn to it because we hadn’t considered how a TSA screening, which is meant to prevent harm, could impact members of the gender-nonconforming community. Through our research, we found that this mistreatment is happening all over the country, even as close to home as Lambert-St. Louis International Airport.

Giertl: I thought it would be important to bring more attention to an issue that is really prevalent in Americans’ day-to-day lives that I have not seen researched before.

Why did you decide to take this course?

Day: It brings awareness to how engineers can impact human society in a way that avoids harm. Many classes at McKelvey Engineering focus on technical processes and may not hold engineers to high ethical standards. Many universities do not offer this type of class, so it was refreshing to see something that pioneers this mindset.

Giertl: I thought it would be really interesting to explore data and algorithms in a way that is often overlooked: from a social sciences point of view.

Were there any speakers or readings that stood out to you in particular?

Ellis: Timnit Gebru stood out to me for her research into bias in artificial intelligence. It really opened my eyes to the fact that, despite the popular belief that technology is incapable of bias, algorithms and AI are biased because of how they are created and trained. 
