The New Jim Code? Race and Discriminatory Design

EdSurge Podcast


By Rebecca Koenig     Aug 20, 2019

A bench with armrests that prevent people from lying down may be an example of discriminatory design.

This article is part of the collection: How AI Is Impacting Teaching and Learning.

People have a tendency to treat technology and data as neutral, sterile and immune to mortal failings. Yet the digital tools we use at schools, jobs and home don’t simply fall from the sky—humans produce them. And that means human biases can and do slip right into the algorithms that have increasing power over our lives.

This week for the podcast, we’re talking with someone who’s questioning the assumptions embedded in the technology and data we use in education, health care, law enforcement and beyond.

That guest is Ruha Benjamin, associate professor of African American Studies at Princeton University and author of the new book “Race After Technology.” “Human beings are imagining our technological infrastructure,” she points out—and some people’s fantasies are other people’s nightmares.

EdSurge met up with Benjamin at the Digital Pedagogy Lab at the University of Mary Washington, a gathering of teachers, students and others who are exploring the role of technology in education. We talked about discriminatory design and her concept of The New Jim Code.

Listen to the discussion on this week’s EdSurge On Air podcast. You can follow the podcast on the Apple Podcast app, Spotify, Stitcher, Google Play Music or wherever you listen. Or read a portion of the interview below, lightly edited for clarity.

EdSurge: I'm curious how you would describe discriminatory design and define the idea of the New Jim Code.

Ruha Benjamin: Absolutely. And so I first started thinking about the phrase discriminatory design in the life sciences and biotechnology. And the example that I use to illustrate what that actually looks like in practice goes back to a park bench that I [once sat on] in Berkeley, California, which is where I went to graduate school.

I was back in Berkeley in February and wanted to just lie on this bench for a few minutes between meetings, and I couldn't lie down on the bench because there were armrests built into it, which is pretty common. But at that moment I was frustrated by it because I wanted to lie down, and I realized, "Oh, you know, there's probably a reason for this." I'm sitting here in the Bay Area, which has a homelessness crisis in direct proportion to the growth of the tech industry. And it's likely that businesses are putting [the armrests on] to deter so-called vagrants from lying down.

I went and did some searching online and found that this is a global phenomenon: the idea of designing public space in order to draw in certain publics and exclude certain publics. An armrest is just one mechanism, one design decision that reflects that value and that politics, in which we want to get the problem out of sight, out of mind. We're not really dealing with homelessness, but we don't want to see those who are affected by it.

I've found these benches in a town in France. The mayor put them out on Christmas Eve, and they literally have a cage built around the bench, and you pay to get into it. And the people in the town were so appalled that within 24 hours they had the bench removed, which tells me that, organizing collectively, you can actually change whatever discriminatory design we're talking about. You don't just have to accept it.

And then the example that best illustrates this notion of discriminatory design for me is a bench that has spikes built into it, and it's metered. So you put in a coin, and then the spikes retreat for about 15 minutes, and then you get a little beep, beep, beep, and the spikes come back up and you have to keep feeding the bench's meter. That bench was designed by a German artist initially to get people to think about this question of how we privatize public space. And we can apply that to so many arenas.

We can think about the privatization of education. You pay to access it and to let the spikes retreat. And if you can't pay, then you're harmed or excluded.

I'll give you a much shorter answer for the New Jim Code, which is really a specific manifestation of discriminatory design in which racist values and assumptions are built into our technical systems. On the surface it looks like innovation. It looks like a shiny new thing that makes life easier. It's more efficient. We think of it as more neutral. But when you go right beneath the surface, you see the spikes, you see the way in which it's producing nightmares for some in the name of efficiency and progress.

You've pointed out that data and technology aren't merely found, they are produced. Can you talk a little bit about why that distinction matters and why the producers matter?

Absolutely. So in the context of the New Jim Code and discriminatory design, one of the main channels producing this inequity is the data that's used to train automated systems: the training data.

Let's say you're trying to decide which teachers to hire in your school, and you're basing it on the teachers who've excelled previously. That's your training data. But let's say that for the last 50 years, you've only hired teachers who come from three elite schools. That history is built into who is excelling now, which means you're likely to get more of the same. Your automated system is going to learn what you have previously considered a good teacher and give you more of that.

So if you are trying to broaden that pool, you're not likely to do it unless you go back and look at that training data. In that way, we are both reproducing this history and erasing it at the same time, because we think of these automated systems as somehow removed from the past and removed from our ongoing social practices.
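To make the mechanism Benjamin describes concrete, here is a minimal, hypothetical sketch in Python using NumPy and scikit-learn. The "elite school" feature, the synthetic hiring history, and the logistic-regression model are invented for illustration; they are not drawn from the interview or from any real hiring system.

```python
# Toy illustration of how biased training data reproduces itself.
# All features, data and thresholds here are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Historical hiring records: one row per past candidate.
# Column 0: attended one of three "elite" schools (1) or not (0).
# Column 1: a score meant to capture actual teaching ability.
n = 1000
elite = rng.integers(0, 2, size=n)
ability = rng.normal(0, 1, size=n)
X = np.column_stack([elite, ability])

# The label encodes 50 years of practice: essentially only
# elite-school candidates were ever hired, regardless of ability.
hired = (elite == 1) & (ability > -1.0)

model = LogisticRegression().fit(X, hired)

# Two equally able new candidates, differing only in school pedigree.
candidates = np.array([[1, 1.0],   # elite school
                       [0, 1.0]])  # same ability, non-elite school
print(model.predict_proba(candidates)[:, 1])
# The model strongly favors the elite-school candidate: it has
# "learned" the old hiring pattern, not teaching ability.
```

Run as-is, the script prints a near-certain hiring probability for the elite-school candidate and a near-zero one for the equally able non-elite candidate, which is the "more of the same" dynamic described above.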

I really encourage us to have a post-intentional analysis of equity and justice. And I get this concept of post-intentional racism from my colleague Imani Perry, because oftentimes the first reaction to any sort of example of potential inequity is for people to say, "Well, was it intended or not?" As if that is the measurement for how seriously we should take it.

We don't do that with other forms of harm, but we do that with social harms. So if I'm parked outside of this building and someone is breaking into my car, I don't run up to them and ask, "Do you feel that you're a thief? In your heart, do you identify as a thief? Do you mean to be a thief?" No. We would look at the outcomes of their actions and the impact on me to measure how to deal with it.

But with social harms, we want to do all of this psychological dissection in order to decide [whether or not] it’s intentional. It's very hard to prove something is intentional, and that's part of the power of that way of diverting our attention.

What role should people who have strong technical abilities play in ensuring data and technology don't encode inequity? And what about people who have different kinds of skills, artistic skills, communication skills or social science research skills such as yours?

It's such an important question because I do think we've entered a moment where there are more and more people with technical backgrounds who get the problems I'm describing in terms of discriminatory design and the New Jim Code. From the time I started the book “Race After Technology” to today, it's only been about two and a half years. But the tone of the conversation has changed dramatically. When I started, I was expecting and preparing myself for much more pushback and wariness on the part of those who are in the tech industry. But in the last year or two, there's been a growing movement of tech insiders and employees who are challenging and pushing their own companies to deal with the stakes—the political stakes and social stakes—of the things that they're producing.

I think your first step needs to be looking around and not trying to reinvent the wheel—looking to see what organizations, what institutions have been in this struggle and working on this for a long time. And then ask them what they need in terms of any kind of technological contribution to that solution or that movement. And so it's really calling for a technological humility, that you don't have all the answers, and that even if you could produce an app that would address some aspect of it, that might not be what's really needed at that moment.

To that end, how can technology be used to combat discrimination and achieve a more just society?

I definitely think some technologies have a role to play, but there are also many technologies that I think we need to be able to refuse. Just because it's possible to design something doesn't mean we need to design it. And so I feel like we need to widen the space of refusal and resistance against certain forms of technology in which there's no way to design the spikes out.

But there are many, I think, arenas in which technology can play a role.

One I'll just mention comes out of a creative twist on this. Rather than predict street-level crime, which is currently happening in towns and cities all across the country, where people in the most oppressed communities are being surveilled and their criminality is being predicted through automated systems, a group started what they call the White Collar Crime Early Warning System. They create heat maps of cities all across the country that show the likelihood of financial crimes occurring in certain neighborhoods. And of course, it's usually where the banks are. So they've created this interactive map, and there's an app you can download that alerts you when you're entering a “high-crime area.” The project also created a profile of the likely criminal using the LinkedIn images of about 6,000 CEOs, and it turns out most of them are white and male.

And so it's subverting this idea, and like the spiked bench, it's getting us to think about the underlying logics in this wider technical process of policing. That's an example of a creative reversal that gets us to think and question something that's actually happening in the world right now. And so this is a way that we can use technology but also question technology at the same time.

