
New Documentary "Coded Bias" Explores How Tech Can Be Racist And Sexist : Code Switch - NPR

[Image: Courtesy of the 2050 Group]

Facial recognition systems from large tech companies often incorrectly classify black women as male — including the likes of Michelle Obama, Serena Williams and Sojourner Truth. That's according to Joy Buolamwini, whose research caught wide attention in 2018 with "AI, Ain't I a Woman?" a spoken-word piece based on her findings at MIT Media Lab.

The video, along with the accompanying research paper written with Timnit Gebru of Microsoft Research, prompted many tech companies to reassess how their facial recognition data sets and algorithms perform on darker-skinned and female faces.

"Coded Bias," a documentary directed by Shalini Kantayya which premiered at the Sundance Film Festival in late January, interweaves Buolamwini's journey of creating the Algorithmic Justice League, an advocacy organization, with other examples of facial recognition software being rolled out around the world — on the streets of London, in housing projects in Brooklyn and broadly across China.

Jennifer 8. Lee, a journalist and documentary producer, caught up with Joy Buolamwini and Shalini Kantayya in Park City, Utah, after the premiere of "Coded Bias." The interview has been condensed for length and clarity.


Joy, why do you combine technology and arts in your work?

Buolamwini: So I am a poet of code: P.O.C. We need more poets in tech, and we need more people of color in tech. The reason I call myself a poet is because poets illuminate the uncomfortable, and they make us feel what we might not otherwise see. And I realized as I was doing this research at MIT that performance metrics and statistical analysis only get you so far. So how do you get from performance metrics to performance arts, where it hits you in a visceral way, so that you can humanize the implications of what we were showing with the research?

Shalini, a striking number of the characters and the experts interviewed in the film are women. Why?

Kantayya: My film is mostly women, but that was not intentional when I began the film. What I found was this group of mathematicians and data scientists who were incredibly astute and well researched, but who at the same time had this commitment to their humanity. Many had a double identity: they were women, they were people of color. And that allowed them to see this issue from an entirely different perspective. For me, many of the parallels I saw in the film industry are happening in the tech industry, and I could very much relate to these women in tech on issues like representation.

What was it like testifying about your work in front of the House of Representatives, especially given that the Senate hearing with Mark Zuckerberg did not put the technological sophistication of senators in the best light?

Buolamwini: Many of the committee members had in-depth questions, so they used all of their time. I was really surprised at how into the weeds they got, because we give our spoken testimony, three to five minutes, and then we have the written testimony, which is 20 pages. They were referencing components where, if you hadn't really read it, you wouldn't have gotten that in-depth.

So I was surprised at first, and then they had a second hearing, and a third. They were saying that facial recognition is one of the committee's favorite topics, because you actually have support and agreement across both sides of the aisle, and it's very difficult to find topics like that in the current political climate.

Kantayya: By going to the hearing, I learned that our government actually works sometimes.

What would regulatory oversight of algorithms and artificial intelligence look like?

Buolamwini: I'm actually working with a group of researchers on a paper right now. The major issue is that that level of expertise is not going to exist in government, and so we have been looking at a hybrid model, where you have an institution that enables expertise to be cultivated. So that when companies are planning or hoping to deploy A.I. systems, it's not just "spray and pray": you actually have to verify the claims of your system. With the FDA model, it's acknowledged that not all drugs work for everybody. There are limitations.

How did you come up with the idea of putting a white mask on in order to get the computer software to recognize your face?

Buolamwini: The reason I had a white mask in my office was that while I was at MIT, working hard, one of the older graduate women said, "We need a girls' night out," and it was a girls' night out during Halloween. Before we went out, she said, "We're going to do masks." So I thought it was a costume party, and I bought a white mask for the costume party. I showed up and it turned out to be facial masks, like a spa treatment! Because I had misunderstood the kind of masks she was talking about, I had a white mask. I was a little embarrassed, and I left it in my office.

And so when I came back from going out, I still had this project to do. I was really frustrated trying to get it to work, and it was late. I kept trying different ways to detect my face, and the white mask was just sitting there. And I was like, "No way." I grabbed it, and it was almost immediate: I hadn't even turned it all the way, and even from an angle it was already picking it up as a face. And I put it on. And I was like, this is crazy. I don't know if this would have happened if I hadn't misunderstood "girls' night out."


Jennifer 8. Lee is a journalist and documentary producer.




Source: https://www.npr.org/sections/codeswitch/2020/02/08/770174171/when-bias-is-coded-into-our-technology
