Untying Knots

Untying Knots: Uprooting Digital Oppression

Episode Summary

Mutale Nkonde, Founder of AI For the People, speaks with Untying Knots about her work to expose the realities of racism in artificial intelligence algorithms.

Episode Notes

Technology companies today serve as gatekeepers to everyday life, overseeing and influencing everything from communication to payments to access to information. This outsized role among some of the largest firms in the world poses a heightened threat to Black, Indigenous, and people of color already harmed by inequitable systems. The current landscape of racial equity efforts in Silicon Valley also showcases how limited access marginalized people have to power and to leading decisions in tech innovation. Will emerging technology produce harmful surveillance and discrimination, or even reproduce digital forms of the same systemic oppression we’ve seen throughout history? Mutale Nkonde, Founder of AI For the People, speaks with us about her work to expose the realities of racism in artificial intelligence algorithms, as well as her vision to advance equitable change in the tech industry.

Notes:
Untying Knots, co-hosted by Nikhil Raghuveera and Erica Licht, explores how people and organizations are untying knots of systemic oppression and working towards a more equitable future. Each episode features special guests and a focus on thematic areas across society. 

This podcast is published by the Institutional Antiracism and Accountability Project at Harvard Kennedy School’s Ash Center.

Music:
Beauty Flow by Kevin MacLeod
Link: https://incompetech.filmmusic.io/song/5025-beauty-flow
License: http://creativecommons.org/licenses/by/4.0/

About the Institutional Antiracism and Accountability Project

The Institutional Antiracism and Accountability Project believes in working at the intersection of community, academia, and policy to address intellectual and practical questions as they relate to antiracism policy, practice, and institutional change. In order to create and sustain change, the goal of this project is to promote antiracism as a core value for organizations by critically evaluating structures and policies within institutions. The project aims to analytically examine the current field of antiracism with a lens on research and innovation, policy, dialogue, and community involvement.

Our vision is to be a leader in institutional antiracism research, policy, and advocacy, and propose structural change in institutions and media centered on antiracism work in the public, private, non-profit sectors and digital space. This work will focus on researching existing organizations that conduct antiracism training and development while analyzing their effectiveness and promoting best practices in the field. Additionally, we will study the implementation of antiracism work among institutions that self-identify as antiracist and promote accountability structures in order for them to achieve their goals.

About the Ash Center 

The Ash Center is a research center and think tank at Harvard Kennedy School focused on democracy, government innovation, and Asia public policy. AshCast, the Center's podcast series, is a collection of conversations, including events and Q&As with experts, from around the Center on pressing issues, forward-looking solutions, and more. 

Visit the Ash Center online, follow us on Twitter, and like us on Facebook. For updates on the latest research, events, and activities, please sign up for our newsletter.

Episode Transcription

Nikhil: This is the Untying Knots Podcast. I'm Nikhil Raghuveera.

Erica: And I'm Erica Licht.

Nikhil: Today, we'll be discussing the intersection of technology and racial equity. As technology and big data become ever present in our day to day, so too have concerns about their impact on Black, Indigenous, and people of color, as well as the limitations for marginalized people to lead the industry rather than just be led by it. Emerging conversations have questioned whether new technologies will create digital forms of the same systems of oppression we've seen throughout history.

Erica: So, Nikhil, we'll hear from Mutale Nkonde, who we had the wonderful pleasure of speaking with. She's the founding CEO of AI for the People and a fellow at the Berkman Klein Center for Internet & Society at Harvard. Mutale's work over the years has included everything from policy making to writing and speaking engagements across the US and the world. She also holds affiliations at Stanford and Columbia. Her innovative career actually began as a broadcast journalist and documentary filmmaker, for which she is an Emmy winner.

Nikhil: And we'll have the opportunity to speak with Mutale about her thoughts on directions to advance justice in data and society, and how, by untying digital knots of oppressive systems, we can create a more equitable future.

Erica: And just as a reminder, the views expressed by our guests are their own. And one last note, this episode was recorded at the end of 2020. Thanks for joining us.

Erica: So, Mutale, it's a pleasure to have you here today. Thanks for joining us.

Mutale: I'm so excited to be here and looking forward to the conversation.

Erica: We wanted to hear a little bit about your own work in starting AI for the People, and what the focus and purview of your work has been.

Mutale: AI for the People is a nonprofit that was incubated while I was a fellow at the Berkman Klein Center for Internet & Society over at Harvard Law School. And it really came out of me launching my career in tech critique, kind of loosely AI ethics, through fellowship programs. And the fellowship programs I was on were always... they were low paid, all of them under $25,000 a year. There were no health benefits, but I was given a desk and I was given an email address and I was given resources, albeit marginal resources, of the organizations that acted as my hosts. And it was just really frustrating.

Mutale: And the reason that it was frustrating was the first fellowship that I did, I did a congressional project where I lived in New York but was spending three days a week in DC, speaking to members of the Congressional Black Caucus, but very specifically supporting Congresswoman Clarke's office on their AI agenda. And the reason I was doing that was that she was co-chair of the House Energy and Commerce Committee and wanted to have a legislative slate that matched that particular portfolio. And so during that time, I ended up becoming the head of a team that was looking at the imaginary.

Mutale: And we were looking at the tech imaginary, and I was really focused on how these technologies would impact Black and brown folks, poor folks, trans folks, people who are on the margins and for whom these technologies are not optimized. And those discussions led to 25 briefings that took place on the Hill over about 18 months, and then eventually the introduction of the Algorithmic Accountability Act, the DEEPFAKES Accountability Act, and the No Biometric Barriers to Housing Act. And this was all 2019. It had never been done before. An AI Caucus had introduced a kind of broad AI agenda for the United States in Congress before, I believe in 2017, but this was 2019.

Mutale: And this was a team that worked largely through the Congressional Black Caucus, the caucus for tech accountability and women and girls. And we were asking for things like impact assessments of algorithmic systems, so that if they were found to be racially biased, in the way that the Facebook algorithm had been found to be racially biased in the housing sector, they wouldn't even be released into industry. I was still making $20,000. I still had no healthcare. I still had two children. I was poor. And so it became very clear to me that if I were going to be meaningfully compensated for this work, I would have to create a structure, a legal structure, to support that. AI for the People ultimately was a riff on the declaration "for the people, by the people." I chose AI because when I was doing policy work, it was very clear to me that the public interest was not in mind with many of the technologies that I was looking at and wanted to audit.

Mutale: And there was no communication. Room after room after room, briefing after briefing after briefing, people would not believe that these technologies have the ability to make racist determinations. I am a journalist by training, and so I developed a communications firm. That's what we do. We create content that works with communities who are most impacted by these technologies, and we tell those stories. And our publics are the general public: folks that are listening to this podcast, reading broadsheets, watching TV, and watching films. So we produce that type of content.

Mutale: We are about to start executive producing a movie in January and are just finishing the short right now, for example. We're bridge builders. So those stories are building bridges between the policy world, between academia, between communities, and communicating them out in such a way that there will be a time when we know that AI technologies are not... just are not neutral, that they take on social meanings, that they have significance to the lived experiences of Black people and other minoritized groups. And our projects speak to that. That's two years of investigation, inquiry, we hire a firm to go and do an ecosystem study for us, and producing content that is high-impact.

Nikhil: I love that you talked about being a bridge builder, particularly when you think about the policy world relative to communities that are particularly affected, and trying to communicate that out. When we think about senators and Congresspeople working with technology, a lot of them are not very familiar with it, and it's obviously getting more and more complex. So we would love to hear a little bit more about where you think policy and legislation are going, and the impact of the work that you've done. And how, in particular, does it connect to work by the private sector?

Mutale: We have three basic pillars of work at AI for the People, and they're directly tied to the bills that were introduced in 2019, which the organization is kind of an homage to. So we are really interested in making sure that policy makers understand what an algorithm is. My very first briefing was called What is an algorithm?, and it was a packed room in the House. I'm not sure people understood. And then information integrity, which is DEEPFAKES, and then race and biometrics. And I didn't choose those three areas. Those were actually constituent complaints that came into Congresswoman Clarke's office that I was able to activate. So the very first thing that we did was look into research around entertainment and influence and storytelling. And we found, looking at the [inaudible 00:08:14] Center and others at UCLA, that that's actually a really effective way to get to policy makers.

Mutale: And we'd reached out to the folks that had created the slide deck for An Inconvenient Truth, they happened to be in my network, and started to speak to them about the Inconvenient Truth case study and how it didn't get [inaudible 00:08:37] action, but it did increase public awareness. And that's how our model was born. So we're connectors, but we connect through popular culture and we connect through entertainment. So in our last project, where we launched our information integrity portal, we were looking at race-based disinformation in the 2020 election. We were really interested in domestic campaigns because we felt that the whole world, via the Mueller Report, was looking to Russia. But we wanted to see if there were any domestic campaigns, and we found one. There were actually five, but we focused on one because we didn't have the budget. So shout out to funders who listen to this. Listen to Black women, we will save you.

Mutale: The hashtag that they were advancing was Vote Down Ballot. So it was this idea that Black folks should not take part in the election at the top of the ticket, but vote further down. The people that were advancing this ended up being interviewed by Kamau Bell on CNN on August 16th, which was really close to the DNC. And we launched a counter. We teamed up with MoveOn. We released five videos where we used a theory called strategic silence, which meant that we didn't repeat the Vote Down Ballot message, but what we did do was create a counter message, which was Vote Down COVID. And we collected stories of ordinary people in Philadelphia, Black folks who had been combating COVID, how it impacted them, and how the franchise and the act of voting was an act of power and resistance, but also an act to save their lives.

Mutale: And we were able to get Questlove to be part of that campaign via his social media. The MoveOn machine was able to create these beautiful videos with beautiful images, beautiful sound. And when we measured the efficacy of it, we used a tool called Signal, and we got 8.4 million impressions between October 26th and November 3rd, across Facebook and Twitter, versus 2.3 million impressions on Vote Down Ballot in that same week. So we ended up realizing that our culture and celebrity were a really good way of capturing policy makers, right? And going back to the same policy makers where we'd been trying to talk about content moderation in the context of DEEPFAKES, they understood it because we had these videos. They understood it because we had Questlove.

Mutale: And we had tried to speak to platforms about bringing down this campaign, and they didn't understand race. They didn't understand that there could be a language among Black people that wasn't Democratic. In their minds, all Black people were Democrats, and that's just not the case. So that's one case study where we were able to use storytelling and music and celebrity to tell this larger story, which is really about democracy, information integrity, and the need for legislation on these issues.

Mutale: Have we impacted industry? Well, in the DEEPFAKES Accountability Act, we asked for the labeling of content, which we saw Twitter do this election cycle. I don't think that can be attributed to us, but that's something that does sit in statute. And I sit on the TikTok content advisory board, and I know that in those discussions, which obviously are confidential, we will look to existing protocols or models when we're thinking about issues. And so I think where we contributed to industry was creating these frameworks, right? But industry has never been my focus because I really do not have faith that capitalism is incentivized towards racial justice. But capitalism is incentivized towards markets. So if we can change market expectations through grassroots action and then change market conditions through policy, I think that's a much more effective way of changing industry.

Nikhil: So obviously there's so much that's happened around technology and equity over the last year, and even leading up to this year. Whether that's the tech sector finally recognizing that it needs to do more on racial equity, a greater push by the public sector to start challenging the role of tech companies, or funding commitments by venture capital to founders who aren't white men. But there's really so much more. And so we were curious to know, what are you seeing in the field right now? And where do you think we've seen the most lasting change?

Mutale: Lasting change? Now, that's difficult to measure because I would look for sustained investment and moves in industry as well as the public sector. On the tech industry side, I obviously come from public interest. One of the things that I would say is I have seen a divestment, specifically from Google, around diversity, equity, and inclusion over the last year. It kind of started just before the summer, when there were announcements made that the company was scaling back and really rethinking how it thought around DEI, given that in the last six years, since they started to issue the diversity report, the movement for Black employees, and specifically Black women, hadn't changed. It was still standing at around 2%. And then, fast forward, Valeisha Butterfield, who was a longtime DEI employee at Google, leaves and goes to the Grammys, where I think she's doing similar work.

Mutale: And then obviously Timnit Gebru, who was an AI researcher with Google Brain, is fired around a paper that she wrote looking at the environmental impact of Google AI technologies and their development. And in that story, we find that she was the one Black woman who was a researcher and a manager at Google. And so even though they're not the only company in the ecosystem, I would say that they really offer an insight into industry. They're a huge player. But at the same time, we're looking at philanthropy and the development of AI for the People, which is my organization, the Algorithmic Justice League, which is led by Timnit's collaborator, Joy Buolamwini, and Data 4 Black Lives. And you're seeing three Black-led, tech-focused organizations.

Mutale: And so I'm cautiously optimistic, where I'd say, on the public interest side, there is an alternative being developed for Black folks who want to work in tech. The three organizations that I have referred to are not developing product right now, but that may change in the future. I don't know. I don't have access to all their strategic plans. And so I'm wondering whether, on the critique and oversight side, we're starting to see these startup orgs. On the product side, I'm not so tuned into the next big Black tech companies being launched, even though I have looked at Ethel's Club and So Good, which I think is a new social network. And I'm hoping that Black people thinking about alternative institutions might be the way forward, but it's just way too early to judge.

Erica: One of the things that Nikhil and I have been talking a lot about is the short- and long-term nature of the change. And to our question earlier, how are we, and how are you in your own work, monitoring that and keeping tabs on the field? And how are people in the field themselves holding these institutions accountable?

Mutale: I think for me, I'm definitely looking at a hybrid business structure, right? So we are lucky enough to be supported by the MacArthur Foundation, for example, along with Craig Newmark and the [inaudible 00:17:18] Media Fund in Philadelphia. We were lucky to have that institutional support, and some of that support is coming back, along with other philanthropic interests that have been looking at our work. But we're also really interested in looking at endowments, and looking at board development as a vehicle to get people in who have experience with not just endowments but money managers, so that we would be, in another universe, self-sufficient. And then our inaugural project, kind of our pilot, was a disinformation project that looked at race-based disinformation, and we were successful in our campaign, so we were really happy. But we also learned a tremendous amount.

Mutale: We were partnering with MoveOn.org on that particular work, and they have been opening doors for us to even look at for-profit work, right? So work where we come in as consultants, like any other consultant would, and help solve some problems. And that is the way that I'm looking to establish ourselves. And I'm hoping that the organizations I mentioned prior are also thinking very strategically about long-term sustainability, for a moment when Black lives do not matter to funders. And we don't know whether that will happen soon. I hope not.

Mutale: I was just part of a Creative Futures project with Ford, where I spoke about funding Black organizations as if we care about Black lives and Black leadership, which was really well received. And I was really happy to be given that platform. But ultimately, in a capitalist system, you want to know that you are going to be able to operate and hire and do work that really pushes industry to serve people, and not to serve corporations or shareholders. And then obviously, on the private side, I would hope they're given similar support.

Erica: Yeah. You've given us so much to chew on in terms of thinking about, as you said, the connections between various sectors, and what those connections and organizing strategies produce in terms of real, impactful change, pushing back on the notion that it's capitalism that will get us there, and asking instead how organizing at the grassroots level, particularly through Black-led and people-of-color-led, connected initiatives, can get us to a new imagined place of real equity. We saw that Dr. Gebru was not only one of the few Black women at Google, which has 1.6% Black employees overall, as you've spoken to. Google is one among several big, heavy hitters who've not only publicly struggled to reconcile their own demographics internally, but also, to what you were saying, struggled with how they approach their work. And so we'd love to hear from you what you see this event as meaning or holding for Google and other institutions. What should we be learning from this right now?

Mutale: I do want to be on the record as saying that Dr. Gebru is one of the best of the best. My work in facial recognition really comes from a constituent complaint here in Brooklyn, with two tenant activists whose building was having facial recognition introduced. But it wasn't just that it was being introduced to their building as an alternative to key fobs; it was that their organizing against this was being tracked using CCTV infrastructure and other tools of surveillance that facial recognition relies upon. And when we were writing that statute, we looked to the paper, we looked to Dr. Gebru's work, to guide us through the racialized implications of that. So I definitely owe that body of work to her.

Mutale: And as the news broke, AI for the People has a partnership with Amnesty International, where we're looking at the use of facial recognition in the context of protest, and how not just facial recognition but other surveillance technologies were used over the summer of unrest here in the United States. And then we're looking at three other cities across the world. And one of the things that was so great about Dr. Gebru being at Google was that it really gave tech criticism an air of credibility that I certainly did not have going down to DC in 2017, where I was being viewed as this Debbie Downer, as if there was no way technology could be racist, as if I was just an angry Black woman hating on the game. And I was able to pull out these papers and say, no, this woman works for Google research. This is the belly of the beast. But the fact that she cannot exist there really sends a clear message that anything that stands in the way of building capitalist value within Silicon Valley will be removed.

Mutale: And it's disappointing. It's not surprising. And it speaks to the need for a counterbalance. I do wonder who is going to ask those questions of equity. This last paper was focused on environmental impacts of Google AI, and I would argue that environmental justice is a racial justice issue. It's an underexplored field, and it's not just her deep research and analytical work; who else is going to shine a light on that? It's really sad that the tech sector as it is now may not be a place for Black people. I did an interview last week where I was saying we think about Jim Crow America as being punctuated by "whites only" signs across the South. But if we were to go to Silicon Valley, you would see those same signs held up. Except instead of being able to visibly see them, you would just look out and see that everybody is white, and that would be the indication.

Mutale: I'm really hoping that public pressure can create a situation where Silicon Valley is not a place that is hostile to dissenting opinions and to people who live in minoritized bodies, whether they be Black, brown, trans, or poor folks, right? Poor white folks who are underrepresented, people who are differently abled; a place where they can thrive. Because we do live in a technical world, our future is tech-enabled, and we do need more people to be part of the conversations around how that world is constructed.

Mutale: The last thing I would say is it feels so good to be at the fun portion of my career. I never found policy work fun, but I was good at it. Right? I could analyze data, I could analyze stats, I could make a compelling argument, but that's what happens in policy all day, every day. And to be in a space where I'm working with artists and filmmakers and music makers and activists, and creating what we hope will be the soundtrack and the video and the experiences of this movement, is an incredibly liberating place to be. And it gives me the energy and the inspiration to keep going forward, even at moments like this, when I'm basically being told that the work I do does not matter to Silicon Valley, that Black lives do not matter. And yet, I have this work that I find so incredibly uplifting and fulfilling.

Erica: Well, thank you so much for what you're doing, both out there and in here, to get the conversation out, and also, just as important, the actual implementation discourse. As people who use technology ourselves, we are forever grateful. But also, for those of us who just really care deeply about actually working towards an agenda of real, tangible equity, it's really amazing to see what you are doing. So thank you so much for being here on Untying Knots. We've really appreciated the conversation today.

Mutale: Thank you.

Nikhil: This is Untying Knots. Thanks for listening.

Erica: Untying Knots is hosted by Nikhil Raghuveera and Erica Licht. It is a collaboration with the Institutional Antiracism and Accountability Project and supported by the Harvard Kennedy School's Ash Center and the Atlantic Council GeoTech Center. We'd like to thank Mutale Nkonde for her time in speaking with us.

Erica: Music is Beauty Flow by Kevin MacLeod.