22 December 2020

Race | Diversity / Inclusion In Practice

Timnit Gebru, Google, and institutional racism in AI: the issues within the tech giant’s ethics team

Dr. Timnit Gebru – a prominent AI researcher, former co-head of the Ethical AI team at Google, and one of very few black women in the technology industry – has reported that she was fired earlier this month for circulating an email in which she expressed frustration with Google’s lack of progress in hiring women and its associated lack of accountability.

Gebru’s email was prompted by Google’s order that she retract a research paper she had co-written on the risks of training AI systems on large text datasets; the paper also addresses the issue of structural bias against ethnic minorities. The order came shortly before Gebru was due to go on leave.

In response to this order, Gebru drafted an email criticising Google’s handling of the paper as well as its poor progress in hiring women, and sent it to an internal group (‘Google Brain Women and Allies’). She encouraged recipients to focus on “leadership accountability and thinking through what types of pressures can also be applied from the outside”. The email also noted Google’s poor hiring record, stating that only 14% of Google’s 2020 hires were women.

Gebru informed management that she wanted certain conditions met, including the disclosure of the identities of every individual whom Jeff Dean, Google’s Head of AI, had spoken to and consulted as part of the review of the paper; she also demanded to know the exact feedback he had received before issuing the order for retraction. Gebru alleges that Google was unprepared to engage in a discussion about her research paper and that the company refused to meet her conditions. She asserts that she was given no opportunity to respond and was subsequently dismissed. According to Gebru, Google responded:

“We respect your decision to leave Google […] and we are accepting your resignation. […] However, we believe the end of your employment should happen faster than your email reflects because certain aspects of the email you sent last night to non-management employees in the brain group reflect behaviour that is inconsistent with the expectations of a Google manager.”

Jeff Dean informed employees (and later the public) that Gebru had resigned and had not in fact been dismissed. He also claimed that Gebru had failed to follow the company’s process for vetting work to be published externally – namely, that she allowed only one day for what was usually a two-week review period. However, Dean’s statement did not acknowledge the conditions Gebru had submitted, nor did it explain why she was not offered a chance to respond before the order for retraction was sent. Gebru maintains that she was fired, and that Google’s framing of the situation as a ‘resignation’ is incorrect.

Additionally, many other individuals at Google dispute Dean’s characterisation of the review process as a two-week exercise, and take issue with the suggestion that as thorough a review as Dean described in his public message was required:

  • Nicolas Le Roux, a Google AI researcher: “My submissions were always checked for disclosure of sensitive material, never for the quality of the literature review.” (Twitter)

  • William Fitzgerald, a former Google PR manager: “[Dean’s comments on Google’s review process] is such a lie. It was part of my job on the Google PR team to review these papers. Typically we got so many we didn’t review them in time or a researcher would just publish & we wouldn’t know until afterwards. We NEVER punished people for not doing proper process.” (Twitter)

Gebru’s supporters (who presently include over 2,000 current and former employees of Google’s parent company Alphabet Inc., and more than 3,000 external academics and industry professionals) maintain that the real reason for Gebru’s dismissal was her criticism of Google’s record on improving working conditions for BAME individuals. Their open letter and petition demand that Google explain its decision to remove Gebru and block her co-authored research paper.

Sundar Pichai, CEO of Google, has since pledged to investigate the matter. In a memo, he acknowledged the need to assess the circumstances leading to Gebru’s departure with a view to improvement and increased respect, and accepted that Gebru was a talented black woman who had “left Google unhappily”:

“First – we need to assess the circumstances that led up to Dr. Gebru’s departure […] We will begin a review of what happened to identify all the points where we can learn — considering everything from de-escalation strategies to new processes we can put in place. Jeff [Dean] and I have spoken and are fully committed to doing this. […] [Gebru’s departure] has had a ripple effect through some of our least represented communities […] The events of the last week are a painful but important reminder of the progress we still need to make.”

Gebru tweeted her dissatisfaction with Pichai’s update, commenting on the “strategic” language he had adopted:

“Don’t paint me as an “angry Black woman” for whom you need “de-escalation strategies” for. This thread is just the beginning of the toxicity I dealt with since before I even joined Google and I haven’t said anything specific yet.”

Gebru also rejected Pichai’s comments in an interview with the BBC, interpreting the memo as a statement designed to improve Google’s PR standing. She stated that Pichai had not outright apologised for Google’s handling of the situation, but had instead elected to express regret for the doubts that had emerged as a result of it. Gebru also confirmed her view that Google is institutionally racist, and her belief that “most if not all tech companies are institutionally racist […] It’s not by accident that black women have one of the lowest retention rates [in the technology industry]”. She further indicated that Google does not care about diversity, highlighting her own situation as an example of Google retaliating against individuals who challenge the status quo.

In a recent interview with Technology Review, Gebru again highlighted the lack of diversity at Google:

“I was definitely the first Black woman to be a research scientist at Google. After me, we got two more Black women. That’s, like, out of so many research scientists. Hundreds and hundreds […] It was just constant fighting. I was trying to approach it as talking to people, trying to educate them, trying to get them to see a certain point of view.”

Gebru also noted elements of an anti-whistleblowing culture within the tech giant:

“[…] People were complaining that this organization [Google Research] hired just 14% women. Samy, my manager, hired 39% women. It wasn’t like he had any incentive to do that whatsoever. He was the only reason I feel like this didn’t happen to me before. It’s probably because he was protecting us. And by protecting us, he would get in trouble himself. If other leaders are tone-policing you, and you’re too loud, you’re like a troublemaker – we all know that’s what happens to people like me – then if someone defends you, they’re obviously going to also be a problem for the other leaders.”

As regards racial discrimination in AI more broadly, Gebru believes that unless there is a shift in power, whereby those who are affected by these technologies are allowed to shape them from the ground up, AI tools will be used “more for harm than good”.

Sources: BBC; Bloomberg; Business Insider; Platformer; Technology Review; Twitter

December 2020

© Farore Law



Written by:


Farore Law