Bloomberg recently reported that two members of Google’s Ethical AI team have left to join Timnit Gebru’s new nonprofit research institute. This is unwelcome news for the technology giant, as the two researchers belonged to a unit studying an area that is vitally important to Google’s future plans.
The two departing members are Alex Hanna, a research scientist, and Dylan Baker, a software engineer. Gebru’s institute, DAIR (Distributed AI Research), was founded in December to examine artificial intelligence from diverse points of view and to prevent the harms it can cause.
Hanna, who will serve as Director of Research at DAIR, said she was disheartened by the toxic work culture at Google and pointed to the underrepresentation of Black women at the company.
A Google spokesperson said, “We appreciate Alex and Dylan’s contributions — our research on responsible AI is incredibly important, and we’re continuing to expand our work in this area in keeping with our AI Principles.”
The spokesperson added that the company is also dedicated to building an environment where people with varied viewpoints, backgrounds, and experiences can do their best work and support one another.
Google’s Ethical AI team was already embroiled in controversy in 2020, when the team’s co-head spoke out about the company’s work culture regarding women and Black employees.
The severity of the matter became clear when Sundar Pichai, CEO of Google’s parent company Alphabet, had to intervene personally and launch an investigation into the allegations. Google later fired two employees in connection with the dispute.
“A high-level executive remarked that there had been such low numbers of Black women in the Google Research organization that they couldn’t even present a point estimate of these employees’ dissatisfaction with the organization, lest management risk deanonymizing the results,” said Hanna.