Battling Bias in Artificial Intelligence on the hello, Human Podcast
by Elizabeth Mitelman, Jul 14
Our podcast, hello, Human, offers an open forum to discuss the latest topics in artificial intelligence (AI) and how it’s being applied in the real world. We talk with not only the pioneers of AI, but also those who are putting AI to work transforming businesses, finding novel solutions to age-old problems, and advancing what humans can accomplish.
FortressIQ | Intelligent Insights for the Modern Enterprise
Episode 8 - Battling Bias in AI
In episode #8, our guest was Sherika Ekpo, former Global Diversity & Inclusion Lead at Google. Sherika previously worked in the U.S. federal government, holding senior-level human resources, recruiting, and diversity/inclusion roles at several U.S. agencies. Sherika joined us on International Women’s Day as part of our Women in AI series.
Sherika’s background put her on the front lines of diversity, where the impact of racism and sexism directly affected workers and society. Especially in the technology industry, the lack of diversity in the workforce is well known. But it’s also becoming more apparent that a lack of diversity on the teams building technology has real ramifications for the people using it. In AI specifically, a homogeneous team can make decisions that limit training data or introduce unconscious biases, leading to even more bias in the eventual output.
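As an illustration of that point (a sketch of ours, not an example from the episode), consider a naive model that simply learns historical hiring rates per group. If the training labels encode a biased history, the model reproduces that disparity even when candidates are equally qualified. All data below is hypothetical.

```python
# Minimal sketch of bias propagation: a "model" that learns per-group
# approval rates from past decisions will reproduce whatever disparity
# the training data contains.
from collections import defaultdict

# Hypothetical historical hiring records: (group, qualified, hired).
# Both groups are equally qualified, but past decisions favored group "A".
records = [
    ("A", True, True), ("A", True, True), ("A", True, True), ("A", True, False),
    ("B", True, True), ("B", True, False), ("B", True, False), ("B", True, False),
]

def train_group_rates(data):
    """Learn the historical hire rate for each group among qualified candidates."""
    counts = defaultdict(lambda: [0, 0])  # group -> [hired, total]
    for group, qualified, hired in data:
        if qualified:
            counts[group][1] += 1
            counts[group][0] += int(hired)
    return {g: hired / total for g, (hired, total) in counts.items()}

rates = train_group_rates(records)
# Equally qualified applicants now receive different scores, purely
# because the training labels encoded a biased history.
print(rates)  # group "A" is scored higher than group "B"
```

The fix here is not more sophisticated modeling; the skew lives in the data itself, which is why who collects and curates that data matters.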
Technology, and increasingly, AI, is driving decisions in everything from human resources to healthcare to financial services, where Sherika started her career. After transitioning across several functions at JPMorgan Chase, she earned an MBA and moved into HR, eventually recruiting for the CFO at the Department of Homeland Security. That’s where her focus on diversity and inclusion (D&I) began as she developed “talent pipelines” to recruit private sector workers into government roles.
“While at (United States Digital Service), I did our first inaugural D&I plan and really realized a couple of key things,” Sherika recalled. “One, females rock. Our leadership team at that time was 60% female. I sat in a room with directors who looked like me. The men were in the minority for the first time, which is not normal for a technology company or agency, let alone one in government. That was exciting.”
As she drilled down into the engineering workforce, however, she found representation gaps by gender, race, and ethnicity. So she set out to recruit more Black, Latinx, and Indigenous representation on the engineering teams. But simply recruiting from the technology industry wouldn’t work, since there was a lack of diversity there, too. Seeing that Google was having the same D&I issues, but also had more resources, she moved there to try to solve diversity issues on a macro scale.
Mentorship became an obvious and influential way for Sherika to give under-represented groups relatable access to potential opportunities. But D&I efforts frequently run into a circular challenge: a relatable mentor is valuable, yet an ally from an over-represented group may carry more influence.
“There’s one thing for me to be a mentor for a young black woman who looks like me and try to create opportunities for her in the tech space where I am,” said Sherika. “But it’s another thing to have someone who doesn’t look like her—let’s say it’s a male, let’s say it’s a white man—who can also champion and use his influence, circle, and capital to support this woman in her goals as well.”
That lack of influence and representation, specifically in technology, goes back to the earliest educational opportunities, Sherika says. Being exposed to coding and technology in kindergarten can be fantastic, but many schools, particularly those in minority communities, don’t have that access. That’s where technology companies can start to make a difference.
“When you think about K-12 and the opportunities that exist, I think there are a number of opportunities for tech companies, trade associations, and nonprofits to partner with very early on,” Sherika added. “There’s always an opportunity to go to and partner with local foundations and organizations to do summer camps or app development camps. Some of the most heart-warming stories I hear are from young women who are able to go to a camp that’s two weeks, and at the end of those two weeks they have developed an app that’s interactive, that’s real, that’s tangible, and that they can share with their families.”
That’s an investment in the future, but what about bias and a lack of D&I in technology today? One place to start, Sherika says, is an assessment of gender and diversity pay gaps. Another is calling out unconscious bias, such as when a colleague directs a question to a man instead of a woman. And mentorship programs offer growth to both mentor and mentee, regardless of their differences. It’s important, too, to consider the business case for D&I when making these recommendations.
“The reason why bias is something that we need to guard against is because bias in the workplace is essentially just unfairly excluding someone from opportunities even when they’re qualified,” says Sherika. “Sometimes bias is unconscious, but that leads to sexism, discrimination, stereotyping, and things of that nature. And when we have that, it stunts innovation and it actually stunts overall (business) growth.”
As for advice for women in AI and technology, Sherika recommends taking the right risks.
“Now is the time to chart your path so that you can seize the next opportunity,” Sherika concluded. “That opportunity may include you taking risks, and what I tell folks all the time is it is okay to take risks, but make sure that they are calculated. The way that you calculate the risks is based on what it is that you’re willing to sacrifice.”