Is AI gender bias leaving women in tech behind?

With the AI tech boom in full swing, how do we make sure that women aren’t left behind?

I’ve spent the last decade scaling AI-driven SaaS projects. At MHR, I’ve been demoing the power of AI internally and externally. So naturally, how AI is perceived and used is at the top of my priority list.

According to research from Harvard Business School, women in particular are avoiding AI tools. That is a huge problem. Why?

Because AI isn’t going anywhere, and its capacity to boost efficiencies means it will become a core part of many business processes. If, as the research indicates, women are adopting AI tools at a 25% lower rate, that will be a huge blocker when it comes to career advancement. That’s also a problem for businesses, as it will slow growth.

And it’s a problem for AI.  

I’m going to talk a little more about the background of gender and AI, from the gender bias in AI tools to why women might be less likely to use them. I’m also going to talk about what I’ve seen when showcasing AI to people, and what I think the solution to the AI gender gap looks like.

What is the gender bias in AI?

There are two strands we need to unpack here. The first is the gender bias that can, unfortunately, get built into AI solutions themselves.

Data forms the basis of every AI solution. If you train an AI on biased data, you’ll get biased results.  

Unfortunately, we as a society have a lot of biased data. The classic example is AI recruitment tools deciding that men are more suited to a certain role because men have historically been hired for it, but the issue extends much further.

For example, image recognition software often fails to identify women as accurately as men, and medical software can overlook symptoms that present differently in women. These are all problems caused by poorly chosen, biased training data.
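As a purely illustrative sketch (hypothetical synthetic data, not MHR’s tooling or any real recruitment product), here is roughly how that feedback loop plays out: a simple screening model trained on historically skewed hiring decisions learns to reproduce the skew, even when candidates are equally skilled.

```python
# Hypothetical illustration only: synthetic data, not a real recruitment system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Candidates whose skills are distributed identically regardless of gender.
gender = rng.integers(0, 2, n)      # 0 = woman, 1 = man
skill = rng.normal(0, 1, n)

# Historical hiring decisions: skill mattered, but men were also favoured.
hired = (skill + 1.5 * gender + rng.normal(0, 1, n)) > 1.0

# A screening model trained naively on that history, with gender as a feature.
model = LogisticRegression().fit(np.column_stack([skill, gender]), hired)

# Score a fresh cohort of equally skilled women and men.
new_skill = rng.normal(0, 1, 2000)
women = model.predict(np.column_stack([new_skill, np.zeros(2000)]))
men = model.predict(np.column_stack([new_skill, np.ones(2000)]))

print(f"Recommended hire rate, women: {women.mean():.0%}")
print(f"Recommended hire rate, men:   {men.mean():.0%}")
# The model recommends men far more often, purely because the training data did.
```

The same pattern can persist even if the gender column is removed, because other features often act as proxies for it, which is why careful data selection and bias testing matter so much.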

With many of these examples rightfully at the forefront of discussions about AI, it’s no wonder that many women might be reluctant to trust these tools, even ones trained on more diverse data sets that don’t share the same issues.

The second strand is why women are adopting AI tools at lower rates. There’s psychology at play here, and understanding it means we’ll be able to solve it.

Addressing concerns and misconceptions about AI

Concerns around AI technology are not unique to women. Everyone’s read stories of AI being misused or making mistakes. Any sensible change management strategy should account for these concerns, especially around data security, where the cost of mistakes shoots up.

But that doesn’t fully explain the AI gender gap.  

A lot of it comes down to anxiety about perception. Women generally face greater scrutiny over their expertise, and the lingering fear of being labelled a ‘diversity hire’ is a common one. As a result, the idea of using AI as a ‘shortcut’ comes with more baggage.

In the research, women also cited concerns about the ethics of AI. Many of these are well known, from plagiarism to environmental impact. It is entirely possible to use AI ethically, but if someone is handed a tool with no real training on it, those worries come to the forefront.

Confidence gaps are also a problem. Many women will recognise the feeling that their own achievements don’t count for much. In an industry where self-promotion is vital, that can mean being left behind.

The importance of upskilling women in AI

Women are underrepresented in the AI industry. According to the Alan Turing Institute, only 22% of people working in AI and data science are women. Access to STEM education is a barrier here, as is access to upskilling initiatives. Challenges like balancing work and family life or dealing with stereotyping are common in many fields, but underrepresentation in the tech sector can intensify them.

Why does this matter? Because without the input of all genders, races and communities, AI will never reach its true potential. It will just replicate old ways of thinking and deepen existing issues. And as a business, you risk losing talent and skills that are essential to your success.

Consider mentorship, upskilling and building a learning culture where employees are given space to experiment. All of these start the process of closing the gap.

How to improve equality in AI

With all the above, it might seem like these issues around AI and gender equality are insurmountable.

But then I think about our own high performance research and my own experiences working with women on AI.

My experience so far is that every time I demo the power of AI, I get a lot of excitement from female leaders. Often, they’ll ask for follow-up sessions to learn how to apply AI to their work. And I know that, right now, they’re all using AI in their day-to-day work.

This aligns with our research into the secrets of high performance. While 100% of leaders agreed that technology is a vital factor in achieving high performance, it was female leaders who were most likely to single out AI as the most significant option on the table: 38% of the women we surveyed said so, compared to 25% of men. Likewise, 62% of female leaders agreed that ‘AI would change the face of how their organisation operates’, compared with 45% of men.

The will to use the software is there. The understanding of its potential is there.

The problem, then, is being handed an AI tool and told to figure it out. That’s when concerns about the perception of the tool or the ethics of engaging with it creep in, and confidence gaps undermine people’s willingness to experiment and learn.

I think it comes down to giving people a direct opportunity to understand the tool, not just direct access to it.

When I sit down with people, I’m able to talk through these concerns. It shows women they won’t be judged for using this technology and gives them space to process how they feel about it, to ask questions, and to figure out how it will fit into their working world. That means they don’t just become users; they become advocates, like those female leaders we surveyed.

The idea that women are less engaged with AI tools didn’t resonate with me at first, but then I realised why: our team prioritises high performance and equal opportunities. We make a point of implementing measures that actively reduce gender bias across our internal processes and product development. That’s why my experience differs from what the research suggests.

Whether you’re talking about AI products or about AI adoption, the most important step of all is to factor in the perspectives of diverse groups.

This goes for training AI on unbiased data, but it also goes for factoring diverse voices into your adoption process.

AI may be revolutionary tech, but the core principles of change management haven’t changed. Bring people on board, account for new perspectives you hadn’t considered, and you’ll see real progress. 
