Christie Ronaldson

Relationship Manager - HMRC

October 9, 2025

The AI gender gap and representation in our teams

This blog has been co-authored with Liz Leakey.

It’s Ada Lovelace Day on Tuesday, 14th October, a celebration of the life, vision, and contributions of the first computer programmer. In Ada’s time, few women were allowed to take part in scientific or technical work. Yet, more than 180 years after Ada wrote the first computer algorithm, one for calculating Bernoulli numbers, women remain underrepresented in technology and science.

While this article focuses on women and gender, we recognise that non-binary and trans people face similar challenges. We also acknowledge that race, age, sexuality, neurodiversity, religion, and culture are often underrepresented and at risk of bias.

Throughout history, cultural, structural, and institutional barriers have limited women’s participation in these fields. Social norms generally positioned women in caregiving or domestic roles, and women who worked in labs or computing were seen as anomalies or hobbyists. Despite this, we know that women did make major contributions, from code-breaking during WWII to calculating the trajectory for the first moon landing expeditions.

With the introduction of home computers in the 1980s, women faced another setback. Advertising, games, and products were targeted at men, and the number of women signing up for computer science degrees declined. This was unexpected, given the number of women who had previously learned Fortran, and the increase in women joining the workforce during the 1970s.

Fewer women in technical roles or STEM education meant fewer visible role models, and reinforced the idea that these fields “weren’t for them”. Other barriers, like unequal pay, limited promotion opportunities, lack of recognition or credit, and hostility and dismissiveness in professional cultures exacerbated these issues.

Despite efforts since the 1980s to encourage girls into STEM through education and play, only about a quarter of STEM undergraduates today are women. According to HESA (Higher Education Statistics Agency) data, between 2015 and 2022 just 12,190 women graduated in Computing, compared with 38,610 men.

[Chart: Computing graduates over time. Source: HESA]

For mathematics degrees, female representation remains steadier, at around 39%.

Across Europe and the US, the picture is similar. In 2021, women made up 32.8% of STEM graduates in the EU, led by Romania (42.5%), Poland (41.5%) and Greece (40.9%).

In the US, women accounted for just 21% of STEM graduates in the same year.

By contrast, India leads globally with 43% of STEM degrees earned by women. However, only 29% continue into the workforce.

In UK industry, women hold around 20–25% of STEM roles, with 2024 figures at 24%.

 

[Chart. Source: StemWomen.com]

In short, women continue to be underrepresented in technology and engineering roles. With the rise of AI, this imbalance could either improve or deepen. We now have an opportunity to act intentionally and stop the gender gap widening further.

Why having women in AI teams matters

When women are absent from AI and software development teams, we miss the chance to bring in different perspectives and broaden the diversity of thinking.

Half the world’s population are women, and those women are users of the products we create. It follows, then, that including women in product and engineering teams is essential to ensuring AI systems reflect everyone’s needs. Representation matters in how we design products, build data models, and manage ethics, bias, and privacy.

Software development is no different from other kinds of product development, where design decisions have historically suited men. For example, car seat belts, aeroplane cockpits, police vests, and astronaut flight suits were all designed for, and tested on, men, leading to greater injury risks for women, inaccessible controls, and ill-fitting protective gear.

Clinical trials in healthcare show similar patterns. Trials of new medications favour male participants, and the lack of female representation has left women’s health under-researched, underfunded, and often overlooked. Without female representation in AI design, we risk repeating these blind spots, and building products that ignore female experiences and needs. We are already seeing the possible impact of this:

“A London School of Economics study published in BMC Medical Informatics and Decision Making found that artificial intelligence tools used by more than half of England’s councils are downplaying women’s physical and mental health issues, creating gender bias in social care decisions. 

The research examined Google’s AI model “Gemma” by generating 29,616 pairs of summaries from real case notes of 617 adult social care users, with only the gender swapped between each pair, and found that terms like “disabled,” “unable,” and “complex” appeared significantly more often in descriptions of men than women, while similar care needs in women were more likely to be omitted or described in less serious terms. 

Lead researcher Dr. Sam Rickman warned that because access to social care is determined by perceived need, this bias could result in unequal care provision for women if social workers were to rely on these AI-generated summaries.”

Full article: The Guardian 11th August 2025
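The study’s approach, generating pairs of summaries that differ only in gender and comparing the language used for each, is something teams can approximate when auditing their own models. What follows is a minimal, hypothetical Python sketch of that idea, not the LSE study’s actual code: the summarise callable stands in for whichever model you are evaluating, and the gender-swap step and term list are deliberately simplistic.

    from collections import Counter
    from typing import Callable, Iterable

    # Terms the LSE study found were applied more often to men than women.
    SEVERITY_TERMS = {"disabled", "unable", "complex"}

    def swap_gender(note: str) -> str:
        """Naive word-level gender swap; a real audit needs a far more careful rewrite."""
        mapping = {"he": "she", "him": "her", "his": "her",
                   "mr": "mrs", "man": "woman", "male": "female"}
        return " ".join(mapping.get(word, word) for word in note.lower().split())

    def term_counts(summaries: Iterable[str]) -> Counter:
        """Count how many summaries contain each severity term."""
        counts = Counter()
        for text in summaries:
            counts.update(set(text.lower().split()) & SEVERITY_TERMS)
        return counts

    def audit(case_notes: list[str], summarise: Callable[[str], str]) -> None:
        """Summarise each note twice, original and gender-swapped, then compare."""
        original = term_counts(summarise(note) for note in case_notes)
        swapped = term_counts(summarise(swap_gender(note)) for note in case_notes)
        print("original notes:", dict(original))
        print("gender-swapped:", dict(swapped))

A robust audit would also control for grammatical changes introduced by the swap and test whether differences are statistically significant, but even a rough comparison like this can reveal whether a model describes the same needs differently depending on gender.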

We know that AI models are being trained on large, biased, and incomplete data sets that already reflect these inequalities and highlight data poverty (where a lack of representative data produces skewed results). The risk is that LLMs will not just reflect this bias, but amplify it and reinforce it further.

Dr Timnit Gebru, AI computer scientist and founder of the Distributed AI Research Institute, has raised concerns about who gets to decide how AI is built, and whose values, worldviews, and datasets dominate:

“In accepting large amounts of web text as ‘representative’ of ‘all’ of humanity, we risk perpetuating dominant viewpoints, increasing power imbalances and further reifying inequality.”

Shaping teams

When advocating for more women in AI and software development teams, we should acknowledge that women more frequently express uncertainty about the impacts of AI. This scepticism is grounded in concerns around data privacy, gender bias in LLMs, the increase in misogynistic AI-generated content, and broader societal effects on jobs and future generations.

Addressing these concerns means including women and ensuring they are represented at all levels of decision-making and critical technical discussions.

AI product teams need a range of skills and experience, not just technical expertise. This is about innovation, and it requires people who are:

  • Comfortable with experimentation, failing fast, iteration, and ambiguity
  • Pioneering, open to exploring new ideas, yet grounded in reality
  • Clear communicators who bring others along and manage stakeholders well
  • Collaborative, hands-on, able to teach and generous with knowledge
  • Adaptable to new ways of working and resilient to change
  • Curious, willing to learn and explore and problem-solve
  • Pragmatic and aware of organisational constraints

Fei-Fei Li, Professor at Stanford University and former Chief Scientist of AI/ML at Google Cloud, says of the impact on innovation and creativity:

“Research repeatedly shows that when people work in diverse groups, they come up with more ingenious solutions. AI will impact many of our most critical problems, from urban sustainability and energy to healthcare and the needs of ageing populations. We need a diverse group of people to think about this.”

Many organisations are now experimenting with smaller teams working together. With the fast-paced rate of change in AI, this approach provides opportunities to shape teams in ways we might not have considered before and creates space for diverse voices and continuous learning.

At Equal Experts, through our work with clients, we’ve piloted two AI teams exploring how to use AI in the software development lifecycle. Both were intentionally staffed with a mix of men and women. Our hope is to set expectations that this becomes the norm. As one female team member put it: “Working in AI engineering with close to a 50/50 gender split has been an absolute dream!”

What can you do?

As a tech leader, consider:

  • Are women well represented in your product and engineering teams, and do you have the range of skills required to drive innovation in the year ahead?
  • Do you have enough women in leadership and decision-making roles to have an impact?
  • Have you assessed the quality and potential bias in the data your organisation uses to build models, algorithms, and AI products?
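
On the last point, even a simple check of how groups are represented in your data can make skew visible before a model is ever trained. The snippet below is a minimal sketch, assuming your data can be loaded as a table with a demographic column (named gender here purely for illustration); real bias assessment goes much further, but visibility is a starting point.

    import pandas as pd

    def representation_report(df: pd.DataFrame, column: str = "gender",
                              warn_below: float = 0.3) -> pd.Series:
        """Report the share of each group in `column` and flag under-represented ones."""
        shares = df[column].value_counts(normalize=True)
        for group, share in shares.items():
            if share < warn_below:
                print(f"Warning: '{group}' makes up only {share:.0%} of the data")
        return shares

    # Toy example: a data set that is 24% female, mirroring the UK industry figure above.
    example = pd.DataFrame({"gender": ["female"] * 24 + ["male"] * 76})
    representation_report(example)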

We asked 75 women working in tech in the North West (UK) for their ideas about what we can do to improve representation. Here’s a summary of what they said:

  • Actively and intentionally seek out women in your organisation to join AI teams
  • Listen to women’s perspectives in meetings and discussions about AI use
  • Encourage female leaders and AI practitioners to speak at external and internal events and conferences
  • Support women to attend AI conferences and events
  • Ensure AI products are thoroughly user-tested and represent diverse user needs
  • Consider adoption barriers when developing AI products
  • Include more women on hiring panels and in technical interviews to reduce unconscious bias
  • Introduce referral schemes, encouraging women to apply for AI roles and focussing on potential
  • Create internal training to demystify AI, build confidence, and help women join AI teams
  • Create communities of practice where women share AI knowledge and experiences
  • Host regular ‘lunch and learn’ sessions led by women, for women, creating a safe space to share and grow.

We’re running workshops with organisations to explore how AI might impact teams and ways to tackle the gender gap. If you’re interested in finding out more, drop us a line.

 

About the authors

Christie Ronaldson has over 8 years of experience working in agency and consultancy environments, where she’s built a reputation for putting people first. By leading with empathy, she helps customers solve complex problems through collaboration and trust. As Relationship Manager at Equal Experts, Christie focuses on understanding what truly matters to clients and ensuring they see tangible value from every engagement. Christie is passionate about creating outcomes that make customers genuinely happy, and she’s an active women-in-tech champion and enthusiast.

Liz Leakey is the Global Head of Design and Product at Equal Experts, with over 20 years of experience leading creative digital teams to deliver outstanding user experiences. She has worked with organisations including the BBC, Pret a Manger, Sky Betting & Gaming, the Department for Work and Pensions, The Very Group, and the John Lewis Partnership. Liz helps teams adopt a user-centred approach to building products and services – designing with empathy to create digital experiences that genuinely improve people’s lives. Passionate about enabling and empowering teams, Liz fosters collaboration and champions diverse thinking to drive innovation and solve complex problems together.

