One Green Bean highlights gender bias with emerging AI tool

One Green Bean

“They effectively hold a mirror up to society”

On International Women’s Day, One Green Bean has revealed how emerging generative AI image tools consistently under-represent women across senior professional roles.
 
The creative comms agency undertook a two-part experiment using Midjourney, an artificial intelligence platform that has surged in popularity in recent months. The program generates images from descriptive text prompts, drawing on an estimated five billion images scraped from the web.

One Green Bean first asked the platform to generate images based on the job titles of three members of the global leadership team – managing director, executive creative director, and head of public relations EMEA. The results revealed a clear male gender bias.

To extend the experiment further, the team then ran the top 20 highest-paid jobs in the UK, as ranked by The Times newspaper, through the Midjourney platform. 88% of the images reinforced male gender stereotypes. From chief executive to locum consultant, tax partner to aircraft pilot, the artificial intelligence revealed an overwhelming bias towards men.

In the Australian workforce, around one third of the country’s top jobs are filled by women, according to the Australian Government’s Workplace Gender Equality Agency, demonstrating that AI appears to reflect a distorted version of reality.

Kat Thomas, founder and global executive creative director of One Green Bean, said: “There’s been huge hype around AI tools like ChatGPT and Midjourney. We’ve been deep in experimentation to understand their potential, but an eye-opening limitation became clear very quickly.
 
“A distinct gender bias is very evident, with results consistently skewing male. That’s not the only bias either. When you do include ‘woman’ in your keywords, the imagery tends to be sexualised – big boobs, unbuttoned shirts, pouting lips.

“Another huge bias is around diversity: the images these platforms generate overwhelmingly skew white, as well as male.

“Our industry is obsessed with artificial intelligence and, whilst embryonic right now, its capacity is revolutionary. However, it’s not without its limitations, and its bias against women is a significant hurdle these platforms need to overcome.
 
“They effectively hold a mirror up to society, demonstrating that ingrained cultural biases dictate the norms that machine intelligence currently relies on.”
 
Further desk research using Midjourney revealed gender bias across a huge variety of disciplines. When ‘international tennis star’ was typed into the platform, four shouting men appeared, with no sign of Serena Williams or Ash Barty.
 
Other roles the team looked at included air traffic controller, paramedic, finance manager, police officer and train driver – all of which returned images of men.
 

Left to right: Kat Thomas, Amber Abbot and Sophie Nicholson
