
Australian researchers reveal gender bias in GenAI image creation

Images created by generative artificial intelligence (GenAI) tend to over-represent men relative to women and lack ethnic diversity.

David Hollingworth
Wed, 18 Sep 2024

Researchers at Charles Sturt University have published a pair of papers that reveal an inherent gender and ethnic bias in AI-based image creation platforms.

Professor in Nuclear Medicine Geoff Currie and senior lecturer in medical imaging Johnathan Hewis, both of the university’s School of Dentistry and Medical Sciences, used OpenAI’s DALL-E 3 text-to-image platform to create images of pharmacists and medical students, then compared the make-up of the generated images with known statistics for both groups.

Students Sam Anderson and Josie Currie co-authored the first study, Gender bias in generative artificial intelligence text-to-image depiction of medical students, while George John, senior lecturer in pharmacy practice, co-authored the second study, Gender and ethnicity bias in generative artificial intelligence text-to-image depiction of pharmacists.


The results revealed a stark difference between the AI images and the real world.

“Despite 54.3 per cent of undergraduate medical students in Australia being women, only 39.9 per cent of the artificially generated figures were female,” Professor Currie said in a statement.

“Not only this, but there was a lack of ethnic diversity, with only 7.7 per cent depicting mid skin tones and 0 per cent with dark skin tones.”

The generated images of pharmacists revealed a similar bias, failing to reflect that 64 per cent of Australian pharmacists are women.

“Only 29.7 per cent of generated images, both of individuals and group shots, represented women,” Professor Currie said.

“The ethnicity was also biased in this research, again depicting zero per cent of people with dark skin tones, and only 6.5 per cent with mid skin tones.”

Professor Currie’s co-author, Hewis, said that while generative AI offers convenience, it also has some drawbacks.

“Society has been quick to adopt AI because it can create bespoke images whilst negating challenges like copyright and confidentiality,” Hewis said.

“However, accuracy of representation cannot be presumed, especially when creating images for professional or clinical use.

“These studies highlight that generative AI can significantly amplify inherent biases leading to misrepresentation in gender and diversity.”

Professor Currie said such image generators should be used with caution.

“If images carrying such bias are circulated, it erodes the hard work done over decades to create diversity in medical and pharmacy workforces and risks deterring minority groups and women from pursuing a career in these fields,” Professor Currie said.

David Hollingworth

David Hollingworth has been writing about technology for over 20 years and has worked for a range of print and online titles in his career. He is enjoying getting to grips with cyber security, especially when it lets him talk about Lego.
