Events Calendar

Transforming Medicine: Evidence-Driven mHealth
2015-09-30 - 2015-10-02    
8:00 am - 5:00 pm
Digital Medicine 2015 Save the Date (PDF, 1.23 MB). Download the Scripps CME app to your smartphone and/or tablet for the conference [...]
Health 2.0 9th Annual Fall Conference
2015-10-04 - 2015-10-07    
All Day
Join us for our 9th Annual Fall Conference, October 4-7, 2015. Set over 3 1/2 days, the 9th Annual Fall Conference will [...]
2nd International Conference on Health Informatics and Technology
2015-10-05    
All Day
OMICS Group is one of the leading scientific event organizers, conducting more than 100 scientific conferences around the world. It has about 30,000 editorial board members, [...]
MGMA 2015 Annual Conference
2015-10-11 - 2015-10-14    
All Day
In the business of care delivery®, you have to be ready for everything. As a valued member of your organization, you’re the person that others [...]
5th International Conference on Wireless Mobile Communication and Healthcare
2015-10-14 - 2015-10-16    
All Day
Themed "Transforming healthcare through innovations in mobile and wireless technologies", the fifth edition of MobiHealth proposes [...]
International Health and Wealth Conference
2015-10-15 - 2015-10-17    
All Day
The International Health and Wealth Conference (IHW) is one of the world's foremost events connecting Health and Wealth: the industries of healthcare, wellness, tourism, real [...]
MGMA 2015 Annual Conference
11 Oct 15
Nashville
Articles

Can AI image generators that produce biased results be fixed?

Experts are investigating the origins of racial and gender bias in AI-generated images, and striving to address these issues.

In 2022, Pratyusha Ria Kalluri, an AI graduate student at Stanford University in California, made a concerning discovery regarding image-generating AI programs. When she requested “a photo of an American man and his house” from a popular tool, it generated an image of a light-skinned individual in front of a large, colonial-style home. However, when she asked for “a photo of an African man and his fancy house,” it produced an image of a dark-skinned person in front of a simple mud house, despite the descriptor “fancy.”

Further investigation by Kalluri and her team revealed that image outputs from widely used tools like Stable Diffusion by Stability AI and DALL·E by OpenAI often relied on common stereotypes. For instance, the term 'Africa' was consistently associated with poverty, while descriptors like 'poor' were linked to darker skin tones. The tools even amplified biases, as seen in generated images depicting certain professions: most housekeepers were portrayed as people of color and all flight attendants as women, in proportions that deviate significantly from demographic reality.

Similar biases have been observed by other researchers in text-to-image generative AI models, which frequently incorporate biased and stereotypical characteristics related to gender, skin color, occupations, nationalities, and more.