Events Calendar

NextEdge Health Experience Summit
2015-11-03 to 2015-11-04 | All Day | Philadelphia
With a remarkable array of speakers and panelists, the Next Edge: Health Experience Summit is shaping up to be an event that attracts healthcare professionals who [...]

mHealth Summit 2015
2015-11-08 to 2015-11-11 | All Day | National Harbor
Anytime, Anywhere: Engaging Patients and Providers. The 7th annual mHealth Summit, which is now part of the HIMSS Connected Health Conference, puts new emphasis on innovation [...]

24th Annual Healthcare Conference
2015-11-09 to 2015-11-11 | All Day | Arizona
The Credit Suisse Healthcare team is delighted to invite you to the 2015 Healthcare Conference that takes place November 9th-11th in Arizona. We have over [...]

PFF Summit 2015
2015-11-12 to 2015-11-14 | All Day | Washington, DC
PFF Summit 2015 will be held at the JW Marriott in Washington, DC. Presented by the Pulmonary Fibrosis Foundation. Visit the www.pffsummit.org website often for all [...]

2nd International Conference on Gynecology & Obstetrics
2015-11-16 to 2015-11-18 | All Day
Welcome Message: OMICS Group is honored to invite you to join the 2nd International Conference on Gynecology and Obstetrics, which will be held from November [...]
Articles

Can the biased results produced by AI image generators be rectified?

Experts are investigating the origins of racial and gender bias in AI-generated images and working to correct them.

In 2022, Pratyusha Ria Kalluri, an AI graduate student at Stanford University in California, made a concerning discovery regarding image-generating AI programs. When she requested “a photo of an American man and his house” from a popular tool, it generated an image of a light-skinned individual in front of a large, colonial-style home. However, when she asked for “a photo of an African man and his fancy house,” it produced an image of a dark-skinned person in front of a simple mud house, despite the descriptor “fancy.”

Further investigation by Kalluri and her team revealed that image outputs from widely used tools such as Stability AI's Stable Diffusion and OpenAI's DALL·E often relied on common stereotypes. For instance, the term ‘Africa’ was consistently associated with poverty, while descriptors like ‘poor’ were linked to darker skin tones. The tools even amplified biases in images of certain professions: most housekeepers were portrayed as people of color and all flight attendants as women, proportions that deviate significantly from demographic reality.
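
Audits like this one follow a simple recipe: generate many images for a fixed prompt, label each image for the attribute under study, and compare the resulting proportions against a reference distribution. The Python sketch below illustrates that loop under stated assumptions: it uses the Hugging Face diffusers library to sample from Stable Diffusion, and classify_gender is a hypothetical stand-in for the annotation step (human raters or a trained attribute classifier), which real studies handle with far more care.

    # Minimal sketch of an occupation-bias audit for a text-to-image model.
    # Assumes the Hugging Face `diffusers` library; classify_gender is a
    # hypothetical placeholder for the annotation step.
    from collections import Counter

    import torch
    from diffusers import StableDiffusionPipeline

    def classify_gender(image) -> str:
        """Hypothetical annotator: return a perceived-gender label for one image."""
        raise NotImplementedError("Replace with human labels or a trained classifier.")

    def audit(prompt: str, n_samples: int = 50) -> Counter:
        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
        ).to("cuda")
        counts = Counter()
        for seed in range(n_samples):
            # Fixed seeds make each sample reproducible.
            generator = torch.Generator("cuda").manual_seed(seed)
            image = pipe(prompt, generator=generator).images[0]
            counts[classify_gender(image)] += 1
        return counts

    # Example: audit("a photo of a flight attendant") returning
    # Counter({'female': 50}) would reproduce the all-women skew
    # described above.

A skew only becomes evidence of bias once it is compared with a meaningful baseline, such as occupational statistics, which is why studies like Kalluri's report how far the generated proportions deviate from demographic data.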

Similar biases have been observed by other researchers in text-to-image generative AI models, which frequently incorporate biased and stereotypical characteristics related to gender, skin color, occupations, nationalities, and more.