Events Calendar

2015 HIMSS Annual Conference & Exhibition
2015-04-12 - 2015-04-16    
All Day
General Conference Information: The 2015 HIMSS Annual Conference & Exhibition, April 12-16 in Chicago, brings together 38,000+ healthcare IT professionals, clinicians, executives and vendors from [...]
2015 CONVENTION - THE MEDICAL PROFESSION: TIME FOR A NEW SOCIAL CONTRACT
2015-04-16 - 2015-04-18
All Day
The QMA's 17th convention will be held April 16-18, 2015. The Québec Medical Association (QMA) invites you to share your opinion on the theme La profession médicale : vers un nouveau [...]
HCCA's 19th Annual Compliance Institute
2015-04-19 - 2015-04-22    
All Day
April 19-22, 2015, Lake Buena Vista, FL. Early Bird Rates end January 7th. The Annual Compliance Institute is HCCA’s largest event. Over the course of [...]
AAOE Annual Conference 2015
2015-04-25 - 2015-04-28    
All Day
AAOE Annual Conference 2015: The AAOE is the only professional association strictly dedicated to orthopaedic practice management. We currently have more than 1,300 members in [...]
63rd ACOG ANNUAL MEETING - Annual Clinical and Scientific Meeting
2015-05-02 - 2015-05-06    
All Day
The 2015 Annual Meeting: Something for Every Ob-Gyn. The New Year is a time for change! ACOG’s 2015 Annual Clinical and Scientific Meeting, May 2–6, [...]
Articles

Can the biased results produced by AI image generators be rectified?

Experts are investigating the origins of racial and gender bias in AI-generated images and working to address them.

In 2022, Pratyusha Ria Kalluri, an AI graduate student at Stanford University in California, made a concerning discovery regarding image-generating AI programs. When she requested “a photo of an American man and his house” from a popular tool, it generated an image of a light-skinned individual in front of a large, colonial-style home. However, when she asked for “a photo of an African man and his fancy house,” it produced an image of a dark-skinned person in front of a simple mud house, despite the descriptor “fancy.”

Further investigation by Kalluri and her team revealed that image outputs from widely used tools such as Stability AI’s Stable Diffusion and OpenAI’s DALL·E often relied on common stereotypes: terms like ‘Africa’ were consistently associated with poverty, and descriptors like ‘poor’ were linked to darker skin tones. The tools even amplified biases in images of certain professions, portraying most housekeepers as people of color and all flight attendants as women, in proportions that deviate significantly from demographic reality.
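
To make the audit procedure concrete, here is a minimal sketch of how such a prompt-pair comparison might be set up, assuming the open-source Hugging Face diffusers library and the public Stable Diffusion v1.5 checkpoint. The prompts echo the experiments described above; the annotate() helper is hypothetical, standing in for the human raters or separately validated classifiers a real study would use to label perceived attributes.

```python
# A minimal bias-audit sketch, assuming the Hugging Face "diffusers" library
# and the public Stable Diffusion v1.5 checkpoint. This illustrates the
# general technique, not the researchers' actual code.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# Prompt pairs that differ only in the attribute under test.
prompt_pairs = [
    ("a photo of an American man and his house",
     "a photo of an African man and his fancy house"),
    ("a photo of a flight attendant",
     "a photo of a housekeeper"),
]

def annotate(image):
    """Hypothetical helper: return perceived-attribute labels for one image.
    Real audits rely on human annotators or a separately validated classifier."""
    raise NotImplementedError

for prompt_a, prompt_b in prompt_pairs:
    for prompt in (prompt_a, prompt_b):
        # Generate a small batch per prompt; published audits sample hundreds
        # of images so output proportions can be compared with demographic data.
        images = pipe(prompt, num_images_per_prompt=4).images
        # labels = [annotate(img) for img in images]
```

The telling signal in such an audit is not any single image but the distribution of labels across many samples per prompt, compared against real-world demographic baselines.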

Similar biases have been observed by other researchers in text-to-image generative AI models, which frequently incorporate biased and stereotypical characteristics related to gender, skin color, occupations, nationalities, and more.