Events Calendar

2014-04-02
All Day
Conference Link: http://www.nhlc-cnls.ca/default1.asp
Conference Contact: Cindy MacBride at 1-800-363-9056 ext. 213, or cmacbride@cchl-ccls.ca
Register: http://www.confmanager.com/main.cfm?cid=2725
Hotel / Location: Fairmont Banff Springs Hotel, 405 Spray Ave, Banff, [...]
HIMSS15 Annual Conference & Exhibition
2015-04-12
All Day
HIMSS15 may be months away, but the excitement is here...right now. It's not too early to start making plans for next April. Whether you're new [...]
2015 HIMSS Annual Conference & Exhibition
2015-04-12 - 2015-04-16
All Day
The 2015 HIMSS Annual Conference & Exhibition, April 12-16 in Chicago, brings together 38,000+ healthcare IT professionals, clinicians, executives and vendors from around the world. [...]
IVC Miami Conference
2014-04-24
The International Vein Congress is the premier professional meeting for vein specialists. IVC, based in Miami, FL, offers renowned, comprehensive education for both veterans and [...]
C.D. Howe Institute Roundtable Luncheon
2014-04-28    
12:00 pm - 1:30 pm
Navigating the Healthcare System: The Patient’s Perspective. Please join us for this Roundtable Luncheon at the C.D. Howe Institute with Richard Alvarez, Chief Executive Officer, [...]
Articles

Can AI image generators that produce biased results be rectified?

Experts are investigating the origins of racial and gender bias in AI-generated images, and striving to address these issues.

In 2022, Pratyusha Ria Kalluri, an AI graduate student at Stanford University in California, made a concerning discovery regarding image-generating AI programs. When she requested “a photo of an American man and his house” from a popular tool, it generated an image of a light-skinned individual in front of a large, colonial-style home. However, when she asked for “a photo of an African man and his fancy house,” it produced an image of a dark-skinned person in front of a simple mud house, despite the descriptor “fancy.”

Further investigation by Kalluri and her team revealed that image outputs from widely used tools such as Stable Diffusion by Stability AI and DALL·E by OpenAI often relied on common stereotypes. For instance, the term ‘Africa’ was consistently associated with poverty, and descriptors such as ‘poor’ were linked to darker skin tones. The tools even amplified existing biases: in generated images of certain professions, most housekeepers were portrayed as people of color and all flight attendants as women, proportions that deviate significantly from demographic reality.

Similar biases have been observed by other researchers in text-to-image generative AI models, which frequently incorporate biased and stereotypical characteristics related to gender, skin color, occupations, nationalities, and more.