Events Calendar

Health IT Summit in San Francisco
2015-03-03 - 2015-03-04    
All Day
iHT2 [eye-h-tee-squared]: 1. an awe-inspiring summit featuring some of the world's best and brightest. 2. great food for thought that will leave you begging for more. 3. [...]
How to Get Paid for the New Chronic Care Management Code
2015-03-10    
1:00 am - 10:00 am
Under a new chronic care management program authorized by CMS and taking effect in 2015, you can bill for care that you are probably already [...]
The 12th Annual World Health Care Congress & Exhibition
2015-03-22 - 2015-03-25    
All Day
The 12th Annual World Health Care Congress convenes decision makers from all sectors of health care to catalyze change. In 2015, faculty focus on critical challenges and [...]
ICD-10 Success: How to Get There From Here
2015-03-24    
1:00 pm
Tuesday, March 24, 2015 1:00 PM Eastern / 10:00 AM Pacific Make sure your practice is ready for ICD-10 coding with this complimentary overview of [...]
Customer Analytics & Engagement in Health Insurance
2015-03-25 - 2015-03-26    
All Day
Takeaway business ROI: Drive business value with customer analytics: learn what every businessperson needs to know about analytics to improve your customer base. Debate key customer [...]
How to Survive a HIPAA Audit
2015-03-25    
2:00 pm - 3:30 pm
Wednesday, March 25th from 2:00 – 3:30 EST If you were audited for HIPAA compliance tomorrow, would you be prepared? The question is not so hypothetical, [...]
Articles

Can AI image generators that produce biased results be fixed?

Researchers are investigating the origins of racial and gender bias in AI-generated images, and working to address these issues.

In 2022, Pratyusha Ria Kalluri, an AI graduate student at Stanford University in California, made a concerning discovery regarding image-generating AI programs. When she requested “a photo of an American man and his house” from a popular tool, it generated an image of a light-skinned individual in front of a large, colonial-style home. However, when she asked for “a photo of an African man and his fancy house,” it produced an image of a dark-skinned person in front of a simple mud house, despite the descriptor “fancy.”

Further investigation by Kalluri and her team revealed that image outputs from widely used tools such as Stable Diffusion by Stability AI and DALL·E by OpenAI often relied on common stereotypes. For instance, the term 'Africa' was consistently associated with poverty, while descriptors such as 'poor' were linked to darker skin tones. These tools even amplified biases in generated images depicting certain professions: most housekeepers were portrayed as people of color and all flight attendants as women, in proportions that deviate significantly from demographic reality.

Similar biases have been observed by other researchers in text-to-image generative AI models, which frequently incorporate biased and stereotypical characteristics related to gender, skin color, occupations, nationalities, and more.