Events Calendar

Proper Management of Medicare/Medicaid Overpayments to Limit Risk of False Claims
2015-01-28    
1:00 pm - 3:00 pm
January 28, 2015 Web Conference 12pm CST | 1pm EST | 11am MT | 10am PST | 9am AKST | 8am HAST Topics Covered: Identify [...]
eHealth Initiative Annual Conference 2015
2015-02-03 - 2015-02-05    
All Day
About the Annual Conference Interoperability: Building Consensus Through the 2020 Roadmap eHealth Initiative’s 2015 Annual Conference & Member Meetings, February 3-5 in Washington, DC will [...]
Real or Imaginary -- Manipulation of digital medical records
2015-02-04    
1:00 pm - 3:00 pm
February 04, 2015 Web Conference 12pm CST | 1pm EST | 11am MT | 10am PST | 9am AKST | 8am HAST Main points covered: [...]
Orlando Regional Conference
2015-02-06    
All Day
February 06, 2015 Lake Buena Vista, FL Topics Covered: Hot Topics in Compliance Compliance and Quality of Care Readying the Compliance Department for ICD-10 Compliance [...]
Patient Engagement Summit
2015-02-09 - 2015-02-10    
12:00 am
THE “BLOCKBUSTER DRUG OF THE 21ST CENTURY” Patient engagement is one of the hottest topics in healthcare today.  Many industry stakeholders consider patient engagement, as [...]
iHT2 Health IT Summit in Miami
2015-02-10 - 2015-02-11    
All Day
February 10-11, 2015 iHT2 [eye-h-tee-squared]: 1. an awe-inspiring summit featuring some of the world's best and brightest. 2. great food for thought that will leave you begging [...]
Starting Urgent Care Business with Confidence
2015-02-11    
1:00 pm - 3:00 pm
February 11, 2015 Web Conference 12pm CST | 1pm EST | 11am MT | 10am PST | 9am AKST | 8am HAST Main points covered: [...]
Managed Care Compliance Conference
2015-02-15 - 2015-02-18    
All Day
February 15, 2015 - February 18, 2015 Las Vegas, NV Prospectus Learn essential information for those involved with the management of compliance at health plans. [...]
Healthcare Systems Process Improvement Conference 2015
2015-02-18 - 2015-02-20    
All Day
BE A PART OF THE 2015 CONFERENCE! The Healthcare Systems Process Improvement Conference 2015 is your source for the latest in operational and quality improvement tools, methods [...]
A Practical Guide to Using Encryption for Reducing HIPAA Data Breach Risk
2015-02-18    
1:00 pm - 3:00 pm
February 18, 2015 Web Conference 12pm CST | 1pm EST | 11am MT | 10am PST | 9am AKST | 8am HAST Main points covered: [...]
Compliance Strategies to Protect your Revenue in a Changing Regulatory Environment
2015-02-19    
1:00 pm - 3:30 pm
February 19, 2015 Web Conference 12pm CST | 1pm EST | 11am MT | 10am PST | 9am AKST | 8am HAST Main points covered: [...]
Dallas Regional Conference
2015-02-20    
All Day
February 20, 2015 Grapevine, TX Topics Covered: An Update on Government Enforcement Actions from the OIG OIG and US Attorney’s Office ICD 10 HIPAA – [...]
Articles

Can AI image generators that produce biased results be fixed?

Experts are investigating the origins of racial and gender bias in AI-generated images, and striving to address these issues.

In 2022, Pratyusha Ria Kalluri, an AI graduate student at Stanford University in California, made a concerning discovery regarding image-generating AI programs. When she requested “a photo of an American man and his house” from a popular tool, it generated an image of a light-skinned individual in front of a large, colonial-style home. However, when she asked for “a photo of an African man and his fancy house,” it produced an image of a dark-skinned person in front of a simple mud house, despite the descriptor “fancy.”

Further investigation by Kalluri and her team revealed that image outputs from widely used tools such as Stability AI's Stable Diffusion and OpenAI's DALL·E often relied on common stereotypes. For instance, the term 'Africa' was consistently associated with poverty, while descriptors like 'poor' were linked to darker skin tones. The tools even amplified existing biases in depictions of certain professions: most housekeepers were portrayed as people of color and all flight attendants as women, in proportions that deviate significantly from demographic reality.
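Audits like the one described above typically work by generating many images for a fixed prompt, annotating a perceived attribute in each image, and comparing observed frequencies against a reference baseline such as labor or census statistics. The sketch below is a minimal, hypothetical illustration of that comparison step only; the function name, the annotation labels, and the numbers are assumptions for demonstration, not data from the study.

```python
from collections import Counter

def audit_proportions(labels, reference):
    """Compare attribute frequencies observed in a batch of generated
    images against reference shares (e.g. occupational statistics).

    labels:    one annotated attribute per image (e.g. "woman", "man")
    reference: dict mapping attribute -> expected population share
    Returns a per-attribute report; a positive "amplification" value
    means the attribute is over-represented relative to the baseline.
    """
    counts = Counter(labels)
    total = len(labels)
    report = {}
    for attr, expected in reference.items():
        observed = counts.get(attr, 0) / total
        report[attr] = {
            "observed": observed,
            "expected": expected,
            "amplification": observed - expected,
        }
    return report

# Hypothetical annotations for 10 images generated from a prompt like
# "a photo of a flight attendant" (illustrative numbers only).
labels = ["woman"] * 9 + ["man"]
report = audit_proportions(labels, {"woman": 0.66, "man": 0.34})
```

Here the model's output (90% women) would exceed an assumed 66% baseline, which is the kind of gap researchers describe as bias amplification.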

Similar biases have been observed by other researchers in text-to-image generative AI models, which frequently incorporate biased and stereotypical characteristics related to gender, skin color, occupations, nationalities, and more.