Racial Bias in Lung Function Diagnostic Algorithm Leads to Underdiagnosis in Black Men

June 1, 2023

A study published in JAMA Network Open reveals racial bias in a widely used diagnostic algorithm for lung function. Because the algorithm adjusts diagnostic thresholds based on race, it may underdiagnose respiratory conditions in Black men; the study estimates that up to 40% more patients could have received accurate diagnoses had the software been free from bias, with downstream effects on medical treatments and interventions. By documenting such harmful incidents, we highlight the importance of Project Cerebellum's efforts to establish guardrails that ensure safe and secure AI practices. JOIN US to learn more about how HISPI Project Cerebellum TAIM (Govern) can help prevent such biases in AI systems.
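To illustrate the mechanism the study critiques, the sketch below shows how a race-based scaling factor applied to a spirometry reference value can shift the same measurement from "abnormal" to "normal." All numbers (the FEV1 values, the 0.88 factor, the 80% cutoff) are illustrative assumptions for this example, not the coefficients or logic of the actual deployed algorithm:

```python
def percent_predicted_fev1(measured_fev1, predicted_fev1, race_factor=1.0):
    """Percent-predicted FEV1, optionally scaling the reference value
    by a race-based factor (the practice the study critiques).
    All inputs here are illustrative, not clinical coefficients."""
    adjusted_reference = predicted_fev1 * race_factor
    return 100.0 * measured_fev1 / adjusted_reference

# Hypothetical patient: same measurement, two reference standards.
measured = 2.6   # liters (illustrative)
predicted = 3.5  # race-neutral reference value (illustrative)
CUTOFF = 80.0    # illustrative "percent of predicted" abnormality cutoff

unadjusted = percent_predicted_fev1(measured, predicted)             # ~74.3%
race_adjusted = percent_predicted_fev1(measured, predicted, 0.88)    # ~84.4%

print(unadjusted < CUTOFF)     # True  -> flagged as abnormal
print(race_adjusted < CUTOFF)  # False -> no longer flagged: potential underdiagnosis
```

The race adjustment lowers the reference value the patient is compared against, raising their percent-predicted score and moving a borderline-abnormal result above the cutoff, which is one way such an adjustment can delay diagnosis.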

Matched TAIM controls

Suggested mapping from embedding similarity (not a formal assessment).

Alleged deployer
University of Pennsylvania Health System
Alleged developer
Unknown
Alleged harmed parties
Black men who underwent lung function tests between 2010 and 2020 and potentially received inaccurate or delayed diagnoses and medical interventions due to the biased algorithm

Source

Data from the AI Incident Database (AIID). Cite this incident: https://incidentdatabase.ai/cite/582

Data source

Incident data is from the AI Incident Database (AIID).

When citing the database as a whole, please use:

McGregor, S. (2021) Preventing Repeated Real World AI Failures by Cataloging Incidents: The AI Incident Database. In Proceedings of the Thirty-Third Annual Conference on Innovative Applications of Artificial Intelligence (IAAI-21). Virtual Conference.

Pre-print on arXiv · Database snapshots & citation guide

We use weekly snapshots of the AIID for stable reference. For the official suggested citation of a specific incident, use the “Cite this incident” link on each incident page.