
Law’s Sharona Hoffman and computer science’s Andy Podgurski publish piece on discrimination in medical AI

There is no argument that artificial intelligence (AI) holds great promise for improved health care outcomes, from analyzing tumor images to helping doctors choose among treatment options to combatting the COVID-19 pandemic.

But a husband-and-wife research duo from Case Western Reserve University warns that AI also comes with substantial new hazards.

In their article, published in the Yale Journal of Health Policy, Law and Ethics, Sharona Hoffman and her husband, Andy Podgurski, outline concerns over algorithmic discrimination, a particular type of health care harm that has, so far, evaded legal scrutiny.

Hoffman is the Edgar A. Hahn Professor of Law, a professor of bioethics, and co-director of the Law-Medicine Center; Podgurski is a professor of computer and data sciences.

The pair noted that a well-known example is an algorithm used to identify candidates for “high risk care management” programs, which routinely failed to refer racial minorities for these beneficial services. “Furthermore, some algorithms deliberately adjust for race in ways that hurt minority patients,” Podgurski and Hoffman wrote. “For example, according to a 2020 New England Journal of Medicine article, algorithms have regularly underestimated African Americans’ risks of kidney stones, death from heart failure, and other medical problems.”
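To make that mechanism concrete, here is a minimal, hypothetical sketch in Python of how an explicit race adjustment can shift a clinical risk estimate. The model form, the coefficients, and the function name risk_score are invented for illustration only; they do not reproduce any algorithm discussed in the article.

import math

def risk_score(age: float, biomarker: float, is_black: bool) -> float:
    """Toy logistic risk model with an explicit race adjustment term.

    All coefficients are invented for illustration only.
    """
    logit = -4.0 + 0.03 * age + 0.8 * biomarker
    if is_black:
        # A negative race coefficient lowers the estimated risk for Black
        # patients with otherwise identical inputs -- the pattern the
        # NEJM article describes as underestimating their risk.
        logit -= 0.5
    return 1.0 / (1.0 + math.exp(-logit))

# Two patients with identical clinical inputs receive different scores:
print(risk_score(60, 2.0, is_black=False))  # about 0.35
print(risk_score(60, 2.0, is_black=True))   # about 0.25

Because care referrals are often triggered by a score crossing a fixed threshold, even a modest adjustment like this can systematically exclude one group from beneficial services.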

Hoffman noted that algorithmic discrimination in medicine can violate civil rights laws, such as Title VI of the Civil Rights Act of 1964 and Section 1557 of the Affordable Care Act, when it exacerbates health disparities or perpetuates inequities. The article urges that algorithmic fairness be a key element in designing, implementing, and validating AI, and that both legal and technical tools be deployed to promote fairness.

The authors called for a reintroduction of the disparate impact theory as a robust litigation tool in the health care arena and for the passage of an algorithmic accountability act.

Read their paper.