As a rising chorus of healthcare experts decries the harmful racial bias in clinical decision support tools (CDSTs), the House Ways and Means Committee has released a report spelling out in detail what the problem is, how it affects health equity, and how healthcare stakeholders can address it.
Among the issues described in the report is the widespread practice of applying “racial correction” modifiers to clinical guidelines and algorithms to adjust for supposed biological differences among races and ethnicities — an assumption that has been disproved by abundant scientific evidence. For example, one CDST predicts the odds for success in a vaginal birth after cesarean delivery at 80.9% for White women, 68.4% for Black women, 68.2% for Latina women, and 52.3% for women of mixed Black and Latina heritage. The report notes that the rate of maternal mortality among Black, American Indian, and Latina women is two to three times higher than the rate among White women. The rate of cesarean delivery is also higher, despite the known health benefits of vaginal delivery.
House Ways and Means Committee Chair Richard Neal (D-MA) said in a news release that the feedback he’d gotten on racial/ethnic bias from medical societies and other industry voices was divergent. He said his powerful committee is looking for ways to do something about the unequal treatment of racial and ethnic minorities that has resulted from the bias in CDSTs.
“One thing is for certain: the status quo is unacceptable,” Neal declared. “Racial correction in clinical algorithms contributes to worse outcomes for patients of color receiving treatment for a broad range of conditions, from cancer, to osteoporosis, to end-stage renal disease, to childbirth. The Ways and Means Committee remains committed to working with stakeholders and advancing policies that prioritize justice and equity in our health care system.”
How Bias in CDSTs Can Harm Patients
Part of the impetus for Neal to tackle this challenge, he said, was an article published in 2020 in The New England Journal of Medicine (NEJM) showing that race adjustments in CDSTs and clinical algorithms are harmful to patients with a wide range of conditions.
“Many of these race-adjusted algorithms guide decisions in ways that may direct more attention or resources to White patients than to members of racial and ethnic minorities,” the investigators said. Citing the NEJM study, the authors of the report state, “While the use of race in clinical algorithms is largely driven by differences in health outcomes that are common to large datasets, these differences are most likely due to the effects of racism and other determinants of health, not ‘biological’ effects of one race versus another.
“As the researchers cautioned,” the report continues, “incorporating race data into algorithms can entrench disparities by potentially producing different treatment approaches for individuals that are not based on precision medicine” but on their race or ethnicity.
More broadly, CDSTs “introduce an element of health inequity because they consider race and ethnicity in ways that exacerbate existing racial disparities, reduce the rigor of personalized diagnostics, and result in fewer treatment options for people of color,” the report notes.
Widespread Bias Seen
The report cites racial/ethnic components in a number of widely used CDSTs, including these:
Black patients are systematically scored as being at lower risk for in-hospital death from heart failure.
In determining the glomerular filtration rate of people with kidney disease, a modifier estimates a healthier level of kidney function for Black patients than for White patients, using the same lab result.
Applying a correction factor to pulmonary function tests for Black or Asian patients can result in differences in the timing of diagnosis and in the offering of certain treatments.
Black women are scored as being at lower risk than White women for osteoporosis; Black, Asian, and Latino patients are scored as having lower risk for fracture. Although rates of osteoporotic fractures are higher among White women than Black women, rates of morbidity and mortality from such fractures are higher among Black women.
In short-term surgical risk calculations, Black patients are scored as being at higher risk for death and complications. Asian, Latino, and Alaska Native/Pacific Islander patients are assumed to be at higher risk for complications.
Two CDSTs predict lower 5- or 10-year risk for breast cancer for Black, American Indian, Asian, and Latina women. Rates of breast cancer screening are lower among women of color.
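To illustrate the mechanism behind the kidney-function example above: the 2009 CKD-EPI creatinine equation, widely used to estimate glomerular filtration rate, included a published 1.159 multiplier applied when a patient is coded as Black. The sketch below (with arbitrarily chosen patient values, not drawn from the report) shows how the same lab result yields a higher, seemingly “healthier” estimated GFR solely because of the race flag. A race-free revision of the equation was published in 2021.

```python
def ckd_epi_2009_egfr(scr_mg_dl, age, female, black):
    """Estimated GFR (mL/min/1.73 m^2) per the 2009 CKD-EPI creatinine
    equation, whose published coefficients include a 1.159 multiplier
    for patients coded as Black."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    scr_k = scr_mg_dl / kappa
    egfr = (141
            * min(scr_k, 1.0) ** alpha
            * max(scr_k, 1.0) ** -1.209
            * 0.993 ** age)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159  # the "racial correction" at issue in the report
    return egfr

# Same creatinine level, same age and sex -- only the race flag differs,
# yet the second estimate is 15.9% higher.
base = ckd_epi_2009_egfr(1.2, age=50, female=False, black=False)
adjusted = ckd_epi_2009_egfr(1.2, age=50, female=False, black=True)
```

Because eGFR thresholds gate decisions such as nephrology referral and transplant eligibility, a uniformly inflated estimate can delay care for Black patients, which is exactly the disparity the report describes.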
Industry Response Varies Widely
To gauge industry attitudes toward CDST bias, Neal sent a letter to a number of medical societies and issued a request for information (RFI) to other stakeholders, including academic institutions, health systems, and other professional associations. He received 18 responses to the RFI and 31 responses overall; these are summarized in the report.
“Several societies voiced interest in working in a coalition with other societies to develop a collaborative approach and/or task force to reevaluate the use of race and ethnicity in clinical algorithms,” the report says.
“In contrast, several societies said they were not planning to reevaluate the way their relevant CDSTs use race because their organization did not play a role in creating the algorithm and, thus, it was not their responsibility to reevaluate it. Still, they said, they believed the algorithms are grounded in a solid evidence base….
“Several [societies] suggested removing race and ethnicity data from the clinical algorithms, while others suggested rethinking their inclusion. Others proposed acknowledging race-related risks for patients, focusing on population health, communicating the changes in clinical algorithms to patients once implemented, and implementing algorithms that are shown to decrease inequity.”
Many professional societies “have focused heavily on organizational efforts to increase racial and ethnic diversity in their field and support for researchers from communities of color,” the report says. However, it was unclear how these programs were involved in correcting racial bias in CDSTs, and “none of the professional societies made commitments to improving specific racial inequities [in CDSTs].”
Last May, the American Medical Association released a lengthy report on the need for greater health equity, both inside and outside the AMA. The report set forth a 3-year strategic framework that calls for “ending the use of race-based clinical decision models (including calculators).” The AMA also wants to ensure that “augmented intelligence (AI) is free from harmful, biased algorithms.”
Ken Terry is a healthcare journalist and author. His latest book is “Physician-Led Healthcare Reform: A New Approach to Medicare for All.”