Urgent Action Needed to Address Biased Medical Devices

Immediate action is needed to address the damaging effects of biased medical devices, according to an independent review. The review specifically highlighted the inaccuracy of pulse oximeters for individuals with darker skin tones, which can make it harder to identify dangerous drops in blood oxygen levels. It also warned that devices incorporating artificial intelligence (AI) may underestimate skin cancer in people with darker skin. The review's 18 recommendations call for the urgent development of fairer devices. The government has accepted the report's conclusions.

This review, commissioned in 2022 amid mounting concerns that ethnic minorities faced higher Covid risks, closely examined three types of devices with the potential to cause "substantial" harm to patients. One such device, the pulse oximeter, was widely used during the pandemic to help determine whether hospital admission and treatment were necessary. The review builds on earlier research showing that these devices, which are clipped onto the finger, often overestimate blood oxygen levels in individuals with darker skin tones. Evidence from the US suggests that this can lead to worse health outcomes for black patients. The problem is exacerbated by the fact that these devices are primarily tested and calibrated on individuals with lighter skin tones. The government has taken some action on this issue, including updated NHS guidance on pulse oximeters and increased funding for research on smarter devices. However, it is important that people continue to use pulse oximeters while the issue is addressed, as the devices still provide valuable insight into trends in oxygen levels.

The review was chaired by Prof Dame Margaret Whitehead from the University of Liverpool, who emphasized the urgent need for system-wide action. Prof Whitehead acknowledged the potential benefits of AI in medical devices but warned of the harm they could cause through inherent biases against certain demographic groups. She explained that biases and injustices present in society can unknowingly influence every stage of the AI-enabled medical device lifecycle, and can then be magnified further during algorithm development and machine learning.

One example highlighted in the review is the potential underdiagnosis of skin cancer in individuals with darker skin. This is likely a result of AI systems being trained predominantly on images of lighter skin tones. Another concern relates to AI systems used for analyzing chest x-rays, which are primarily trained on images of men, who typically have larger lung capacities. This may lead to the underdiagnosis of heart disease in women, exacerbating an existing problem. To address these concerns, the government has committed to eliminating bias in datasets and enhancing training for healthcare professionals.

The report also discussed the use of polygenic risk scores to predict an individual’s disease risk. However, similar issues arise in this area due to an overwhelming reliance on data from populations of European ancestry, rendering the results potentially inapplicable to individuals from other backgrounds. Additionally, these scores are only predictive and cannot definitively determine whether someone will develop a disease.
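To make the concern concrete: a polygenic risk score is typically computed as a weighted sum of an individual's risk-allele counts, where the weights (effect sizes) are estimated from a study cohort. The sketch below is a minimal illustration of that calculation; the variant names and effect sizes are hypothetical, not drawn from the review. Because the weights are estimated in a particular population, often one of mostly European ancestry, the resulting scores may transfer poorly to people from other backgrounds.

```python
# Minimal sketch of a polygenic risk score (PRS) as a weighted sum.
# Variant IDs and effect sizes here are hypothetical illustration values.

def polygenic_risk_score(allele_counts, effect_sizes):
    """Sum over variants: risk-allele count (0, 1, or 2) times the
    effect size estimated for that variant in a study cohort."""
    return sum(allele_counts[v] * effect_sizes[v] for v in effect_sizes)

# Hypothetical effect sizes, estimated from some reference cohort.
effect_sizes = {"rsA": 0.12, "rsB": 0.30, "rsC": 0.05}

# One individual's genotype: number of risk alleles at each variant.
genotype = {"rsA": 2, "rsB": 1, "rsC": 0}

score = polygenic_risk_score(genotype, effect_sizes)
print(round(score, 2))  # 0.54
```

The score itself is only a relative risk estimate within the population the weights came from, which is why the review stresses both the ancestry-bias problem and the fact that such scores cannot definitively determine whether someone will develop a disease.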

Prof Habib Naqvi, the chief executive of the NHS Race and Health Observatory, welcomed the findings of the review. He emphasized that access to good healthcare should not be determined by ethnicity or skin color, stressing the importance of medical devices being suitable for all communities. Prof Naqvi also highlighted the lack of diverse representation in health research and the insufficient consideration of equity, which together lead to racial bias in medical devices, clinical assessments, and other healthcare interventions.

Andrew Stephenson, the Minister of State in the Department of Health, emphasized the significance of the review, stating that ensuring healthcare accessibility for all, regardless of ethnicity, aligns with the nation’s values. He emphasized the government’s commitment to creating a fairer and more straightforward NHS.

In conclusion, swift action is required to address biases in medical devices that disproportionately affect individuals with darker skin tones. The independent review highlights the need for fairer designs and emphasizes the potential harm that can arise from inherent biases in AI-enabled medical devices. It is essential to address biases and injustices present in society at every stage of the device lifecycle to ensure equitable healthcare for all communities.


Written By

Jiri Bílek

In the vast realm of AI and U.N. directives, Jiri crafts tales that bridge tech divides. With every word, he champions a world where machines serve all, harmoniously.