
AI bias is a human, not a technical, issue

“It’s as much an issue of society as it is about algorithms,” says Trishan Panch, co-author of a study on AI bias.
By admin
Apr 28, 2023, 9:41 AM

Artificial Intelligence (AI) has become an essential technological advance in healthcare. From planning operations to streamlining prior authorizations, AI-powered technology is seen as a panacea for healthcare industry woes such as staffing shortages and clinician burnout.

Though AI is widely used in medical research, diagnostics, and patient care, there are growing concerns over bias in AI algorithms. Bias in data can stem from several factors, such as a lack of diversity in the dataset, historical discrimination, and cultural biases.

“It’s as much an issue of society as it is about algorithms,” says Trishan Panch, co-author of a 2019 paper on algorithmic bias in AI.

“If you look at algorithmic bias as just a technical issue, it will beget engineering solutions — how can you restrict certain fields such as race or gender from the data, for example. But that won’t really solve the problem alone. If the world looks a certain way, that will be reflected in the data, either directly or through proxies, and thus in the decisions.”
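That dynamic is easy to demonstrate. The sketch below is a toy Python illustration with entirely made-up data, not a real clinical model: it trains a classifier that never sees the race field at all, yet still produces race-correlated decisions, because a proxy field (here, zip code) carries the same signal.

```python
# A minimal sketch, on made-up data, of how a proxy field (zip code)
# can carry a restricted field's (race's) signal anyway.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

# Two hypothetical groups; zip code is 90% correlated with group.
race = rng.integers(0, 2, n)
zip_code = np.where(rng.random(n) < 0.9, race, 1 - race)

# A historical outcome that was itself influenced by group membership.
outcome = (rng.random(n) < np.where(race == 1, 0.8, 0.3)).astype(int)

# "Restrict the race field": train on zip code alone.
model = LogisticRegression().fit(zip_code.reshape(-1, 1), outcome)
pred = model.predict(zip_code.reshape(-1, 1))

# The model never saw race, yet its decisions still track it closely.
for g in (0, 1):
    print(f"group {g}: positive prediction rate = {pred[race == g].mean():.2f}")
```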

If an AI algorithm is trained on data that primarily represents one gender, race, or ethnicity, it can produce biased results when applied to other genders, races, or ethnicities.

For example, the Framingham Risk Score, a commonly used predictor of cardiovascular disease, is accurate for Caucasian patients but loses accuracy when applied to Asian, Black, Latinx, and low-income populations.
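One practical consequence is that models need to be evaluated per subgroup, not just in aggregate. The short Python sketch below, again with made-up numbers rather than real clinical data, shows how a respectable overall accuracy can conceal a much weaker result on an under-represented group.

```python
# A minimal sketch (hypothetical numbers) of stratified evaluation:
# an overall metric can hide exactly the kind of subgroup gap the
# Framingham example illustrates.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
group = rng.choice(["majority", "minority"], n, p=[0.85, 0.15])
y_true = rng.integers(0, 2, n)

# Pretend the model is 90% accurate on the well-represented group but
# only 65% accurate on the under-represented one.
acc = np.where(group == "majority", 0.90, 0.65)
y_pred = np.where(rng.random(n) < acc, y_true, 1 - y_true)

print(f"overall accuracy: {(y_pred == y_true).mean():.2f}")
for g in ("majority", "minority"):
    m = group == g
    print(f"{g} accuracy: {(y_pred[m] == y_true[m]).mean():.2f}")
```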

The development of AI algorithms depends on data inputs, and if that data is biased, the AI system will be biased too, reinforcing existing health disparities based on socioeconomic status, race, ethnicity, and gender.

One study found that an algorithm used by hospitals to identify which patients would benefit from extra medical attention systematically underestimated the health needs of Black patients while overestimating those of white patients.

The algorithm relied on healthcare spending as a measure of illness, and because Black patients tend to spend less on healthcare than white patients with the same conditions, the algorithm failed to identify Black patients’ health needs.
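The mechanism is simple enough to simulate. In the hypothetical Python sketch below, both groups have identical true illness burdens, but one group spends less for the same level of need; an algorithm that ranks patients by spending then flags far fewer of them for extra care.

```python
# A minimal sketch (hypothetical numbers) of the mechanism described
# above: if spending is the label, equal need does not mean equal score.
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
group = rng.choice(["white", "black"], n)

# Assume both groups have the same true illness burden...
need = rng.normal(loc=5.0, scale=1.0, size=n)

# ...but unequal access means one group spends less for the same need.
access = np.where(group == "white", 1.0, 0.7)
spending = need * access + rng.normal(0, 0.2, n)

# An algorithm trained on spending ranks patients by spending, so it
# flags the top decile of spenders for extra care.
flagged = spending >= np.quantile(spending, 0.9)

for g in ("white", "black"):
    m = group == g
    print(f"{g}: mean need = {need[m].mean():.2f}, "
          f"flagged for extra care = {flagged[m].mean():.1%}")
```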

Panch continues, “There will probably always be some amount of bias, because the inequities that underpin bias are in society already and influence who gets the chance to build algorithms and for what purpose.”

The solution, he says, “will require normative action and collaboration between the private sector, government, academia, and civil society.”

Currently, some researchers are opting to include synthetic patients to compensate for the lack of substantial data on women and minority patients.
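One common family of techniques for generating such synthetic records, sketched below with made-up features, interpolates between real patients from the under-represented group (the idea behind methods like SMOTE). A production pipeline would use validated, audited tooling rather than this toy version.

```python
# A minimal sketch of one way "synthetic patients" can be generated:
# SMOTE-style interpolation between real under-represented records.
import numpy as np

rng = np.random.default_rng(3)

# 20 real records from an under-represented group: [age, systolic_bp].
real = rng.normal(loc=[55, 130], scale=[10, 15], size=(20, 2))

def synthesize(records: np.ndarray, k: int, rng) -> np.ndarray:
    """Create k synthetic rows by interpolating random pairs of real rows."""
    i = rng.integers(0, len(records), k)
    j = rng.integers(0, len(records), k)
    t = rng.random((k, 1))                  # interpolation weights in [0, 1)
    return records[i] + t * (records[j] - records[i])

synthetic = synthesize(real, k=100, rng=rng)
augmented = np.vstack([real, synthetic])    # 120 rows for training
print(augmented.shape, augmented.mean(axis=0).round(1))
```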

Others are looking to industry-wide solutions to patch algorithmic ailments.

The first is the threat of class-action litigation. When researchers or other professionals can demonstrate bias in an algorithm’s results, class-action lawsuits create an incentive for private companies to make changes, or to proactively identify and address biases before they become a problem.

The second is regulation to help manage bias in AI. Formal AI regulations are not yet fully developed, and existing rules aim to prevent unfair judgments by removing fields closely tied to demographics, such as race, gender, socioeconomic status, and disability. But this is complicated, because those same factors are essential for building healthcare algorithms that serve the needs of these specific groups.

Ultimately, addressing the biases in AI algorithms will require a multi-pronged approach that recognizes the role of society and stakeholders in creating an equitable healthcare system for all.

