Medical AI Underestimates Symptoms in Women and Some Ethnic Groups


Artificial intelligence (AI) tools designed to assist medical professionals may be underestimating the symptoms of various illnesses in women and certain racial and ethnic groups. This bias could lead to those populations receiving less thorough or lower-quality medical treatment. The issue was brought to light by the Financial Times, which cited several independent scientific studies.

According to the FT's report, many doctors are increasingly integrating both general-purpose chatbots, such as Gemini and ChatGPT, and specialized medical AI tools into their daily practice. These tools are commonly used for tasks such as populating patient charts, pinpointing crucial symptoms, and summarizing complex medical histories.

In June, a team of researchers from the Massachusetts Institute of Technology (MIT) analyzed several prominent chatbots and AI tools, including GPT-4 and Palmyra-Med. They found that many of these platforms recommended a significantly lower standard of care for women and frequently suggested self-treatment options for female patients. A separate MIT study revealed strikingly similar discriminatory patterns when the tools were used to provide medical recommendations for Black and Asian individuals.

A further MIT study indicated a linguistic bias: chatbots were 7–9% less likely to recommend a doctor's consultation for patients whose messages contained numerous typographical and grammatical errors, as is common when English is not the writer's native language, than for patients whose writing was more accurate. Meanwhile, academics at the London School of Economics found that Google's Gemma AI model, which is deployed in many social care institutions across the UK, consistently underestimated physical and mental health issues in women when tasked with compiling and synthesizing their medical records.

This article is a rephrased and translated summary based on reports, including those by the Financial Times.