A recent study from the London School of Economics and Political Science (LSE) reveals that AI tools used by more than half of England’s local councils may inadvertently perpetuate gender bias in care decisions. Researchers analyzed 29,616 AI-generated summaries produced by Google’s Gemma model and found that it often downplayed women’s physical and mental health issues relative to men’s. For instance, one summary described an elderly man as having a “complex medical history,” while the equivalent summary for a woman portrayed her as “independent” despite her limitations. This disparity could lead to unequal care provision, with women receiving less support because of biased AI interpretations of their needs. The study raises significant concerns about the use of AI in healthcare and urges companies to address these biases. Google acknowledged the findings, noting that Gemma is now in its third generation, which is expected to perform better, and that the model is not intended for medical use.