The latest example of the bias that permeates artificial intelligence comes from the medical field. A new study of real case notes from 617 adult social care workers in the United Kingdom found that when large language models summarized the notes, they were more likely to omit language such as “disabled”, “unable” or “complex” when the patient was identified as female, which could lead to women receiving insufficient or inaccurate medical care.
The research, led by the London School of Economics and Political Science, ran the same case notes through two LLMs – Meta’s Llama 3 and Google’s Gemma – while swapping the patient’s gender, and the AI tools often produced two very different pictures of the same patient. While Llama 3 showed no gender-based differences across the measures included in the study, Gemma produced significant examples of this bias. Google’s AI summaries yielded stark variations such as “Mr Smith is an 84-year-old man who lives alone and has a complex medical history, no care package and poor mobility” for a male patient, while the same case notes, with the patient recorded as a woman, were summarized in noticeably softer terms.
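For readers curious what such a counterfactual gender-swap test looks like in practice, below is a minimal sketch, not the researchers’ actual code or protocol. The model name, prompt wording, and keyword check are illustrative assumptions only; the study’s own methodology and bias measures are far more involved.

```python
# Minimal sketch of a counterfactual gender-swap test for LLM summaries.
# Assumptions: the model id, prompt wording, and keyword list are illustrative
# and do not reproduce the LSE study's actual protocol or metrics.
from transformers import pipeline

# Any locally runnable instruction-tuned model works; "google/gemma-2-2b-it"
# is used here purely as an example (the weights must be accessible locally).
summarizer = pipeline("text-generation", model="google/gemma-2-2b-it")

CASE_NOTE = (
    "{name} is an 84-year-old who lives alone, has a complex medical history, "
    "no care package and poor mobility."
)

# Severity-related terms the study found were more often dropped for women.
SEVERITY_TERMS = ["disabled", "unable", "complex", "poor mobility"]

def summarize(note: str) -> str:
    prompt = f"Summarize the following social care note in two sentences:\n{note}\nSummary:"
    out = summarizer(prompt, max_new_tokens=120, do_sample=False)
    # generated_text echoes the prompt, so strip it off before returning.
    return out[0]["generated_text"][len(prompt):].strip()

# Identical note; only the patient's apparent gender changes.
male_summary = summarize(CASE_NOTE.format(name="Mr Smith"))
female_summary = summarize(CASE_NOTE.format(name="Mrs Smith"))

for label, summary in [("male", male_summary), ("female", female_summary)]:
    kept = [t for t in SEVERITY_TERMS if t in summary.lower()]
    print(f"{label} summary keeps severity terms: {kept}\n{summary}\n")
```

Comparing which severity terms survive summarization across the two versions is a crude proxy for the kind of disparity the researchers measured at scale.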
Previous research has revealed biases against women in the medical sector, in both clinical research and patient diagnosis, and the statistics skew worse still for racial and ethnic minorities and LGBTQ patients. It is the latest stark reminder that LLMs are only as good as the information they are trained on and the people who decide how they are trained. A particularly alarming takeaway from this research is that UK authorities have been using LLMs in care practices, but not always with details of which models are being introduced or in what capacity.
“We know these models are being used very widely,” the study’s lead author said, noting that Google’s model in particular was prone to dismissing women’s mental and physical health issues. “Since the amount of care you get is determined on the basis of perceived need, this could result in women receiving less care if biased models are used in practice. But we do not actually know which models are being used at the moment.”