Thu. Dec 12th, 2024

Who is responsible if artificial intelligence fails in hospitals? And other big questions about AI in healthcare, answered


Artificial intelligence (AI) is already being used in healthcare. AI can look for patterns in medical images to help diagnose diseases. It can help predict which patients in a hospital ward might deteriorate. And it can quickly summarize medical research articles to help doctors stay up to date with the latest evidence.

These are examples of AI making or shaping decisions that health professionals previously made themselves. More applications are under development.

But what do consumers think about the use of AI in healthcare? And how should their answers shape how it is used in the future?

What do consumers think?

AI systems are trained to look for patterns in large amounts of data. Based on these patterns, AI systems can make recommendations, suggest diagnoses, or initiate actions. Potentially, they can continually learn and get better at tasks over time.

If we bring together international evidence, including our own research and that of others, it appears that most consumers accept the potential value of AI in healthcare.

This value could include, for example, improving diagnostic accuracy or expanding access to care. At present, these are largely potential benefits rather than proven ones.

But consumers say their acceptance is conditional. They still have serious concerns.

1. Does AI work?

A basic expectation is that AI tools should work well. Consumers often say that AI should be at least as good as a human doctor at the tasks it performs. They say we shouldn’t use AI if it will lead to more misdiagnoses or medical errors.

2. Who is responsible if the AI makes mistakes?

Consumers are also concerned that if AI systems generate decisions (such as diagnoses or treatment plans) without human input, it may be unclear who is responsible for errors. That’s why people often want doctors to remain responsible for final decisions and for protecting patients from harm.

3. Will AI make healthcare less fair?

If health services are already discriminatory, AI systems can learn these patterns from data and repeat or worsen the discrimination. AI used in healthcare could therefore deepen health inequalities. In our studies, consumers said this was not acceptable.

4. Will AI dehumanize healthcare?

Consumers are concerned that AI will eliminate the “human” elements of healthcare and consistently say AI tools should support rather than replace doctors. This is often because AI is perceived to lack important human traits, like empathy. Consumers say that the communication skills, care and contact of a health professional are especially important when they feel vulnerable.

5. Will AI downskill our healthcare workers?

Consumers value human doctors and their expertise. In our research with women about AI in breast screening, women were concerned about the possible effect on radiologists’ skills and experience. Women saw this experience as a valuable shared resource: if there was too much reliance on AI tools, this resource could be lost.

Consumers and communities need a say

The Australian healthcare system cannot focus solely on the technical elements of AI tools. Social and ethical considerations, including high-quality engagement with consumers and communities, are essential in shaping the use of AI in healthcare.

Communities need opportunities to develop digital health literacy: the digital skills to access reliable and trustworthy health information, services and resources.

Respectful engagement with Aboriginal and Torres Strait Islander communities must be fundamental. This includes upholding Indigenous data sovereignty, which the Australian Institute of Aboriginal and Torres Strait Islander Studies describes as:

“The right of indigenous peoples to govern the collection, ownership and application of data on indigenous communities, peoples, lands and resources.”

This includes any use of data to create AI.

This critically important consumer and community engagement must take place before administrators design (further) AI into health systems, before regulators create a guide on how AI should and should not be used, and before doctors consider purchasing a new AI tool for their practice.

We are making some progress. Earlier this year, we conducted a citizen jury on AI in healthcare. We supported 30 diverse Australians, from all states and territories, to spend three weeks learning about AI in healthcare and developing recommendations for policymakers.

Their recommendations, which will be published in an upcoming issue of the Medical Journal of Australia, are based on a recently published national roadmap for the use of AI in healthcare.

That’s not all

Healthcare professionals also need to be trained and supported to use AI in healthcare. They must learn to be critical users of digital health tools, including understanding their advantages and disadvantages.

Our analysis of a series of safety events reported to the US Food and Drug Administration shows that the most serious harms reported to the regulator came not from a faulty device, but from the way consumers and doctors used the device.

We must also consider when healthcare professionals should inform patients that an AI tool is being used for their care and when healthcare workers should request informed consent for that use.

Finally, people involved in every stage of AI development and use must get into the habit of asking themselves: do consumers and communities agree that this is a justified use of AI?

Only then will we have the AI-based healthcare system that consumers really want.

Stacy Carter is Professor and Director of the Australian Centre for Health Engagement, Evidence and Values at the University of Wollongong. Emma Frost is a PhD candidate at the Australian Centre for Health Engagement, Evidence and Values at the University of Wollongong. Farah Magrabi is a Professor of Biomedical and Health Informatics at the Australian Institute for Health Innovation at Macquarie University. Yves Saint James Aquino is a researcher at the Australian Centre for Health Engagement, Evidence and Values at the University of Wollongong. This piece first appeared in The Conversation.
