Eight in 10 respondents (81.4 percent) supported obtaining patient consent for the use of AI models during treatment decisions. In a scenario in which an AI decision model selected a different treatment regimen than the oncologist planned to recommend, the largest share of respondents (36.8 percent) said they would present both options and let the patient decide, and respondents from academic settings were more likely than those from other settings to do so (odds ratio, 2.56). Just over three-quarters of respondents (76.5 percent) agreed that oncologists should protect patients from biased AI tools, but only 27.9 percent were confident in their ability to identify poorly representative AI models.
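For readers less familiar with odds ratios, here is a minimal sketch of how a figure like 2.56 is derived from a 2×2 table. The counts below are entirely hypothetical, chosen only so the arithmetic lands near that value; they are not taken from the survey.

```python
# Hypothetical counts, NOT survey data: respondents who would vs. would not
# let the patient decide, split by practice setting.
academic_yes, academic_no = 60, 40   # academic setting
other_yes, other_no = 37, 63         # all other settings

# Odds ratio = (odds in the academic group) / (odds in the other group).
odds_academic = academic_yes / academic_no
odds_other = other_yes / other_no
odds_ratio = odds_academic / odds_other

print(f"odds ratio = {odds_ratio:.2f}")  # ~2.55 with these made-up counts
```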
-
Wednesday Vibes 😎 ASCO is behind us; time flies. According to information shared in an ASCO AI Community of Practice discussion, more AI-related abstracts were submitted than ever before. So many great sessions!!

With all this activity, you may have missed that ASCO released Guiding Principles for AI in oncology: https://lnkd.in/ebCUBGDB These include transparency, informed stakeholders, equity and fairness, accountability, oversight and privacy, and human-centered application.

I especially appreciate the discussion around the need for understanding of AI and human-centered application. It is critical that patients have a seat at the table in these discussions as well. I hope we'll also see more discussion of the implications of using AI to communicate information and disseminate data. This, too, affects the entire healthcare community. Our words matter, content matters -- and language is the 'currency' of these systems.
ASCO Sets Six Guiding Principles for AI in Oncology
society.asco.org
-
The American Society of Clinical Oncology (ASCO) released "Principles for the Responsible Use of Artificial Intelligence in Oncology" to guide the Society's consideration of all aspects of artificial intelligence (AI). With this manuscript, ASCO joins colleagues across medicine in offering principles that should be applied in the development and implementation of AI. Colleagues, give this a read and then perhaps read it again!
ASCO Sets Six Guiding Principles for AI in Oncology
society.asco.org
-
This article emphasizes that AI can greatly improve cancer care but risks introducing racial and gender biases when models are trained on biased data. To create ethical AI, researchers must diversify data, address health inequities, and balance performance across all groups. AI in oncology must be carefully designed to avoid exacerbating disparities, and regulatory bodies should ensure AI tools are tested across representative populations. Collaboration between experts is key to making AI fairer for everyone. Mendel.ai is open to collaborating on studies to validate our technology and reduce physician/clinician burnout. https://lnkd.in/d_WtZ_FG #MendelAI #clinicalAI #AIinHealthcare #ethicalAI
Limiting bias in AI models for improved and equitable cancer care - Nature Reviews Cancer
nature.com
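As a rough illustration of what "balance performance across all groups" can mean in practice, here is a minimal sketch that audits a model's sensitivity separately for each demographic subgroup. The subgroup labels, predictions, and outcomes are all made up for illustration; nothing here comes from the Nature Reviews Cancer article.

```python
from collections import defaultdict

# Hypothetical audit records: (subgroup, true_label, predicted_label)
# for a cancer-detection model. 1 = cancer present / predicted.
records = [
    ("group_a", 1, 1), ("group_a", 1, 0), ("group_a", 1, 1), ("group_a", 0, 0),
    ("group_b", 1, 0), ("group_b", 1, 0), ("group_b", 1, 1), ("group_b", 0, 0),
]

# Tally true positives and false negatives per subgroup.
tp, fn = defaultdict(int), defaultdict(int)
for group, truth, pred in records:
    if truth == 1:
        if pred == 1:
            tp[group] += 1
        else:
            fn[group] += 1

# Sensitivity (recall) per subgroup; a large gap between groups is a red flag
# that the model may under-serve one population.
for group in sorted(tp.keys() | fn.keys()):
    sensitivity = tp[group] / (tp[group] + fn[group])
    print(f"{group}: sensitivity = {sensitivity:.2f}")
```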
-
Despite receiving adjuvant aromatase inhibitor (AI) therapy for nearly two years, a proportion of patients failed to achieve an adequate threshold of estradiol (E2) suppression. Our findings emphasize the importance of monitoring serum E2 levels during adjuvant AI therapy, particularly within the first two years. Further research is needed to build a more comprehensive understanding of E2 monitoring.
Clinical significance of serum estradiol monitoring in women receiving adjuvant aromatase inhibitor for hormone receptor-positive early breast cancer
sciencedirect.com
-
Can’t wait to hear more about this research. It feels like a new CPT code should be in the works (probably several), along with new regulations, or at least guidelines, on AI for healthcare. #aiforgood #bcs #womenshealth #breastcancerresearch #fda #ama #ai #healthcareai
AI advances in breast cancer detection boost early diagnosis
local12.com
-
#LLMs offer people with cancer the WAZE-like guide they need to navigate the complexities of care. "Dave AI" is the #AI co-pilot everyone with this diagnosis should be able to access. My Medika Life interview with Eliran Malki, Belong.Life Co-Founder, CEO, and an inventor of this incredible 24/7 info resource, shows how it improves #patientengagement with #health professionals and offers rapid connection to #clinicaltrials. The full-length interview reinforces that #AI, #ChatGPT, and #LLMs aren't just nifty tech toys; they are examples of how smart tech can sustain and potentially save people's lives. https://lnkd.in/gpFhyew8
LLM Cancer Mentor "Dave AI" Offers WAZE-like 24/7 Personalized Support, Making it a Game-Changer in Patient Care - Medika Life
https://medika.life
-
In this week's "Thought Leader Series" post, learn more about how OpenAI is used to help doctors improve cancer patient outcomes by automating the analysis process. Read the full article here: https://bit.ly/3VAhkNK #Healthcare #AI #Technology #DataAnalysis #OpenAI Reach out to a Trexin Consulting Advisor today and begin preparing for an AI future: https://bit.ly/3jgLNB7
Color Health uses OpenAI to develop cancer screening copilot for doctors
healthcareitnews.com
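For context, here is a minimal sketch of the general pattern described above — an LLM turning an unstructured clinical note into a short, reviewable summary — using the OpenAI Python SDK. This is not Color Health's or Trexin's actual pipeline; the model name, prompt, and note text are assumptions for illustration only.

```python
# General pattern only: an LLM summarizes a clinical note for clinician review.
# Not an actual production pipeline; model choice and prompt are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

note = "58 y/o patient, last mammogram 2019, first-degree family history of breast cancer."  # hypothetical note

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model name
    messages=[
        {
            "role": "system",
            "content": "List any cancer screenings that appear overdue in the note. "
                       "Flag uncertainty explicitly; a clinician makes the final call.",
        },
        {"role": "user", "content": note},
    ],
)

print(response.choices[0].message.content)
```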
-
https://zurl.co/MJdG Incredible performance from this AI. AI is absolutely transforming personalized medicine.
io9’s AI cancer biomarker test outperforms NGS in new study data
clinicaltrialsarena.com
-
🌐 The latest AI Act highlights the importance of addressing biases in AI models. In many discussions, whether with clients or during panels, the term "bias" is often used loosely, without a clear understanding of what it truly entails beyond whether a specific population is represented.

🩺 To identify concrete examples of bias, we can look at current practices in medicine. This is explored in Usha Lee McFarling's latest article on STAT News, part of a series called "Embedded Bias." The article discusses how Black patients with low T cell counts are automatically classified as being in poor health, even though this count is normal for that population.

🔍 This bias in interpretation stems from a narrow perspective in which medical standards are based on a Caucasian model. Addressing such biases is crucial for AI providers as they strive to build models that are more accurate, reliable, and fair. In this way, AI has the potential to correct decades of bias in traditional medicine, benefiting both patients and healthcare providers. https://lnkd.in/duy2c_49 #bias #healthcare #AI
She was told she might have cancer: How medicine pathologizes Black patients’ normal test results
https://www.statnews.com
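To make the structural point concrete, here is a minimal sketch contrasting a single "one-size" reference limit with population-aware limits. Every number and population label below is a made-up placeholder, not a clinical value from the STAT article; the point is only that a limit derived from one population can flag results that are normal in another.

```python
# Placeholder numbers only -- not clinical reference values.
ONE_SIZE_LOWER_LIMIT = 1.5          # single lower limit derived from one reference population
POPULATION_AWARE_LIMITS = {         # hypothetical population-specific lower limits
    "population_a": 1.5,
    "population_b": 1.0,
}

def interpret(value: float, population: str) -> dict:
    """Flag a lab value as 'low' under both interpretation rules."""
    return {
        "flagged_by_one_size_rule": value < ONE_SIZE_LOWER_LIMIT,
        "flagged_by_population_aware_rule": value < POPULATION_AWARE_LIMITS[population],
    }

# A value that is normal for population_b is still flagged by the one-size rule,
# which is the pattern of over-pathologizing described in the article.
print(interpret(1.2, "population_b"))
# {'flagged_by_one_size_rule': True, 'flagged_by_population_aware_rule': False}
```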