New York, Aug 26 (IANS) A detailed analysis of more than 500 medical AI devices revealed that approximately half of the tools authorised by the US Food and Drug Administration (FDA) lacked reported clinical validation data, researchers said on Monday.
The findings, published in the journal Nature Medicine, warned that almost half of FDA-approved AI medical devices were not validated using real patient data.
“Although AI device manufacturers boast of the credibility of their technology with FDA authorisation, clearance does not mean that the devices have been properly evaluated for clinical effectiveness using real patient data,” said Sammy Chouffani El Fassi from the University of North Carolina (UNC) School of Medicine and research scholar at Duke Heart Center in the US.
AI has practically limitless applications in healthcare, ranging from auto-drafting patient messages to optimising organ transplantation and improving tumour removal accuracy.
Despite their potential benefits, these tools have been met with scepticism because of concerns about patient privacy, the possibility of bias, and device accuracy.
To find out more, a team of researchers at the UNC School of Medicine, Duke University, Ally Bank, Oxford University, Columbia University, and the University of Miami set out to build public trust and evaluate how exactly AI and algorithmic technologies are being approved for use in patient care.
Of the 521 device authorisations, 144 were labelled as “retrospectively validated,” 148 were “prospectively validated,” and 22 were validated using randomised controlled trials.
Most notably, 226 of 521 FDA-approved medical devices, or approximately 43 per cent, lacked published clinical validation data, the team found.
A few of the devices were validated using "phantom images", computer-generated images that were not from real patients, which does not technically meet the requirements for clinical validation.
Since 2016, the average number of medical AI device authorisations by the FDA per year has increased from 2 to 69, indicating tremendous growth in commercialisation of AI medical technologies.
“A lot of the devices that came out after 2016 were created new, or maybe they were similar to a product that already was on the market,” said Gail E. Henderson, PhD, professor at the UNC Department of Social Medicine.
Furthermore, the researchers found that the latest draft guidance, published by the FDA in September 2023, does not clearly distinguish between different types of clinical validation studies in its recommendations to manufacturers.
The team hopes the findings will inspire researchers and universities globally to conduct clinical validation studies on medical AI to improve the safety and effectiveness of these technologies.
–IANS
na/