
Study warns patients not to rely on AI chatbots for drug information

New Delhi, Oct 11 (IANS) Artificial Intelligence (AI) powered search engines and chatbots may not always provide accurate and safe information on drugs, and patients should not rely on them, a study warned on Friday.

Researchers from Belgium and Germany conducted the study and found that many answers were wrong or potentially harmful.

In the paper, published in the journal BMJ Quality and Safety, they said that the answers provided by the AI chatbot may be too complex for patients to understand and might require degree-level education to read.

With the introduction of AI-powered chatbots, search engines underwent a significant shift in 2023. The upgraded versions offered enhanced search results, comprehensive answers, and a new type of interactive experience.

While the chatbots — trained on extensive datasets from the entire internet — can answer any healthcare-related queries, they are also capable of generating disinformation and nonsensical or harmful content, said the team from the Friedrich-Alexander-Universität Erlangen-Nürnberg in Germany.

“In this cross-sectional study, we observed that search engines with an AI-powered chatbot produced overall complete and accurate answers to patient questions,” they write.

“However, chatbot answers were largely difficult to read and answers repeatedly lacked information or showed inaccuracies, possibly threatening patient and medication safety,” they add.

For the study, the researchers explored the readability, completeness, and accuracy of chatbot answers to queries on the 50 most frequently prescribed drugs in the US in 2020. They used Bing Copilot, a search engine with AI-powered chatbot features.

Only half of the 10 questions were answered with the highest completeness. Further, chatbot statements did not match the reference data in 26 per cent of answers and were fully inconsistent in more than 3 per cent of cases.

About 42 per cent of these chatbot answers were considered likely to lead to moderate or mild harm, and 22 per cent to death or severe harm.

The team noted that a major drawback was the chatbot’s inability to understand the underlying intent of a patient question.

“Despite their potential, it is still crucial for patients to consult their healthcare professionals, as chatbots may not always generate error-free information,” the researchers said.

–IANS

rvt/

