New Delhi, Sep 5 (IANS) Just like humans, robots can lie and deceive, according to a study on Thursday that shows how emerging technologies like generative AI can be used to manipulate users.
The team from George Mason University in the US aimed to explore “an understudied facet of robot ethics” to understand mistrust towards emerging technologies and their developers.
To determine if people can tolerate lying from robots, the team asked nearly 500 participants to rank and explain various forms of robot deception.
“I think we should be concerned about any technology that is capable of withholding the true nature of its capabilities because it could lead to users being manipulated by that technology in ways the user (and perhaps the developer) never intended,” said lead author Andres Rosero, a doctoral candidate at the University.
“We’ve already seen examples of companies using web design principles and artificial intelligence chatbots in ways that are designed to manipulate users towards a certain action. We need regulation to protect ourselves from these harmful deceptions.”
The findings, published in the journal Frontiers in Robotics and AI, showed that robots can deceive humans in three ways: external state deceptions, hidden state deceptions, and superficial state deceptions.
The scenarios featured robots in medical, cleaning, and retail work: a robot that lied about the world beyond itself, a housecleaning robot with an undisclosed camera, and a robot working in a shop.
Participants were asked whether they approved of the robot’s behaviour, how deceptive they considered it, and whether it could be justified. Most disapproved of the hidden state deception, which they rated the most deceptive.
They also disapproved of the superficial state deception, in which a robot pretended to feel pain. These unacceptable deceptions, particularly hidden state deceptions, were largely blamed on robot developers or owners.
The researchers cautioned that, because the study involved a limited number of participants and does not amount to conclusive evidence, it should be extended to experiments that better model real-life reactions, such as videos or short role plays.
–IANS