Misinformation spread via deepfakes biggest threat to upcoming polls in India: Tenable

New Delhi, March 24 (IANS) Misinformation and disinformation spread through artificial intelligence (AI)-generated deepfakes and fake content are the biggest threat to the upcoming election in India, exposure management company Tenable said on Sunday.

According to the company, these threats will spread across social media and messaging platforms such as WhatsApp, X (formerly Twitter), Instagram, and others.

“The biggest threats to the 2024 Lok Sabha elections are misinformation and disinformation as part of influence operations conducted by malicious actors against the electorate,” Satnam Narang, Senior Staff Research Engineer, Tenable, told IANS.

According to a recent report by Tidal Cyber, India is among the 10 countries facing the highest levels of election cyber-interference threats this year.

Recently, deepfake videos of former US President Bill Clinton and current President Joe Biden were fabricated and circulated to mislead voters ahead of the upcoming US presidential election.

According to experts, deepfake content proliferated rapidly from late 2017, with over 7,900 videos online. By early 2019, that number had nearly doubled to 14,678, and the trend continues to escalate.

“With the increase in generative AI tools and their use growing around the world, we may see deepfakes, be it in images or video content, impersonating notable candidates seeking to retain their seats or those hoping to unseat those currently in parliament,” Narang said.

Recently, the Indian government issued directives to social media platforms such as X and Meta (formerly Facebook), urging them to curb the proliferation of AI-generated deepfake content.

In addition, ahead of the Lok Sabha elections, the Ministry of Electronics &amp; IT (MeitY) issued an advisory directing these platforms to remove AI-generated deepfakes.

According to Tenable, the easiest way to identify a deepfake image is to look for text that is nonsensical or that appears almost alien-like in its lettering.

–IANS


