
Russia, China, Iran using AI tools to influence US presidential election

Russia, China, and Iran are all using AI tools to shape the US presidential election and persuade Americans against voting Kamala Harris into power, according to US officials.

Daniel Croft
Wed, 25 Sep 2024

According to a release by the US Office of the Director of National Intelligence (ODNI), the US Intelligence Community (IC) has detected the use of AI technology by the three nations to influence the presidential election.

“The IC is observing foreign actors, including Russia and Iran, use generative AI technology to boost their respective US election influence efforts,” the release said.

“Methods to accomplish this include laundering material through prominent figures, publishing on inauthentic social media accounts or websites pretending to be legitimate news outlets, or releasing supposed ‘leaks’ of AI-generated content that appear sensitive or controversial.”


The ODNI adds that Russia is the top offender when it comes to creating AI content to influence the election, creating content “across all four mediums – text, images, audio and video”.

Russia has reportedly been using the content to promote former president Donald Trump and “denigrate” current Vice-President and Democratic presidential candidate Kamala Harris.

“For example, the IC assesses Russian influence actors were responsible for staging a video in which a woman claims she was the victim of a hit-and-run car accident by the Vice-President and altering videos of the Vice-President’s speeches,” the ODNI said, referring to a fabricated story claiming that Harris hit a 13-year-old girl with a car in 2011 and fled the scene.

China’s AI-generated content, by contrast, has been used to shape US views of China as a whole and of specific political issues.

“For example, pro-China online actors this year have used AI-generated news anchors and inauthentic social media accounts with AI-generated profile pictures to sow divisions on issues such as drug use, immigration, and abortion.”

Iran has reportedly used AI tools to write fake news articles and social media posts attributed to sham publications posing as legitimate news outlets.

“This content, in both English and Spanish, has targeted US voters across the political spectrum on polarising issues, such as the Israel–Gaza conflict and on the presidential candidates,” the ODNI said.

ChatGPT creator OpenAI announced last month it had banned a swarm of Iranian ChatGPT accounts for creating false and misleading content relating to the US election.

In a post uploaded to OpenAI’s site on 18 August, the company said it detected accounts that were “generating content for a covert Iranian influence operation identified as Storm-2035”.

“We have banned these accounts from using our services and we continue to monitor for any further attempts to violate our policies.

“The operation used ChatGPT to generate content focused on a number of topics – including commentary on candidates on both sides in the US presidential election – which it then shared via social media accounts and websites,” it said.

Storm-2035, alongside six other Iranian threat groups, was identified in a Microsoft report last month as conducting “cyber-enabled influence operations” relating to the US election.

The Microsoft report, which was released on 9 August, specified that Storm-2035 was making use of AI to generate disinformation and misinformation, which would then be spread on social media.

The group reportedly established four websites that posed as legitimate news outlets, all of which had been operating since at least 2020.

OpenAI also identified that the group was using ChatGPT to create this content, with two main types of content being generated – short social media comments and long-form articles.

The ODNI release said these foreign AI misinformation efforts are only becoming more common as the election approaches.

“The IC continues to judge that foreign actors are increasing their election influence activities as we approach November.”

Daniel Croft

Born in the heart of Western Sydney, Daniel Croft is a passionate journalist with an understanding of, and experience writing in, the technology space. Having studied at Macquarie University, he joined Momentum Media in 2022, writing across a number of publications including Australian Aviation, Cyber Security Connect and Defence Connect. Outside of writing, Daniel has a keen interest in music, and spends his time playing in bands around Sydney.
