Most Consumers Can’t Identify AI-Generated Fakes
Take the Deepfake Detection Test
New research from iProov, the world’s leading provider of science-based solutions for biometric identity verification, reveals that most people can’t identify deepfakes – those incredibly realistic AI-generated videos and images often designed to impersonate people. The study tested 2,000 UK and US consumers, exposing them to a series of real and deepfake content. The results are alarming: only 0.1% of participants could accurately distinguish real from fake content across all stimuli, which included both images and videos.

Key Findings:
- Deepfake detection fails: Just 0.1% of respondents correctly identified all deepfake and real stimuli (images and videos) in a study where participants were primed to look for deepfakes. In real-world scenarios, where people are less aware, vulnerability to deepfakes is likely even higher.
- Older generations are more vulnerable to deepfakes: The study found that 30% of 55-64 year olds and 39% of those aged 65+ had never even heard of deepfakes, highlighting a significant knowledge gap and increased susceptibility to this emerging threat among these age groups.
- Video challenge: Deepfake videos proved more difficult to spot than deepfake images, with participants 36% less likely to correctly identify a synthetic video than a synthetic image. This vulnerability raises serious concerns about the potential for video-based fraud, such as impersonation on video calls or in scenarios where video is used for identity verification.
- Deepfakes are everywhere but misunderstood: While concern about deepfakes is rising, many remain unaware of the technology. One in five consumers (22%) had never even heard of deepfakes before the study.
- Overconfidence is rampant: Despite their poor performance, over 60% of people remained confident in their deepfake detection skills regardless of whether their answers were correct, a pattern especially pronounced among young adults (18-34). This false sense of security is a significant concern.
- Trust takes a hit: Social media platforms are seen as breeding grounds for deepfakes, with Meta (49%) and TikTok (47%) viewed as the most prevalent places to encounter deepfakes online. This, in turn, has reduced trust in online information and media: 49% trust social media less after learning about deepfakes. Just one in five would report a suspected deepfake to social media platforms.
- Deepfakes are fueling widespread concern and distrust, especially among older adults: Three in four people (74%) worry about the societal impact of deepfakes, with “fake news” and misinformation being the top concern (68%). This fear is particularly pronounced among older generations, with up to 82% of those aged 55+ expressing anxieties about the spread of false information.
- Better awareness and reporting mechanisms are needed: Less than a third of people (29%) take no action when they encounter a suspected deepfake, most likely because 48% say they don’t know how to report deepfakes, while a quarter don’t care whether they see one.
- Most consumers fail to actively verify the authenticity of information online, increasing their vulnerability to deepfakes: Despite the rising threat of misinformation, just one in four searches for alternative information sources when they suspect a deepfake. Only 11% of people critically analyze the source and context of information to determine whether it’s a deepfake, meaning the vast majority are highly susceptible to deception and the spread of false narratives.
...read more at iproov.com
One false positive, one false negative. But the timer makes it hard. I should have taken more time.
good job!
how can you even see the full motion in an average of 7 seconds per vid?
Some were just pictures, and for others, I didn’t need to watch till the end.
I guess this just means I'm a slow boomer lmao
easy
First try?
yep
Congratulations! You are part of the 0.1%.