May 2024 | This Month in Generative AI: Embracing Technology and the Continued Rise of Fake Imagery

By Hany Farid, UC Berkeley Professor, CAI Advisor

Hardly a day goes by that I don't receive several emails or calls asking me to review a piece of content to determine if it is real or fake. Time permitting, I'm usually happy to oblige because I think it is important to help reporters, fact checkers, and the general public verify what is real and expose what is fake.

In this blog series as well as its earlier incarnation, I've spoken about a range of different analysis techniques that my team has developed and regularly uses to validate an audio file, image, or video. This type of reactive technology that works after a piece of content surfaces, alongside proactive tools like Content Credentials that work as a piece of content is being created or edited, helps us navigate a fast and complex online world where it is increasingly difficult to tell what is real and what is fake.

With OpenAI and other companies committing to labeling AI-generated content, consumers can soon look forward to Content Credentials showing up on LinkedIn, TikTok, and their other social media news feeds.

Over the past few weeks I have received emails from fact-checking organizations asking if I would analyze: 

  1. An image of a shark on a flooded Los Angeles highway that Senator Ted Cruz shared on X.
  2. An audio recording claiming to be Michelle Obama announcing she is running for President in the 2024 election.
  3. Dozens of videos of Indian politicians up and down the ticket saying all manner of offensive or highly improbable things, as nearly a billion people head to the polls in India.

At first I thought that these otherwise reliable fact checkers had lost their minds. Why would they possibly be asking me about these obviously absurd pieces of content? After a quick online search, however, I found that each item was gaining traction.

We have entered a world in which even the most absurd and unlikely claims are believed by a not insignificant number of people. A decade ago, around 4% of Americans believed in unfounded theories including the idea that intergalactic lizard species control society, and 7% believed the moon landing was faked. Starting in 2020, however, belief in unfounded theories saw a startling rise.

This is a stunning and worrisome increase in the belief in unfounded conspiracies, coming at the same time that the internet is supposed to be giving us unprecedented access to information and knowledge. But of course, while the internet has democratized the distribution of and access to information, it has made no distinction between good and bad information — and bad information seems to spread faster and further than good.

I am not saying that we should blindly believe everything we are told by the media, the government, or scientific experts. We should ask hard questions, consume information from a broad set of voices, be vigilant when confronted with day-to-day news, and be skeptical when confronted with particularly incredible claims. Whether you agree with them or not, reporters and fact checkers are, for the most part, serious people doing a serious job, and they are trying to get the story right. 

So by all means, let's use technology to help us distinguish the real from the fake, but let's not let technology be a substitute for common sense. This past month has seen a continued rise in the quality and sophistication of generative AI images, audio, and video, so buckle up because we are going to have to deploy all of our common sense alongside the types of technologies that I will continue to discuss in the coming months.

Finally, I don't know why, but a constant over my past 25 years in the space of media forensics is that images with sharks are almost always fake, so much so that I'm not even sure any more if sharks are real. 
