The news: Integral Ad Science (IAS) said its AI-powered tool that helps advertisers avoid appearing alongside deepfake content is entering beta testing.
- “Marketers are looking to leverage new technologies to detect and mitigate the harmful effects of deepfakes and misinformation,” IAS CEO Lisa Utzschneider said in the Tuesday announcement, adding the company was excited “to offer this latest innovation ahead of the US elections.”
The deepfake problem: Concerns about AI-generated misinformation and deepfaked images in advertising have risen.
- A recent BBC investigation found that TikTok’s algorithm was recommending deepfaked videos of political candidates making controversial statements, raising alarms about the potential for AI-generated misinformation to spread on popular platforms during an election year.
- Beyond political misinformation, AI-generated deepfakes have become a problem for teens and schools. An April New York Times report found that high schools were struggling to rein in the use of generative AI, which students were employing to create and circulate faked nude images of teen girls.
The opportunity: IAS has called its deepfake tool an industry first (though OpenAI has launched similar tools) and aims to bring a solution to market before the problem spreads.
- Releasing a deepfake measurement tool in 2024 represents a lucrative opportunity given the spike in US political ad spending, which will total approximately $12 billion this year.
- Though concerns about deepfakes have grown alongside the number of elections taking place worldwide in 2024, the problem is likely to worsen in coming years as generative AI becomes more widely available. With this tool, IAS seeks to capitalize on brand safety anxieties and establish itself as an early market leader.
Our take: Deepfakes are likely to pose growing challenges for digital advertisers, who should act now to curb their negative impact, particularly those looking to leverage election spending or place ads on popular platforms like TikTok.