Navigating the AI content boom: Risks, investments, and the urgent need for standards

The news: AI-generated content could account for as much as 90% of online information by 2026, per a study by Europol.

Why it’s worth watching: The deluge of synthetic, AI-generated content will make disinformation harder to curb while also creating opportunities for human-generated content.

  • The sudden uptick in AI usage has led to a wave of competition as well as unprecedented investments.
  • PitchBook says private equity and VC investors bet $40 billion on AI startups in the first half of 2023.

Europol’s study was released last year, months before the surge in usage of generative AI tools like OpenAI’s ChatGPT and DALL-E, as well as Google Bard, Midjourney, and others.

  • NewsGuard identified more than 400 unreliable websites operating with little to no human oversight.
  • News outlet CNET published dozens of articles written by AI. The experiment ended in disaster when stories were found to have multiple glaring errors requiring corrections.
  • Baidu’s VidPress AI tool can generate TikTok-like 2-minute videos with synthesized voice-overs using data from multiple sources.

The problem: The surge in AI-generated content has outpaced the standardization of labels that differentiate information created with an AI assist from human-generated content.

  • The use of AI to replace content creators has triggered a monumental backlash, including the recent Hollywood writers’ strike, services like X and Reddit locking down API access, and The New York Times considering suing OpenAI over copyright infringement.
  • Training models on regurgitated and repurposed AI-generated content leads to “model collapse,” which quickly degrades the accuracy of future outputs.

The opportunity: The coming deluge of AI-generated content will put a premium on reliable, human-crafted content.

Our take: Stricter content guardrails and proper labeling of AI-generated content should come hand in hand with industry adoption of new technologies—a challenge that falls on AI companies, government regulators, and content providers. 

Dive deeper: For more on how generative AI is changing content, read our report on ChatGPT and Generative AI in the Creator Economy.