Consumers, legislators, and courts seek greater protections for children on social media

The news: Social media platforms such as TikTok, Facebook, and Instagram are at an inflection point where courting the youngest users may no longer be tenable.

A clear majority of respondents supports a range of measures to make social media safer for children, per a recent Morning Consult/Politico poll conducted after President Joe Biden used his State of the Union address to ask Congress to pass legislation to do just that.

In his speech, Biden urged lawmakers to hold platforms accountable for “the national experiment” they’re conducting on children for profit.

Greater accountability to come? Other elected officials from both parties appear to agree with Biden's plan.

  • US senators Marsha Blackburn (R-TN) and Richard Blumenthal (D-CT) introduced legislation last month to protect children’s privacy online.
  • The proposed Kids Online Safety Act would establish new safeguards for some of the most pressing safety problems tied to kids’ use of social media, requiring apps to build in stricter protections for users under 16, such as tools to guard against stalking, exploitation, addiction, and other “dangerous material.”

How we got here: Events last fall thrust children’s mental health and safety into the spotlight.

  • Frances Haugen, a former Facebook employee, leaked internal research showing the company knew about Instagram’s detrimental mental health effects on some kids.
  • The fallout led Meta to pause Instagram for Kids, an initiative that had already drawn heavy criticism. Instagram had maintained that a kids’ version was needed because children were already using the main app; detractors saw it as a way for the company to reel in even younger users.
  • Haugen appeared before a Senate subcommittee in October and urged Congress to intervene, testifying that the company’s leadership knows how to make Facebook and Instagram safer but refuses to do so because it could hurt profitability.
  • TikTok, YouTube, and Snap were also called to testify before Congress regarding their initiatives to protect children.
  • TikTok released a report on teen safety in a bid to preempt further regulatory scrutiny; even so, a UK High Court judge has allowed a class-action case against the company over its handling of children’s private data to proceed.

Meta’s youth shakeup: Pavni Diwanji, the Meta vice president who headed youth-focused product initiatives (including Instagram for Kids), is leaving following an internal restructuring.

  • Diwanji, who joined in 2020 after overseeing YouTube Kids at Google, “decided to move on,” according to Instagram chief Adam Mosseri.
  • She oversaw the development of products such as age verification and parental monitoring tools, as well as experiences for children under 13 across Meta’s apps.
  • Mosseri said he would lead youth initiatives directly, noting that they remain a major focus across Meta’s properties, especially VR and Messenger.

Is the metaverse next? The metaverse isn’t inherently safer than social media, yet a number of brand campaigns have landed in immersive gaming experiences recently with little pushback or criticism so far.

  • Ralph Lauren, Nike, Hyundai, and Vans have all recently launched campaigns on Roblox to woo younger customers.
  • Expect more attention to be paid to this space as metaverse initiatives become more prominent. The Oasis Consortium, a group of game developers and online companies envisioning an ethical and safe internet, has already introduced guidelines to promote safety in the metaverse.

The big takeaway: Kids are a coveted demographic, but the prospect of new legislation and the potential for reputational damage should concern Meta, TikTok, and others in this space. Expect social media platforms’ efforts around their youngest users to stay in the spotlight for some time, and for scrutiny of the metaverse to eventually follow.