7 min read

FTC lays out antitrust case, TikTok 'adds context to content' and Haidt analyses Snap Inc

The week in content moderation - edition #290

Hello and welcome to Everything in Moderation's Week in Review, your need-to-know news and analysis about platform policy, content moderation and internet regulation. It's written by me, Ben and supported by members like you.

Apologies that today's newsletter is out later than planned. I had a bank holiday-related scheduling snafu.

A quick welcome to incoming EiM subscribers from the last few weeks: Stanford University, Checkstep, eSafety Commission, Luminate, US State Government, Besedo, Vinted, ParshipMeet and others. Say hi if we haven't met IRL. And, if this edition of EiM was forwarded to you, sign up to join thousands who receive Week in Review (Friday) and T&S Insider (Monday) in their inboxes every week.

Although not squarely about Trust & Safety, Meta's big antitrust case feels more and more like a story that will shape how platforms govern speech for the next decade, even if that means that nothing happens at the end of the two-month trial. Hit reply and tell me what your take has been so far.

This is your Week in Review for the last seven days. Enjoy your weekend — BW


SPONSORED BY All Things in Moderation, the GLOBAL GATHERING FOR ONLINE COMMUNITY BUILDERS

It's finally here! Tickets are officially on sale for All Things in Moderation (ATIM) 2025 - the leading global gathering for anyone building safer, smarter, and more inclusive digital communities.

This year's two-day agenda (15-16th May) features:
- The latest in best-practice moderation and online governance
- New regulations reshaping digital platforms
- Building safer online spaces for young people
- Reclaiming community from the grip of social media monopolies
... and much more!

Whether you work in online community, social media, policy, product, or safety—or you simply care about the future of the internet—ATIM is a must-attend event.

REGISTER NOW

Policies

New and emerging internet policy and online speech regulation

The Federal Trade Commission this week brought its antitrust case against Meta, with its lawyers arguing that the purchase of both WhatsApp and Instagram created a “monopoly power”. CEO Mark Zuckerberg took the stand for 13 hours over three days in a kind of Silicon Valley fever dream of his best but mostly worst ideas.

I’m not a competition expert by any means but so many of the difficult issues related to content moderation feel directly linked to the size and scale of the platforms and the moats each of them build around themselves. So I’ve been reading the following as closely as possible:

  • Wired gives a broad overview, including the FTC’s key argument that Meta’s acquisitions allowed it to offer users weaker data privacy and advertisers buggier, more expensive services. 
  • In a blogpost littered with faux patriotism, Meta’s Chief Legal Officer says the FTC’s case “ignores how the market actually works and chases a theory that doesn’t hold up in the real world”. 
  • The Verge also reported on a sub-drama about poorly redacted slides, if you can call a story about PDF formatting a sub-drama.

Can the Digital Services Act (DSA) rely on platforms acting in "good faith", given the politicisation of major platforms over the last 6-12 months? That’s the question posed by doctoral candidate Nikolaus von Bernuth in a piece for Verfassungsblog. Because the regulation follows the principle of “enforced self-regulation”, von Bernuth argues, it risks under-enforcement and regulatory capture, particularly if transparency and independent scrutiny are weak. With the recent hoo-ha (EiM #289), it’s a timely perspective.

Also in this section...

Why Can’t We De-Friend - Ctrl-Alt-Speech
In this week’s round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben cover: Inside Mark Zuckerberg’s Failed Negotiations to End Antitrust Case (Wall Street Journal), Mark Zuckerberg once suggested wipi…

Products

Features, functionality and technology shaping online speech

TikTok has launched a new feature called Footnotes, a Community Notes-like tool designed to allow users to add “more context to content”. According to the company, the feature will work similarly to X/Twitter’s implementation — helped by a bridge-based algorithm for canvassing feedback from a wider range of users — and will initially be tested in the US.
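For the curious, the core idea behind bridge-based ranking can be sketched in a few lines: a note only surfaces when raters who normally disagree both find it helpful. The cluster labels and scoring below are my own simplification for illustration — the real Community Notes-style systems infer viewpoint clusters via matrix factorisation rather than taking them as given:

```python
from collections import defaultdict

def bridged_helpfulness(ratings, rater_group):
    """Toy 'bridging' score for one footnote.

    ratings     -- list of (rater_id, is_helpful) tuples
    rater_group -- dict mapping rater_id -> viewpoint cluster label

    A note only scores well if raters from *different* clusters
    agree it is helpful. (Simplified: real systems learn clusters
    from rating history instead of using pre-assigned labels.)
    """
    helpful_by_group = defaultdict(list)
    for rater, is_helpful in ratings:
        helpful_by_group[rater_group[rater]].append(is_helpful)

    if len(helpful_by_group) < 2:
        return 0.0  # no cross-viewpoint signal yet

    # The least-convinced cluster's approval rate caps the score,
    # so one-sided applause can't push a note to the top.
    approval_rates = [sum(v) / len(v) for v in helpful_by_group.values()]
    return min(approval_rates)
```

The key design choice is that `min()` at the end: unanimous praise from one side of a divide counts for nothing unless the other side is at least lukewarm.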

‘Suite of measures’: Unlike Meta’s Community Notes (EiM #276), the announcement made clear that Footnotes will not coincide with a step away from working with accredited fact-checking organisations. But I believe it should be seen in the context of the labour disputes in Turkey (EiM #286) and the UK.

Alexios Mantzarlis on Meta’s ‘more speech, fewer mistakes’ announcement
Covering: Mark Zuckerberg’s accusations of fact checking bias, Community Notes and the power of users’ ‘directional sense’ and the decision to prioritise US vs global speech

Elsewhere, Fay Johnson — a name EiM readers might recognise (EiM #278) — has launched CLR:SKY, a new overlay for Bluesky designed to “encourage more respectful and productive interactions”.

The tool is based on the idea that providing real-time toxicity feedback can reduce harmful language, and its Toxicity Weather Report has cute icons that change as you type (‘Sunny’ when you’re being civil, ‘Cloudy’ when you’re not). I gave it a go and, while I’m probably not the target market, I found it a subtle indicator of the route a conversation might go down. Worth trying yourself.


Platforms

Social networks and the application of content guidelines

Researchers have warned that Roblox exposes children to “deeply disturbing” risks, including violent content, sexualised roleplay, and grooming behaviours — despite the platform’s safety claims. In a new report, researchers from Revealing Reality found that adult test accounts were able to find workarounds in order to interact with users under the age of 13, despite Roblox recently bringing in stricter rules around private messages (EiM #285).

Mitigating factor?: While a useful deep-dive, preventing some of the scenarios raised by the researchers is expecting a lot of any platform. For example, the report highlights that a user asked for a minor’s Snapchat details “using barely coded language”, i.e. ‘Snap’ spelt backwards. Does that mean “pans” should be added to the filter list?
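For what it’s worth, catching that particular evasion is trivial; the harder problem is that every trick you hard-code invites the next one. A toy sketch of a reversed-spelling check (hypothetical, not any platform’s real filter):

```python
def mentions_term(text: str, terms: set[str]) -> bool:
    """Check a message for filtered terms, including simple
    reversed-spelling evasions like 'pans' for 'Snap'.
    (Illustrates why static keyword lists struggle, nothing more.)
    """
    words = {w.strip(".,!?").lower() for w in text.split()}
    for term in terms:
        t = term.lower()
        # Match the term itself or its reversal ('snap' -> 'pans')
        if t in words or t[::-1] in words:
            return True
    return False
```

Of course, the moment "pans" is blocked, users move to "5nap", pig latin, or an emoji, which is why behavioural signals tend to matter more than word lists for grooming detection.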

Video streaming platform Kick — think Twitch but with looser moderation protocols — faced criticism in 2023 for its approach to safety. But, according to CEO Ed Craven, it has invested “tenfold, compared to what we used to, in moderation” and “Trust and Safety… has become one of our larger departments internally”. We don’t know specifics but ten times a very small number is still a small number.

Also in this section...

A reader asks: What should be on my ‘red line’ list?
Most T&S professionals—whether they admit it or not—have a line they won’t cross for their company. But when you’re in the middle of a major, public failure, it can be hard to know what to do. Here’s my take on what to consider before quitting.

People

Those impacting the future of online safety and moderation

If you haven’t already read Jonathan Haidt’s The Anxious Generation, you might have heard Mike and I discuss it on Ctrl-Alt-Speech.

In it, for anyone who isn’t familiar, he makes the case that the decline in teenage mental health can be put down to the proliferation of smartphones, social media and gaming. The book has many supporters, some notable, but many detractors too.

This week, he took to his Substack (I wonder if he knows about the Nazi platforming story from a while back...) to share some analysis of Snap Inc’s harms towards its younger users. Analysing legal briefs, he alleges that company employees were aware of harms on the platform but did nothing or were slow to react.

Despite that framing, he seems sympathetic to Snap's T&S team, which single-handedly epitomises the difficult tradeoffs that Mike and I talk about each week. And his review of the Netflix show Adolescence suggests he might be cottoning on to the idea that the internet is only one part of a large and complex puzzle.

Posts of note

Handpicked posts that caught my eye this week

  • "Patrick McAndrew and I have been polishing up our script for this 30-minute film, which also has some spinoff ideas. We are currently collecting ideas and interest as the project gets off the ground, as we look to film in NYC this summer." - All Tech is Human founder David Ryan Polgar is making a film. Want to join him?
  • "Chayn is looking for social media and/or comms professional with a global lens who has a passion for engagement with survivors of gender-based violence" - great role going at UK non-profit Chayn, which does a lot of work on online safety for women.
  • "As a child of the mid-90s internet, I am here for the return of robots.txt discourse." - I suspect many EiM readers will agree with Kevin Koehler.