The risk of 'rule convergence', Hillary on Section 230 and online speech as skiing
Hello and welcome to Everything in Moderation's Week in Review, your need-to-know news and analysis about platform policy, content moderation and internet regulation. It's written by me, Ben Whitelaw and supported by members like you.
There's a distinct whiff of irony in today's newsletter, where we have separate stories about users having "no further route to appeal" platform moderation decisions and, you guessed it, a new appeals body for users to appeal platform moderation decisions. Funny how the news agenda throws these things up.
I'm back in the chair with Mike for this week's Ctrl-Alt-Speech, where I plan to share a wild story of how EiM got banned from a major platform for (I think) writing about Pornhub. Subscribe to the podcast (Apple, Spotify), tune in and leave a review.
Here's everything in moderation from the last seven days — BW
Today’s edition is in partnership with the Tech Coalition, sharing outcomes from Initiate, their annual hackathon
Last week, experts and engineers from across the tech industry collaborated at Initiate 2024, resulting in new approaches to combat online child sexual exploitation and abuse. Explore some of the high-level outcomes and their potential for real-world impact.
Policies
New and emerging internet policy and online speech regulation
Users looking to appeal platform moderation decisions under an innovative clause of the Digital Services Act now have a fourth entity at their disposal — the Appeals Centre Europe (ACE). The new out-of-court dispute settlement (ODS) body came about following a one-off $15m grant from the Oversight Board Trust, the entity that acts as a buffer between Meta and the Oversight Board, and joins others from Hungary, Malta and Germany that have launched in recent months. ACE was formally certified this week by Coimisiún na Meán, the Irish online safety regulator.
What’s interesting is that ACE will hear cases about Facebook but also TikTok and YouTube, a fact that has attracted some criticism from experts who are concerned about platforms “converging on a similar set of rules” (it won't hear cases on Threads or Instagram — see below for why). Cases could be heard by the end of the year, according to the announcement.
How significant is this?: Article 21 and ODS bodies feel to me like a new frontier in online justice, not dissimilar to when report buttons were added to platforms in the late 2000s and 2010s. If you’ve been keeping up with EiM, you would know that Alice feels the same way about its massive potential to change the way user appeals work. She even half-predicted this latest news when she wrote in August:
ODS bodies could, in theory, track and report on trends in the disputes they’ve been part of, which theoretically would put additional pressure on platforms to do better, much like Meta’s Oversight Board does. This could be very valuable to fill in policy gaps for platforms and ensure enforcement is fair.