Inside the Oversight Board, Roth on app stores and third-party moderation under threat?
Hello and welcome to Everything in Moderation, your global content moderation and online safety round-up. It's written by me, Ben Whitelaw, and supported by good people like you.
Today's newsletter touches on all the meaty, difficult, hard-to-resolve topics that make content moderation such a fascinating issue to follow — transparency, accountability, labour rights, and the rule of law, to name a few. I've linked to relevant past editions of EiM to help give further context to this week's developments.
To new subscribers from Cornell, DCMS, SightEngine, The Amplifier Group and elsewhere, thanks for joining the club. If you enjoy today's edition, consider forwarding it to someone who might also find it useful or share it on the least badly moderated platform you know.
Here's everything you need to know in moderation this week — BW
Policies
New and emerging internet policy and online speech regulation
The Oversight Board, the independent-but-Meta-funded council created two years ago, aims to take a "more visible, consequential" role in the way Facebook and Instagram moderate speech, according to a long and detailed profile in Wired.
There's a lot in there that we already know — not least that Meta has implemented less than a quarter of the Board's recommendations — but also a number of details that I had missed: platform algorithm expert Renée DiResta being denied a spot on the board because "they were going in a different direction"; the difficulty of getting access to CrowdTangle to conduct investigations; and the furore over Facebook's u-turn on seeking a Ukraine advisory (EiM #153). Steven Levy, who writes the story, even sits in on the investigation into the Xcheck debacle (EiM #129), so chances are you'll learn something new. My read of the week.