Avoiding 'unevenly enacted' rules, anger at Meta's India report and why we need whistleblowers
Hello and welcome to Everything in Moderation, your handcrafted guide to the week's content moderation and online safety news. It's written by me, Ben Whitelaw.
Not all editions have a clear thread running through them but today's does. As you'll see, each section underlines the crucial role that non-governmental and civil society organisations play in seeking to address the abundance of content moderation challenges that we, as a global population, face.
Their importance can be seen in the fierce reaction to a delayed human rights report, in the protection of whistleblowers and in the holding of platforms to account for their (lack of) protection of minority groups. It is there and it needs recognising.
It goes without saying that new subscribers from Pinterest, Amazon, Genpact, ByteDance, Image Analyzer, Spotify, Hinge, Feeld and Strava are very welcome here, as is everyone who signed up this week. One day, I'll find a way to get you all together but until then, enjoy the newsletter and, if you can, become a member to support its upkeep and the writing of exclusive articles like this one from Jen Weedon.
Here's what you need to know from the last seven days - BW
Policies
New and emerging internet policy and online speech regulation
The story I've been drawn to most this week has been Facebook/Meta's first annual human rights report or, rather, the reaction to it. Like many others, I wasn't very impressed.
A bit of background before we go further: in 2019, Facebook in India became plagued with false political information in the run-up to the general election. Investigation after investigation found that, as religious and caste violence accelerated, there was "little to no response from the company", despite the fact that India is Facebook's largest market. It led to calls for a human rights impact assessment (HRIA), which never arrived until this week's four-page "synthesis" buried within the 83-page pdf. Hardly comprehensive.