AI models give 'false impression of safety', LGBTQIA+ policy ruling and Substack update

The week in content moderation - edition #231

Hello and welcome to Everything in Moderation, your guide to the policies, products, platforms and people shaping the future of online speech and the internet. It's written by me, Ben Whitelaw, and supported by members like you.

With last week's announcement at Discord and the news of impending job losses at Google, there's a Groundhog Day feeling about today's newsletter. But, as you read on, I hope you're also reminded about the "wonderful, committed people" working in the online safety space.

New readers from Northwestern University, ByteDance, Streamshield AI, NEF Europe, TikTok, Meta and a bunch of folks choosing to have EiM pumped straight into their favourite RSS reader; thanks for subscribing. If you enjoy today's newsletter, please forward it to a colleague or peer and/or share your favourite article on LinkedIn, just like Inbal did, to help grow EiM's readership.

Here's everything in moderation from the last seven days — BW


Today's edition is in partnership with TrustLab, which helps companies of all sizes handle their compliance with the Digital Services Act.

TrustLab has created a useful DSA Compliance checklist to equip Trust & Safety and Compliance teams with confidence and clarity ahead of the DSA deadline on February 17th. The easy-to-navigate, customisable spreadsheet lets you add deadlines, team members and any other DSA-related tasks, and is the easiest way to make sure your company is compliant!

You can also check out their other DSA resources, including a budgeting tool, playbooks, and more, so you can be fully equipped for the Digital Services Act. 


Policies

New and emerging internet policy and online speech regulation

The Oversight Board has ruled that Meta must better enforce its LGBTQIA+ policies after overturning the company's decision to leave up a post which advocated for trans people to commit suicide. The independent-but-Meta-funded council of experts found that, despite 11 people reporting the post by a Polish user, it was allowed to stay up because all but two of those reports were automatically closed by Meta's moderation systems. One Board Member told Rolling Stone that this showed:

"Meta has the right ideals, it’s just living under those ideals, rather than living up to them"

Advocacy organisation GLAAD welcomed the decision and urged Meta to "urgently create and share an action plan for addressing the epidemic of anti-trans hate that runs rampant across Facebook, Instagram, and Threads".

Get access to the rest of this edition of EiM and 200+ others by becoming a paying member