The problem of algorithmic distribution, new Kenya court case and top 100 women list
Hello and welcome to Everything in Moderation, your all-in-one recap of the important online speech and content moderation news this week. It's written by me, Ben Whitelaw.
This is the final edition of 2022 and boy has it been a wild ride. Over the last 12 months, I've written 44 newsletters, produced 10 expert Q&As, penned more than 65,000 words and garnered almost 900 new subscribers (including you? Thank you if so). We've seen the topic of platform governance play out almost daily in the news and content moderation, more than ever, has become a lightning rod for political and economic discord across the world. 2023, as I predicted this week, will see more of the same.
A festive welcome to new subscribers from Google, Amazon, ByteDance, Tremau and elsewhere. A special thank you to every single EiM member whose contributions ensure the newsletter reaches your inbox every week.
I'm working on some exciting changes and collaborations for 2023 and will be back in January with more information (use this special offer to get in early). So, for the last time this year, here's everything in moderation - BW
Policies
New and emerging internet policy and online speech regulation
The UK's Online Safety Bill "threatens to undermine" the volunteer-driven governance model used by numerous big platforms and could chill speech, according to two Wikimedia Foundation executives. Writing for CEPA, Rebecca MacKinnon and Phil Bradley-Schmieg argue that the bill should "recognise the difference between centralized content moderation carried out by employees, and community-governed content moderation systems", as the Digital Services Act does.
Meanwhile, Global Partners Digital has published a blog post following the Bill's latest scrutiny period, pointing out that its design could "accentuate the existing network effects and monopolies of the most dominant platforms." Expect more legislation next year aimed at mitigating that effect.