'Rinse and repeat' policies, YouTube's Indian election issue and regulatory thresholds
Hello and welcome to Everything in Moderation's Week in Review, your in-depth guide to the policies, products, platforms and people shaping the future of online speech and the internet. It's written by me, Ben Whitelaw and supported by members like you.
Today's edition has two chunky pieces of research by non-profits, underlining the continued importance of civil society organisations and digital rights groups in the online speech space. Sometimes, wrongly, I take their work for granted, but it's a vital contribution and worth calling out. Look out for both studies as you read on.
A warm welcome to new faces from Google, TSPA, Thorn, TCEGSM, Meta, GWWP and a gaggle of others. If you enjoy the newsletter or find it useful, please do recommend it to colleagues and friends, share via your favourite professional social network and use your personal development budget to become a member.
Here's everything in moderation from the last seven days — BW
Want your company or product to reach thousands of platform workers, T&S practitioners and online speech experts? Become an EiM sponsor!
Every week, EiM is read by some of the most important people and influential thinkers in internet policy, product, research and regulation. Alongside its sister podcast Ctrl-Alt-Speech, produced in conjunction with Techdirt, it reaches thousands of decision makers in the biggest tech companies and most prestigious institutions around the world.
To find out more about how you can reach these people with your message, fill in a few details and I'll personally follow up with you to explain more — BW
Policies
New and emerging internet policy and online speech regulation
A new report has found that platform efforts to stem mis- and disinformation over the last decade are “formulaic and stuck in an ineffective rinse-and-repeat loop” that shows a “disregard for democracy in Global Majority countries”. Mozilla researcher Odanga Madung examined 200+ interventions by Meta, Google, and others across seven years and 27 countries. He found that almost two-thirds of the 197 geography-specific interventions were directed towards the US and Europe and said it was a “glaring travesty” that companies resorted to “excessive policy coddling and protections” in just two markets. My read of the week.
You know them as Very Large Online Platforms (VLOPs) in Europe but, in the UK, services will be known as category 1, 2A and 2B, according to documentation shared this week by Ofcom, the regulator of the Online Safety Act. The biggest platforms will have additional duties to contend with, including the difficult-to-enforce protections for news publisher and journalistic content and the provision of user empowerment features such as filtering “non-verified users” (whatever that means). Ofcom is seeking evidence until 20th May.