The delicate balance of safety and speech, Nazis (again) and the realities of NCII abuse
Hello and welcome to Everything in Moderation's Week in Review, your in-depth guide to the policies, products, platforms and people shaping the future of online speech and the internet. It's written by me, Ben Whitelaw, and supported by members like you.
Although this week saw a number of longstanding issues rear their heads once again, several other stories — notably personnel changes at the UK regulator and fresh testimony on the human impact of NCII abuse — remind us that the complexity of internet harms is constantly shifting and can never be taken for granted.
If you're enjoying the newsletter, a courteous reminder to consider becoming an individual or organisational member to support EiM and its curation every week. Or, as Kamala might say: "You think this newsletter just fell out of a newsletter tree?"
With my US political references in the bag, here's your EiM this week — BW
Today's edition is in partnership with TrustLab, the Smarter Content Moderation Solution for T&S Teams
From the team that led T&S at Google, YouTube and Reddit comes the smarter way for Trust & Safety teams to moderate content and keep users safe.
TrustLab's ModerateAI combines AI efficiency with human expertise, simplifying complex workflows, improving quality and reducing online safety costs.
Learn more about ModerateAI and secure your spot on the beta waitlist...
Policies
New and emerging internet policy and online speech regulation
As mentioned in last week’s EiM (#256), the US Senate this week passed the Kids Online Safety Act (KOSA) and Children and Teens’ Online Privacy Protection Act (COPPA) in what was called a “historic and emotional milestone” for parent and child safety lobby groups.
The bills seek to afford greater protection to children on social media but look unlikely to pass in the House of Representatives, with digital rights group Fight for the Future suggesting the legislation is already dead. That'll please the 300-odd kids who, as The Verge reported, think that internet policy is being decided by a bunch of 65-year-olds.
Worth remembering: the major social media platforms have very different views on KOSA. Snap and X execs both indicated support at the Congressional hearing in January (EiM #233), while Meta and TikTok are in favour of parts of it but not others.
Another story I touched on last week that has since moved on: Malaysia has moved quickly to introduce a regulatory framework for social media and internet messaging companies. The new framework will mean sites with more than eight million users — which conveniently captures all of the major platforms — must register for an annual Application Service Provider Class Licence. It will apply from 1st January 2025, meaning a short turnaround to comply, with fines of RM500,000 ($112,000) or five years' imprisonment for failure to produce a licence.
Mixed reaction: Article 19 and the Centre for Independent Journalism have said they are “deeply concerned” and encouraged greater consultation between the Malaysian government and stakeholders.
Finally for this section, a notable personnel move: Ofcom’s group director for online safety has stepped down. It’s not clear whether Gill Whitehead, who was appointed last year, has moved on voluntarily or fallen foul of pressure to be harder on online platforms. She will, however, continue as Chair of the Global Online Safety Regulators Network.