New call for platform transparency, YouTube's spam problem and talking 'algospeak'
Hello and welcome to Everything in Moderation, the weekly newsletter that keeps you on top of online safety and content moderation news and what it means. It's written by me, Ben Whitelaw.
Welcome to new subscribers from Ranking Digital Rights, MediaHack, Hearken and a host of smart Danish folks that I met at the International Journalism Festival in Perugia. You can watch back my panel on the internet's essential workers with four excellent women here.
If you missed the recent Q&As with the head of Trust and Safety at Clubhouse and the former research lead for conversational AI at Jigsaw, made possible by EiM's founding members, both are worth catching up on. There'll be more in the coming weeks too.
Without further ado, here's this week's round-up — BW
Policies - emerging speech regulation and legislation
The Digital Services Act "could undermine platforms' ability to effectively and timely moderate content, keep users safe and promote trust online", according to a blog post from the Disruptive Competition Project, run by the Computer and Communications Industry Association. It also calls for the creation of online legal practices for the mediation of disputes, a topic I've covered here over the last few years (EiM #72) but one that doesn't seem to have moved on very much in that time.
Calls for greater transparency about the content moderation processes of big platforms are not new (EiM #71), but a new joint op-ed by three politicians in response to the Sama/Facebook revelations in Kenya (EiM #150) is a notable development. Writing for The Independent, Damian Collins (UK), Sean Casten (US) and Phumzile van Damme (South Africa) call for public audits of moderation teams, the disclosure of contractors and the protection of whistleblowers. That's not very far from what I called for back in 2019 (EiM #68).