
📌 ‘Technical glitch’ is no longer an excuse

The week in content moderation - edition #67

It’s been the kind of week that has made me think a newsletter on content moderation isn't what the world needs right now. I’ve found it hard to motivate myself, which is why today’s edition is dropping into your inbox later than usual.

At the same time, I recognise that the issues of free speech, online abuse and platform regulation are at the heart of the George Floyd protests and the political response to them, both in the US and elsewhere. It doesn't make any sense to stop highlighting the inconsistencies in online platforms' guidelines and policies now; if anything, this is when that scrutiny is needed most. So here it is, your weekly content moderation roundup.

Stay safe and thanks for reading — BW

PS I’m looking to interview EiM subscribers over the coming weeks — read on for more details...


🔎 Platforms should investigate, not just apologise

Enough was already happening in the US this week before a tweet went viral accusing TikTok of censoring the #blacklivesmatter hashtag.

This is the very same video platform, don't forget, that has been heavily criticised for allowing racist users and content to propagate (#EiM 61) and for lacking transparency about its moderation processes. Now millions of users were seeing 0 views for hashtags relating to the George Floyd protests and, perhaps understandably, presuming the worst.

The reality was different. All hashtags, as TikTok pointed out, were showing 0 views as a result of a 'technical glitch' that affected only the Compose screen of the app. In a blog post published on Monday, it reiterated that diagnosis and acknowledged how the issue may have looked to supporters of the movement.

TikTok, however, wasn’t the only platform to pass off a fuck-up as a glitch this week.

Facebook also resorted to blaming a 'technical error' for deactivating the accounts of 60 high-profile Tunisian journalists and activists without warning. Facebook is huge in the North African country and was a vital communication tool during the 2011 revolution. Haythem El Mekki, a political commentator whose account was deactivated, told The Guardian: "It would be flattering to believe that we had been targeted, but I think it's just as likely that an algorithm got out of control."
