Who's in the corner of T&S?
I'm Alice Hunsberger. Trust & Safety Insider is my weekly rundown on the topics, industry trends and workplace strategies that trust and safety professionals need to know about to do their job. This week, I'm asking: who is in Trust & Safety's corner?
Get in touch if you'd like your questions answered or just want to share your feedback.
Also, while I have you – if you're planning on going to TrustCon (or any other T&S conference), might I interest you in my super cool merch? You can get shirts, stickers, or buttons, sold at cost.
Here we go! — Alice
We're exhausted.
There’s a narrative emerging about Trust & Safety that, when you look closely enough, is held almost universally: in the business community, among politicians, within the media, by regulators, and perpetuated by some parts of civil society.
The narrative is this: Trust & Safety is broken. It’s not working. We’re removing too much content, they say, or not enough content, or the wrong type of content. It needs to be fixed, says everyone, but there’s little agreement as to how.
Meanwhile, those of us who live and breathe Trust & Safety every day are exhausted. We’ve been trying our hardest for years to get people on the internet to just be nice to each other, goddamnit. We work in Trust & Safety because we believe that it’s important work. We know it has to be done. But our desire to do the work endlessly, thanklessly, while under attack is wearing thin.
Trust & Safety and content moderation work has always been underappreciated. Within companies, T&S leaders struggle to get more funding for their departments because there’s not always a clear link to ROI. Internal tools for moderators are often buggy and out of date. We deal with the most awful content and often have wellness support only as an afterthought. Users complain about being banned, sometimes even going so far as to make death threats (it’s happened to me multiple times, and to almost every other T&S leader I know).
To cap it all, when you go to a party and tell people what you do for a living, you receive a sea of blank stares. Trust and Safety? Is that like censoring people online? For a long while, it seemed like the only upside was that T&S professionals were mostly ignored by the public and so we were left to do our own thing.
Now, we’re no longer able to hide in the background. All of a sudden more of the general public know what content moderation is, but they’re not happy about it. We’ve seen the entire Trust & Safety profession dubbed “the enemy” and just this week, as a result of political pressures and baseless lawsuits, we learned that the Stanford Internet Observatory is shutting down.
And this stuff trickles down to individuals: Yoel Roth, former head of Trust & Safety at Twitter, was forced to move because the threats against him were so significant. Other practitioners have had similar experiences. It’s exhausting. What I want to know is — who is in our corner?
It’s not our lawmakers, that's for sure. In the US, some states are bringing content moderation cases to the Supreme Court, claiming that platforms are censoring too much, or in the wrong way, while arguments about Section 230 in Congress are often completely wrong. Very few politicians anywhere have a decent grasp of how the internet works.
It’s not the media. When most outlets cover T&S, it’s often around how teams have made mistakes, or aren’t doing enough. Journalism is designed to pressure companies to do better, but the result is that most articles simply make T&S teams look incompetent and push them to retreat from doing important internet safety work. Nuanced views don’t get headlines and, while there are certainly a few journalists who are covering T&S responsibly, they’re largely working independently and outside of mainstream media.
It doesn’t feel like civil society is in our corner, even those supposedly advocating for a safer internet. These organisations put pressure on T&S teams from very specific perspectives. We have privacy advocates, free speech absolutists, and upset parents all asking for completely different solutions which are sometimes antithetical to each other. Similar to journalists, it seems like they are trying to kick up as much fuss as possible to attract headlines, and only succeeding in putting Trust & Safety in the crosshairs.
It’s also often not the companies we work for. Although some are great and do prioritise Trust & Safety, many others do not. Over the last few years in particular, we’ve seen huge layoffs in our industry and a shift to compliance-based T&S (the minimum required) rather than strategic investment made because it’s the right thing to do.
Regulators are some of the strongest advocates for Trust & Safety professionals, especially Australia’s eSafety Commissioner and Ofcom, who regularly reach out to talk to T&S teams and want to have a collaborative relationship. However, T&S leaders find it hard to collaborate openly with regulators when their personal views differ from those of the leaders and companies they work for. NDAs and time pressure mean it’s frustrating on both sides.
We’re all exhausted, but I, for one, am not giving up. Conferences like TrustCon are a great place to come together as a community (I’ll be leading a workshop on the systemic causes of burnout in our industry if you care to join me). We must remember that we have the power to talk publicly about our experiences and show people how difficult the work is.
Too much is at stake for us to stop now.
Also worth reading
It's my job to yell at people (Trust in Tech/ Integrity Institute)
Why? I interviewed Jenni Olson at GLAAD about their Social Media Safety Report. Jenni told me her job is to yell at people to do better, but she said it in a loving way: she works well with people at platforms and is an example of an advocate who also understands the difficulties of Trust & Safety work.
Preventing “Torrents of Hate” or Stifling Free Expression Online? (The Future of Free Speech)
Why? A review of content removal in France, Germany, and Sweden finds:
contrary to prevalent narratives, over-removal of legal content may be a bigger problem than under-removal of illegal content.
Being a Young Man Online (Australian eSafety Commissioner)
Why? I love to see a study focused on men's experiences online. In order to prevent online harms, we need to get to the root of many of the issues, which must include how and why men can be so toxic online.
Some participants described online communities, such as male-dominated gaming spaces, as arenas where harm and abuse are common and are even accepted as normal. While young men in the study reflected on the harms that others can cause in online communities, some also spoke of engaging in these harms themselves.