6 min read

Are T&S professionals part of the problem?

A new report argues that, without industry-wide standards or codes of practice, T&S professionals are vulnerable to corporate pressures and destined to always be reactive to their companies' conflicting priorities. The answer? Greater independence.

I'm Alice Hunsberger. Trust & Safety Insider is my weekly rundown on the topics, industry trends and workplace strategies that trust and safety professionals need to know about to do their job.

I'll be in London soon for the Trust & Safety Summit, which is very exciting! Spending time with others in the T&S industry is always very energising for me, and I'm looking forward to seeing some EiM subscribers there.

This week, I've written about another T&S industry report but the bigger question is: do you enjoy it when I draw attention to these reports or would you rather I skip them?

Get in touch to let me know or to arrange to meet up when I'm in the UK. Here we go! — Alice


Today’s edition is in partnership with Safer by Thorn, a purpose-built CSAM and CSE solution

Powered by trusted data and Thorn’s issue expertise, Safer helps trust and safety teams proactively detect CSAM and child sexual exploitation conversations.

Safeguard your platform and users with proprietary hashing and matching for verified CSAM and a classifier for finding possible novel CSAM. Plus, Safer’s new text classifier provides much needed signals to help trust and safety teams find conversations that violate your child safety policies, such as those containing sextortion or requests for self-generated content from a minor.


Are T&S professionals part of the problem?

Why this matters: Trust & Safety professionals don't have industry-wide standards or codes of practice. A new report from the 5Rights Foundation argues that this makes them vulnerable to corporate pressures and that, without certification and oversight, T&S remains reactive, not protective. The answer? Professionalisation — before platforms drift further from accountability.

When I used to manage a customer experience department, my team would compile Voice of the Customer reports every month. Their purpose was to tell the rest of the company about issues that our customers were frequently complaining about.

We put a lot of time and love into these reports and tried various formats, but disappointingly they were read by only a small group of my colleagues. I think this was because people already had their priorities, so they weren’t particularly interested in being told about a bunch of new problems to add to their already long list of to-dos. 

I share this anecdote because I feel like we are in a similar situation with some of the excellent reports on Trust & Safety that have been published lately. There's some really helpful information out there, but much of it either a) answers questions that few people are asking or b) shares truths that people might not want to hear.

I'll explain what I mean using a report that came out just last week.

A homogenous entity?

Advancing Trust & Safety: systems and standards for online safety professionals is the latest in a line of T&S reports that have come out these past few weeks. It was produced by 5Rights Foundation and written by Alexandra Evans, a former TikTok policy employee and current advisor for 5Rights and UK regulator Ofcom.

The report is targeted at politicians and regulators but contains enough interesting ideas to be worth reading for industry professionals too. One such idea is to make T&S professionals as independent as possible in order to allow them to "make the case for [user] safety rather than being empowered to ensure tech companies set and follow minimum standards".

The way to do that, the report explains, is to have codes of practice, standards and oversight mechanisms in place that are adopted by the whole profession (it's a topic I've written about before):

Whilst it may seem counterintuitive, introducing professional standards and oversight is in the best interests of those who are subject to them because it enables them to push back if they are asked – either implicitly or explicitly – to operate outside prescribed professional boundaries. There is therefore a strong case to be made for T&S professionals to work together on developing codes of practice.

Because Evans has worked at a major platform, she gets the difficulty that T&S workers face when they push back against company priorities in service of safety. The following sentence made me feel seen and understood (as a former T&S executive at a platform) in a way that few things have:

The challenge that T&S professionals describe when their interests and priorities differ from those of their employers may be exacerbated by the external view of tech companies as homogenous entities. As a result, T&S professionals may find themselves under fire from both directions: they encounter internal resistance when making the case for enhanced safety standards, whilst also facing legitimate criticism from safety advocates when tech companies fail in their responsibility to keep users safe.

Part of the problem?

While it acknowledges the tough situations that T&S workers often find themselves in, the report certainly doesn’t pull any punches in squarely calling for us to do better too. The introduction, from Baroness Beeban Kidron, 5Rights founder & chair, starts with a biting statement:

“Already, Trust & Safety is a global profession of well over a hundred thousand people, but unless and until it operates according to understood and enforced standards, looks after its own, and fulfils its purpose of keeping those that engage with tech products and services safe, it remains part of the problem not the solution.”

She’s not wrong. While there are pockets of people getting stuff done, there's clearly a lot more that we could be doing as a profession. But we’re also generally overworked, burned out, and underappreciated, which doesn’t exactly put us in the physical or mental state to organise, take on more or transform the profession that we find ourselves in.

Product of the system

The report also reminded me of the stark realities of how T&S functions within major platforms:

The systems and processes in which T&S professionals operate are optimised for profit rather than safety. This undermines T&S’s central duty to make digital products and services safe for citizens, society and users. 

Given these structural constraints, it's useful to think of the T&S profession as a system, just like we analyse the systems of technology itself. A system-wide approach could offer cover to professionals, particularly if industry-wide standards and codes of practice were in place. If everyone adhered to the same professional principles, it would make it difficult for CEOs to switch out one T&S person for another who is more compliant or willing to bend ethical standards.

However, the challenge is that no such professional framework currently exists. There is no certification system, no licensing body and frankly few precedents of professional accreditation in tech more broadly. As the report points out, establishing one could be a critical first step towards the professionalisation of the industry as a whole.

The need is more urgent than ever, particularly as companies drift further away from T&S best practices and, more worryingly, human rights standards. At this inflexion point for online safety, practitioners must stand up for human rights, dignity, and safety for everyone. And that can start with holding each other to higher standards, and pointing out when work being done in the name of T&S isn’t meeting the professional codes we have committed to.

You ask, I answer

Send me your questions — or things you need help to think through — and I'll answer them in an upcoming edition of T&S Insider, only with Everything in Moderation*

Get in touch

Also worth reading

I was a guest on the Computer Says Maybe podcast talking about how platforms moderate LGBTQ+ users. I'm the least important guest on the show, though. I recommend you listen to it for the first-person accounts of being harassed for being trans, and the great insights from the other guests. I also really liked this companion piece by show producer Georgia Iacovou.

He used to make videos about real estate. Then he became radicalized by the right. (Slate)
Why? A story of one content creator's slide into conspiracy theories, and how his followers increased as it happened.

Schools use AI to monitor kids, hoping to prevent violence. Our investigation found security risks (AP)
Why? Student surveillance without proper disclosure from the schools seems like a major privacy issue. "An LGBTQ+ advocate reported to the Board of Education that a Gaggle alert about self-harm had led to a student being outed to their family, who were not supportive."

Utah Legislature Passes Completely Unworkable App Store ‘Age Verification’ Bill (Techdirt)
Why? "The law, as written, is a vague, unworkable mess that requires minors to verify their age with documents they can’t possibly possess."
Related: Meta is trying to ‘offload’ kids safety onto app stores with new bills, Google says (The Verge)

Reddit Is Restricting Luigi Mangione Discourse—but It’s Even Weirder Than That (Slate)
Why? I wouldn't call this weird – it's clearly complicated to discourage glorification of violence yet still support freedom of expression. It looks like Reddit's T&S team is trying a lot of things to keep their communities safe.
Related: Reddit's rule check feature will help users avoid breaking subreddit rules (Engadget)