
What the T&S community predicts for 2025

Over the last 12 months, T&S leaders and practitioners have seen unprecedented change in an industry that many have worked in for years and many others are brand new to. I asked a few of them what their predictions were for 2025.

I'm Alice Hunsberger. Trust & Safety Insider is my weekly rundown on the topics, industry trends and workplace strategies that trust and safety professionals need to know about to do their job.

This week, I've collected predictions from the T&S community about what's coming in 2025. While it's hard to say for sure, the consensus from those I spoke to is that it's a natural continuation of what we've seen this year. Despite that, I predict it will feel pretty hectic.

Speaking of feeling hectic: while I enjoy the chaos of writing about whatever I feel like every week, it's probably better for you, dear reader, if I write about things you care about. Ben and I have teamed up with Katie Harbath (from Anchor Change) to launch a year-end survey for T&S newsletter readers, and we'd really appreciate it if you took five minutes to tell us what you'd like us to cover.

I'll be taking the next two weeks off for the holidays, but I'll see you in January. As always, get in touch if you'd like your questions answered or just want to share your feedback. Here we go! — Alice


The more things change, the more they stay the same

Why this matters: T&S leaders and practitioners are seeing unprecedented change in an industry that many have worked in for years and that others are brand new to. As platforms juggle regulation, innovation, and user safety, being able to anticipate and adapt to emerging trends in the next 12 months will be critical for staying ahead of operational, policy, and technological shifts.

Predictions are a bit like leftovers. You don’t think you can handle any more, but when a good one comes along, you realise that you could squeeze it in after all, even if it is a bit painful.

So, with that in mind, I've spoken to a handful of T&S experts about what they are predicting to come down the track in 2025. Several of these chats were under Chatham House Rule, so I can’t attribute them. But, in aggregate, I believe they are a helpful list for EiM readers to mull over between now and January.

As you'll see below, much of what they shared feels like a natural continuation of what we’ve seen this year — trends picking up steam, challenges growing sharper, and the stakes getting higher. Here's what they said:

Social media users will start to feel the impact of safety regulation

This is an obvious one, but significant. We saw some impact from online safety regulations around the world in 2024, but mostly around transparency reporting and risk assessment – not anything that would significantly affect users.

One T&S leader working on regulatory compliance for a Very Large Online Platform (VLOP) told me that, as more transparency reports, risk assessments, and other documentation emerges, we’ll start to see some consensus among the VLOPs around what to disclose and how. Smaller platforms will follow, and regulatory compliance will start to become more routine. 

That said, there will be some shifting elements next year. Users will start feeling the impact of safety regulation through more aggressive content moderation, extensive age assurance/verification, and even people being denied access to platforms (e.g. TikTok may be banned in the US; social media will be banned for youth in Australia).

Content moderation will be a focus

T&S folks know that there’s a lot more to what we do than “content moderation”, but we’re already seeing a lot of talk about content moderation and censorship from the incoming US government and we’ll continue to see that. Heather Rasley, Director of T&S at Glif, says, 

“Content moderation teams are going to have to be on the defensive against multiple government-level efforts to dismantle them. As a result, they will probably have to dedicate a lot more time to documenting their work, interacting with governmental bodies, and generally needing to justify their existence even more than they already do.”

As a result, platforms will pay more attention to T&S comms and PR, whether through in-app experiences and education, influencer partnerships, feel-good projects and collaborations, and more.

The industry will continue to figure out the problem of reality 

We’ll continue to see misinformation and disinformation online, particularly with AI-generated content, but some platforms may decline to do much about it given shifting political pressures.

One leader said that pretty soon we won’t be able to tell what’s AI-generated or not, so T&S policy teams will have to be ready with policies around behaviour or outcomes rather than origin or whether something is “real” or not. 

The internet will significantly change for young people

We’ll see an increase in regulation, policies, and feature changes around youth experiences online, especially for apps and services featuring generative AI. There's a strong chance we’ll see age verification on dating apps, social media, and more, as KOSA may pass in the US, and regulation from the UK and Australia comes into effect. Some platforms may restrict access to users 18+ to play it safe.

The team at Thorn sent me some of their predictions, and said,

“caregivers will be caught off guard by increased requests to override age restrictions on new profiles and greenlight features to which their teens previously had access. [...] These efforts to protect children will often fall short of their intent - leaving many young people to navigate the same risks that existed before the restrictions but now with less support.” 

We’ll also see platforms trying to balance the need for privacy with the need for safety, amid potentially competing regulations and social pressures.

T&S will continue to professionalise, and roles will change

As a result of the influx of regulation, T&S operations/enforcement roles will get more complex (and I hope these folks get the pay raises to go along with it). We’ll see platforms hire more specialists, not generalists. There may be fewer entry-level roles, and less organic entry to T&S — expect to come in with prior applicable experience.

And as T&S shifts more towards compliance and professionalises further, the demographics and culture of T&S may shift as well; one person told me they expected to see fewer women and more suits at TrustCon in the future.

We’ll see more innovation and growth around AI and T&S tooling

Generative AI has enormous promise but, as far as I know, platforms haven’t yet tapped its full potential for really changing how we do Trust & Safety. I hope that 2025 is the year that we’ll finally see innovation in T&S to create systems and user experiences that were never possible before. 

We’ll continue to see an expansion of T&S tech vendors, particularly in the AI space. We’ll also see more open-source tooling, information sharing, and collaboration, at least from the people who aren’t focused solely on compliance. 

Alongside that, you can expect more discussion around AI governance, especially around AI agents, and a deeper discussion of what safety protocols we should see in place for AI companies. So far much of this discussion has been among tech insiders, but as the industry matures and AI becomes more commonplace in people’s everyday lives, I suspect we’ll start to see more interest from the general public in AI safety as well.

You ask, I answer

Send me your questions — or things you need help to think through — and I'll answer them in an upcoming edition of T&S Insider, only with Everything in Moderation*

Get in touch

Also worth reading

Online safety on community platforms: An ubuntu perspective (Fadzai Madzingira / Protimos)
Why? Ubuntu is an African philosophy of interconnectedness and communal flourishing, and in this essay, Fadzai Madzingira argues for using ubuntu as a human dignity and community framework when building policies and products: "To reduce human behaviour to a collection of rules and regulatory checkboxes is to do a disservice to the expression of users everywhere." A refreshing take on T&S, and my read of the week.

New KOSA, same as the old KOSA (Eternally Radical Idea)
Why? Greg Lukianoff, co-author of The Coddling of the American Mind with Jonathan Haidt, writes about why he opposes the Kids Online Safety Act.
Also read: New KOSA, Same As Old KOSA, But Now With Elon’s Ignorant Endorsement (Techdirt)

Attacker Has Techdirt Reclassified As Phishing Site, Proving Masnick’s Impossibility Law Once Again (Techdirt)
Why? "It’s a reminder that bad actors will try basically anything to try to find weaknesses in a system. So many of the laws around content moderation around the globe, such as the DSA, often seem to assume that basically everyone is an honest broker and well-meaning when it comes to moderation decisions. But, as we see here, that assumption can help allow bad actors to wreak havoc."

Where are all the digital literacy programs? (Dr. Rachel Kowert)
Why? A three-pronged approach to digital literacy in the modern age, and a call to action for more resources: "Only you can… prevent disinformation from crumbling democracy."