
People want niche online spaces - how do we make them safe?

A survey from The Verge and Vox suggests that users are moving away from global platforms and towards small, niche communities. But how will that affect Trust & Safety practices? And where should we focus our attention?

I'm Alice Hunsberger. Trust & Safety Insider is my weekly rundown on the topics, industry trends and workplace strategies that Trust & Safety professionals need to know about to do their job.

This week, I'm thinking about a survey about the future of the internet, which shows that users want to participate in well-moderated, smaller communities. Historically, most resources for T&S have been created for large, global platforms, so what do we need to do to ensure that small platforms are able to keep their communities safe as well?

Get in touch if you'd like your questions answered or just want to share your feedback, or if you'll be in London for the T&S Summit and want to say hi.

Here we go! — Alice


Today’s edition is in partnership with Safer by Thorn, a purpose-built CSAM and CSE solution

Powered by trusted data and Thorn’s issue expertise, Safer helps trust and safety teams proactively detect CSAM and child sexual exploitation conversations.

Safeguard your platform and users with proprietary hashing and matching for verified CSAM and a classifier for finding possible novel CSAM. Plus, Safer’s new text classifier provides much needed signals to help trust and safety teams find conversations that violate your child safety policies, such as those containing sextortion or requests for self-generated content from a minor.


What does T&S look like in small online communities?

Why this matters: If the future of the internet is moving away from global platforms and towards small, niche communities, then how does that affect Trust & Safety practices? We need to think about the ways that small platforms are uniquely affected by regulation, as well as what approaches we can take to support the principles of safety and trust at a smaller scale.

Good news: people want moderation online. At least according to one new survey.

The Verge and Vox recently published research on the future of the internet. In it, they surveyed 2,000 people in the US and found that most want authenticity, community, and curated content from their online experience. The majority believe that good governance (community guidelines, moderation enforcement) is key to fostering positive communities. 

Screenshot from The Verge/Vox report on the future of the internet

Another survey, published last week by the University of Oxford and the Technical University of Munich, also found that people want well-governed platforms. Of the 13,500 people surveyed:

Only 17% think that users should be permitted to post offensive content to criticize certain groups of people. Support for this stance is highest in the USA (29%) and lowest in Brazil (9%); in Germany, 15% hold this view.
“The study shows that the majority of people in democracies want platforms that reduce hate speech and abuse. This applies even in the USA, a country with a long-standing commitment to freedom of speech in the broadest sense”.

It’s good to have confirmation that the public believe in Trust & Safety and don't want the shift away from moderation that some platforms have taken (EiM #276). But there was another part of the Verge/Vox results that also caught my attention.

Small is beautiful?

The Future of Internet survey also found that:

The desire for smaller, more intimate communities is undeniable. People are abandoning massive platforms in favor of tight-knit groups where trust and shared values flourish… and where content is at the core. The future of community building is in going back to basics. 

Small, purpose-driven communities can be wonderful places. I started my journey in T&S by running a small, DIY forum, and I’ve found great solace through communities on Reddit during some of my hardest times. 

That said, I’ve also learned the hard way that community governance is a really difficult thing, especially when it relies on volunteers (which small groups often do). It takes a lot of time, effort, and knowledge. The work of initiatives like New_ Public and Independent Federated Trust & Safety (IFTAS) is vital to supporting volunteer moderators and people who govern niche communities, but it’s not enough.

If this vision of small, interactive, and content-driven spaces is to be realised, there is a lot to figure out about how we maintain and moderate these types of spaces. So, borrowing from how Ben structures his Friday newsletter, here are a few things that we can be working on, both as an industry and as individuals:

Policies

  • Examples of policies that account for cultural and linguistic nuances, historical usage of terms, and the intent behind posts. These issues are easily missed in smaller communities where there isn’t the scale to easily spot edge cases.
  • Open-source templates and libraries so it’s easy for new communities to create community guidelines that make sense for them. 
  • Clear enforcement frameworks, like decision trees, which help new moderators learn quickly and apply policies fairly and without bias (there's a minimal sketch of this idea after the list).
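
To give a loose sense of what a decision tree for enforcement could look like, here's a minimal sketch in Python. The violation categories, strike thresholds, and action names are hypothetical placeholders (every community would set its own); the point is that the same report should lead two different moderators to the same action.

```python
from dataclasses import dataclass

# Hypothetical violation categories -- purely illustrative, every
# community would define its own guidelines and groupings.
SEVERE = {"csam", "credible_threat", "doxxing"}
MODERATE = {"harassment", "hate_speech"}
MINOR = {"spam", "off_topic"}


@dataclass
class Report:
    category: str        # which guideline was reportedly broken
    prior_strikes: int   # previous confirmed violations by this user


def decide(report: Report) -> str:
    """Walk a simple decision tree from most to least severe."""
    if report.category in SEVERE:
        return "remove_content_and_ban"   # plus escalation/reporting where legally required
    if report.category in MODERATE:
        if report.prior_strikes >= 2:
            return "remove_content_and_suspend"
        return "remove_content_and_warn"
    if report.category in MINOR:
        return "warn" if report.prior_strikes == 0 else "remove_content"
    return "escalate_to_senior_moderator"  # anything unrecognised gets a human second look


print(decide(Report(category="harassment", prior_strikes=0)))  # -> remove_content_and_warn
```

Even something this simple gives new volunteers a shared reference point, and the act of writing the tree down is often where policy gaps get discovered.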

Products

  • Affordable top-level moderation tooling for the most egregious content, which would most likely include signal sharing and hash matching tools (like Lantern and StopNCII); there's a rough sketch of the hash-matching idea after this list.
  • Open-source (like ROOST) or lower-cost tools tailored for smaller communities, as well as a push to get more vendors to contribute to these initiatives without damaging the market.
  • Tools that provide explanations for moderation actions and allow for appeals, which can improve trust in the system and make regulatory compliance easier.
  • Custom/generative AI moderation solutions that allow niche communities to enforce their unique moderation needs; for example, in-group reclaimed slurs, which generic systems tend to over-moderate.
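
To make the hash-matching bullet a little more concrete, here's a minimal sketch using only Python's standard library. The hash list entries and upload path are hypothetical, and real deployments typically use perceptual hashes (such as PDQ or PhotoDNA) rather than exact MD5s, so that resized or re-encoded copies still match.

```python
import hashlib
from pathlib import Path

# Hypothetical shared hash list, e.g. distributed through a
# signal-sharing programme. Real lists contain hashes of verified
# harmful content; these entries are placeholders.
SHARED_HASH_LIST: set[str] = {
    "5d41402abc4b2a76b9719d911017c592",
    "098f6bcd4621d373cade4e832627b4f6",
}


def md5_of_file(path: Path) -> str:
    """Hash an uploaded file in chunks so large files don't exhaust memory."""
    digest = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_harmful(path: Path) -> bool:
    """Exact-match check against the shared list.

    Exact hashes only catch byte-identical copies; perceptual hashing
    is what makes matching robust to cropping and re-compression.
    """
    return md5_of_file(path) in SHARED_HASH_LIST


if __name__ == "__main__":
    upload = Path("incoming/upload.jpg")  # hypothetical upload path
    if upload.exists() and is_known_harmful(upload):
        print("Match found: queue for removal and review")
```

For small platforms, the hard part isn't this code; it's getting trusted access to the shared hash lists and signals in the first place, which is exactly where initiatives like Lantern come in.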

Platforms

  • Cross-community collaboration and knowledge sharing, not just around harmful content but also best practices, so that no one has to reinvent the wheel.
  • Interventions to reduce the impact of user self-selection into “bubbles” and to reduce the spread of harmful content. 
  • Shared resources on user education and media literacy, so that users can be empowered to participate in keeping their communities safe. This reduces the burden on moderators.

People

  • Access to professional mental health services, including proactive support and emergency interventions, so that volunteer moderators avoid burnout and additional trauma. This is something moderators at large platforms get access to, but it rarely exists outside of large Business Process Outsourcers (BPOs) or companies.
  • Diverse group leadership or governance to ensure that policies and interventions are inclusive and effective across different user groups.
  • Incentive structures, stipends, or partnerships that support volunteer moderators and mean that community-driven moderation is not solely reliant on unpaid labor.
  • Training in de-escalation tactics, community mediation, and restorative justice techniques. In small communities, moderation challenges can sometimes come from interpersonal conflicts and general toxicity rather than outright policy violations, so this skillset is vital.

The other thing to note is that regulation can be a threat to smaller, intimate spaces. Regulatory compliance is a huge burden for small platforms, and we're already seeing the UK's Online Safety Act lead to the closure of long-running forums. With the threat of Section 230 protections being rolled back in the United States, the liability may become too high for many other community forums and spaces. As Mike Masnick wrote on Techdirt: “When you regulate the internet as if it’s all just Facebook, all that will be left is Facebook.”

I’d love to know what you think about my to-do list. Hit reply or get in touch. Are you:

  • Working on an initiative tailored for small, niche or DIY internet communities?
  • A member or moderator of a small community with experience of these challenges?
  • Doing research/analysis of potential solutions to address the problems mentioned here?
  • Thinking through how to support moderators within small communities?

You ask, I answer

Send me your questions — or things you need help to think through — and I'll answer them in an upcoming edition of T&S Insider, only with Everything in Moderation*

Get in touch

Also worth reading

New transparency resources (Integrity Institute)
Why? If only this had been out when I wrote about transparency recently!

The Safety Triforce (Psychgeist)
Why? I love it when Rachel goes on a rant about user education: "'Digital literacy' has become such a catch all, aspirational ideal that I think we have lost sight of how we can be operationalising it and promoting it within our individual spaces and communities at a smaller scale."

Doubling down on the layoff lie (Matt Motyl)
Why? If you were laid off recently (or if you're a hiring manager), this is a must-read.

Threat surfaces in games (GIFCT)
Why? A paper produced by a working group, looking at best practices for combating terrorist content in online games.

A childhood neighbor terrorized my family. It prepared me for Trump's takeover. (Slate)
Why? An intense personal story on how a reporter has the resilience to cover the alt-right.