
Is it prosocial design’s time to shine?

With some platforms retreating from a reactive, enforcement-driven approach to Trust & Safety, there’s a stronger case than ever to lean into proactive and prosocial practices that prevent toxicity from happening in the first place. Here's where to start.

I'm Alice Hunsberger. Trust & Safety Insider is my weekly rundown on the topics, industry trends and workplace strategies that Trust & Safety professionals need to know about to do their job.

This week, I'm thinking about prosocial design and how it fits into a holistic Trust & Safety strategy. Below I share some case studies, research, resources, and a couple of updates from the prosocial design world.

Plus, scroll to the links at the bottom to learn how YouTube quietly updated their policies, whether US adults have become more extreme in their politics, and more.

Get in touch if you'd like your questions answered, want to share your feedback, or just want to let me know how you're doing.

Here we go! — Alice


Today’s edition is in partnership with Safer by Thorn, a purpose-built CSAM and CSE solution

Powered by trusted data and Thorn’s issue expertise, Safer helps trust and safety teams proactively detect CSAM and child sexual exploitation conversations.

Safeguard your platform and users with proprietary hashing and matching for verified CSAM and a classifier for finding possible novel CSAM. Plus, Safer’s new text classifier provides much-needed signals to help trust and safety teams find conversations that violate your child safety policies, such as those containing sextortion or requests for self-generated content from a minor.


Is it prosocial design’s time to shine?

Why this matters: With significant pushback on the reactive/enforcement side of Trust & Safety, there’s a stronger case than ever to lean into proactive/prosocial practices that prevent toxicity from happening in the first place. That doesn’t mean we shouldn’t still invest in reactive measures – they’re more important than ever, I think – but we can also meet the moment by embracing the positive, community-focused messages of prosocial thinking.

Community norms — those invisible but powerful signals of what’s encouraged, accepted, and celebrated — can make or break a space. 

An academic study published a few years back examined behaviour on Reddit and found that users change the tone of their posts depending on which subreddit they’re posting in:

“Overall, around 16 per cent of people in the data set were responsible for toxic posts and 13 per cent for toxic comments. However, that behaviour could and did change depending on the community. Four in five people showed changes in the average amount of toxicity in their posts, depending on the subreddit they posted in.” 

Relatedly, banning users from a platform doesn’t mean they’ll have “learned their lesson” and behave better elsewhere. In fact, a study showed that:

“Comparing content from the same users on Twitter and Reddit versus Gab, users tend to become more toxic when they are suspended from a platform and are forced to move to another platform. They also become more active, increasing the frequency of posts.”

From this research, we can see that users change their behaviour based on where they are and who they’re around. This means there’s an enormous opportunity for social platforms to take an active role in helping their communities thrive.

Reminders work – if they come at the right time

So what can we do to shape the where and the who? Timely reminders and prompts to adjust behaviour are one effective approach.

One of my favourites is Tinder’s “Are you sure?” prompt, which “reduced inappropriate language in messages sent by more than 10 percent in early testing.” This result was exceeded by Nextdoor’s Kindness Reminders, which in early tests in the United States showed that “1 in 5 people who saw Kindness Reminder hit ‘edit’ on their comment, resulting in 20% fewer negative comments.”

These kinds of product changes don’t always work as well as that. When I was head of T&S at a major dating app, I ran an experiment: we sent users messages explaining the community guidelines upfront (this was long overdue, and something I’d long been advocating for).

We took a look to see if it changed user behaviour and, unfortunately, didn’t find any significant differences versus the benchmark. My guess as to why it failed: users were learning social norms from each other rather than paying attention to another boring pop-up from the platform. And because the platform had gone so long without an active reminder at signup, users had already shaped their own norms and practices. So a word of warning: these interventions don’t always work the way you might expect.
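
For anyone building this kind of intervention, here's a minimal sketch of how a pre-send nudge might be wired up. To be clear, this is my own illustration, not how Tinder or Nextdoor actually built theirs: the word-list scorer is a toy stand-in for a real classifier, and the threshold is invented.

    # Toy sketch of a pre-send "Are you sure?" nudge. The word-list scorer
    # stands in for a real toxicity classifier, and the 0.25 threshold is
    # invented; both would need proper models and tuning in practice.

    FLAGGED_TERMS = {"idiot", "stupid", "loser"}  # stand-in for a real model

    def score_toxicity(text: str) -> float:
        """Toy scorer: the fraction of words that hit the flag list."""
        words = text.lower().split()
        if not words:
            return 0.0
        return sum(w.strip(".,!?") in FLAGGED_TERMS for w in words) / len(words)

    def handle_send(message: str, already_nudged: bool = False) -> dict:
        """Nudge once at the moment of sending; otherwise deliver."""
        if not already_nudged and score_toxicity(message) >= 0.25:
            # The timing is the point: the prompt appears right as the
            # message goes out, not as a generic reminder at signup.
            return {"action": "nudge", "prompt": "Are you sure you want to send this?"}
        # Logging whether a nudge preceded an edit lets teams measure the
        # kind of "1 in 5 hit edit" metric Nextdoor reported.
        return {"action": "deliver"}

    print(handle_send("you absolute idiot"))   # -> nudge
    print(handle_send("lovely to meet you!"))  # -> deliver

The interesting design questions are all in the details this glosses over: when to show the prompt, how often to repeat it, and whether edits prompted by the nudge actually stick.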

Feature innovation

I’m increasingly interested in new platform features that are trying to prompt a different kind of behaviour or seize upon an emerging norm.

For example, Hinge just released a new feature called Match Note, which lets users add a private note to their match, sharing a bit more about themselves upfront. The idea came out of discussions with TransTech Social Enterprises and Disability:IN, where Hinge learned that users wanted to be able to share information about themselves that is often a deal-breaker for others, but without having to post it publicly for the whole world to see.

According to Hinge’s data, 68% of users said it helped them better assess compatibility, and 66% said it helped them show up more authentically. Those are impressive numbers.

I can imagine that such a feature also prevents a lot of toxic comments aimed at users. Now users can quietly unmatch when they’re faced with a deal-breaker, rather than having to navigate the disappointment and potentially lash out or say something they shouldn’t. I’m curious to see if Hinge shares follow-up data on whether this change reduces user reports.

Stewarding the stewards

Another way to shape a space’s norms is to ensure that the community is involved in defining what they look like. This is the model for community-moderated spaces such as Reddit and Discord, which make it a core part of their codes of conduct and their really excellent moderator guides (Discord’s moderator resources & Reddit’s Mod Help Center).

However, this can be taken a step further. New_ Public and the non-profit Hylo have teamed up to find ways to encourage users to get involved and co-design solutions. They’ve been exploring community norms and recently released a full report on everything they learned.

I love how they’re thinking about community stewardship as something that everyone is involved in:

“Most platforms today think of stewardship in binaries: you are either a moderator/leader or a regular member. This can place a high burden on stewards when it comes to promoting and maintaining healthy engagement norms in their communities.
In this work we instead envision stewardship as a path that every member is on – some may have just begun their journey, and others may be further along, but everyone is headed in the same direction and is contributing to the success of the community. We explore and share recommendations for inviting members into stewardship, as a way to help get people onto this path.”

I believe most people are fundamentally good — and want to be part of safe, thriving communities. The research backs this up: people often behave better when their peers model positive behaviour. But those peers need support. That’s where we come in.

Useful resources

Luckily, this shift isn’t just a hopeful vision: it’s already happening in pockets across the internet. We have years of research in this area, and an increasing number of resources available for practitioners. Here are just a few that I’m aware of: 

  • For gaming, the Digital Thriving Playbook offers a goldmine of ideas and concrete frameworks to implement
  • For social networks, the Prosocial Design Network offers blueprints as well as research
  • For niche community spaces, New_ Public is laser-focused on helping people create civil, collaborative local networks online

You ask, I answer

Send me your questions — or things you need help thinking through — and I'll answer them in an upcoming edition of T&S Insider, only with Everything in Moderation*

Get in touch

Job hunt

Tiny plug here, but if you're a Machine Learning Engineer and want to work with me, Musubi is hiring.

If that's not you, here are some T&S-specific (or adjacent) job boards to check out:


Also worth reading

YouTube removes 'gender identity' from hate speech policy (User Mag)
Why? Yet another rollback for the LGBTQ+ community, this time from Google.

Social Media and Politics from 2023-2025 (Psych of Tech)
Why? An examination of the political identities, extremity, and polarisation of US adults across platforms, based on a two-year national survey. (It may not show what you think.)

We Need an Interventionist Mindset (Tech Policy Press)
Why? "This talk invites everyone listening to shift their orientation away from solutionism in order to meaningfully challenge the existing arrangement of money and power that configures our contemporary sociotechnical environment."

Trade, Tariffs, and Tech (Stratechery)
Why? This article goes through a few different scenarios and looks at how tech companies might be affected. A little snippet that I found interesting: "It’s difficult to overstate the extent to which every aspect of modern life rests on global supply chains, which are so long and complex that no one can truly understand the effects of messing with them."