Some personal news, as they say

Much of the work I've done over the years has circled around the problem of how to balance human moderation and automation. Now I'll be doing it within an AI moderation company.

I'm Alice Hunsberger. Trust & Safety Insider is my weekly rundown on the topics, industry trends and workplace strategies that Trust & Safety professionals need to know about to do their job.

This week, I'm sharing why I decided to switch jobs, and what's next for me (ahh it's so exciting!). Plus, a lot of links at the end – it was a busy week in T&S!

Get in touch if you'd like your questions answered or have ideas about what I should write about next (maybe riffing on one of the links I share below?). Here we go! — Alice


Today’s edition is in partnership with Thorn, working to transform how children are protected in the digital age.

New research from Thorn reveals deepfake nudes are a harmful reality for youth. In fact, 1 in 8 young people reported personally knowing someone who has been targeted. While nonconsensual image abuse isn’t new, deepfake technology represents a dangerous evolution in this form of child sexual exploitation.

This research underscores the urgent need for digital platforms to take a Safety by Design approach. Now is the time to implement systems to detect and prevent deepfake image creation and distribution before harm occurs. As deepfake technology grows more accessible, we have a critical window of opportunity to understand and mitigate this form of digital exploitation—before it becomes normalised in young people’s lives.


My next career chapter

This week I start a new job as Head of Trust & Safety at Musubi Labs, an AI company focusing on T&S and content moderation services. (Yes, they're named after spam musubi because they tackle spam, among other things. As some of you will know, spam is very close to my heart!)

A woman called Alice in a spam costume
This was for Grindr's annual costume contest a few years back. I won!

Some EiM subscribers might be wondering why I've made this move. So I thought I'd unpack my thinking a little.

Much of the work I've done over the years has circled around the problem of how to balance human moderation and automation. I’ve done this from all angles: designing moderation automation systems from scratch, building human teams (both outsourced and in-house), hiring vendors and working for a vendor, and figuring out when to use each. The one area of the T&S ecosystem I haven’t worked in yet is technology services, and yet this is the fastest-growing part of the sector.

AI is changing what is possible in Trust & Safety so quickly, and has so much potential. I strongly believe that humans are still needed, but that their roles should be elevated. AI can learn from what talented moderators do and extend the impact of their work, and it can also provide a level of customisation that would be really difficult to do with a purely human team. There are also a lot of important questions around bias, ethics, and best practices that we’re still figuring out as an industry. All of this will be so fascinating to work on. 

When I left Grindr for PartnerHero in early 2024, I underestimated how much I would miss working with engineering and product teams. At Grindr, I set the product strategy for T&S and I collaborated with engineering constantly on spam and other at-scale moderation issues. Over the last year, I've filled the gaps somewhat by writing (a lot) about AI here, but I'm excited to get back to working on tech directly.

All that said, I wasn't actively job searching (I was happy at PartnerHero, and will actually still be doing some advising for them), but I also believe in saying yes when great opportunities come your way. Tom Quisel, the CEO of Musubi, and I have known each other for 15 years. We worked together at OkCupid and Grindr, where he was CTO at both companies. We've tackled a lot of moderation problems together, and I know we're a good team.

When he started Musubi, I cheered him on, introduced him to folks at conferences, and had chats with him about what products I’d love to see built. This was all without expectation of him being able to hire me. That’s generally how I approach my career: I try to be as generous as possible with my time and advice, especially with friends. It means I do a lot for free, but I’ve also found that unexpected opportunities come my way as a result. 

At this point in my career, it comes down to being able to work on interesting problems, with good people, with the autonomy to set strategy and build cool stuff. My dad was a pioneering computer scientist interested in ethics, AI, and emerging technology, and, as my career has progressed, I find myself following in his footsteps more than I thought I ever would. Honestly, it makes me a bit emotional thinking about it in this way. He died when I was 22 and never got to see me working in tech and Trust & Safety, but I know he'd have been thrilled to see me here today.

I'm lucky that Musubi will continue to allow me to write for Everything in Moderation and work with Ben. In future editions of the newsletter where I write about AI, I plan to disclose that I work for an AI moderation company.

That said, I won't pretend my new role won't influence the way I think about certain things; it's all connected, after all. But I hope to share helpful insights here every week and to learn from T&S Insider readers with more experience in the tech services and AI space. I'm stoked to start this new chapter.

You ask, I answer

Send me your questions — or things you need help to think through — and I'll answer them in an upcoming edition of T&S Insider, only with Everything in Moderation*

Get in touch

Also worth reading

Digg is coming back (The Verge)
Why? I love the way Kevin Rose and Alexis Ohanian (EiM #284) are thinking about this – it speaks to what I wrote about above and the potential for using AI in T&S:

"...the real trick, the thing nobody has yet done properly, is to give the communities the tools they actually need to operate. This is where AI comes in. So much of a moderator’s job, Rose says, is just grunt work: fighting spam, reviewing obvious policy violations, litigating pointless fights. “How can we remove the janitorial work of moderators and community managers,” he says, “and convert what they do every day into more of a kind of ‘director of vibes, culture and community’ than someone that is just sitting there doing the laborious crappy stuff that comes in through the front door?”"

Deepfake nudes and young people (Thorn)
Why? Deepfake nudes already represent real experiences that young people have to navigate. This report goes into detail about access to technology, how many young people are affected, and how they think about deepfakes. Thorn's research is always really excellent!

How tech created the online fact-checking industry (Pirate Wires)
Why? A really interesting look at the issues with creating top-down policies at scale. I don't love that it downplays the brilliance and dedication of so many people working in Trust & Safety, but I do agree that content moderation is complicated and often flawed.

Trump promises to abuse Take It Down Act just as we warned (Techdirt)
Why? "You can’t just write a law that says “take down the bad stuff.” I mean, you can, but it will be a disaster. You have to think about how people might abuse it." – and now the abuser may be the president.

2024 National Survey on the mental health of LGBTQ+ young people by state (The Trevor Project)
Why? "These findings underscore the unique challenges faced by LGBTQ+ young people, a group that is disproportionately impacted by suicide not because of who they are, but because of how they are mistreated and stigmatized in society"

Who should control social media? (New_Public)
Why? So many decisions around how to structure social media come down to who controls it and what their incentives are. This is a thoughtful discussion on how to ensure that social media benefits users.

The infinite manosphere (Untangled)
Why? "Boys and young men don’t start out wanting to become misogynists. But they often discover manosphere groups and discussion boards online because they’re looking for help, community, or a sense of belonging." – we need to tackle these issues by supporting boys and offering community and belonging in more positive ways.