9 min read

How I’m talking to my kid’s school about phones and social media

Conversations about kids and digital safety are often clouded by moral panic and oversimplification. By exploring the trade-offs of digital devices and platforms, we can empower parents and schools to make decisions that prioritise both safety and connection for young people.

I'm Alice Hunsberger. Trust & Safety Insider is my weekly rundown on the topics, industry trends and workplace strategies that trust and safety professionals need to know about to do their job. This week, I'm thinking about:

  • What I'm going to tell the parents at my kid's school when they worry about social media harms.
  • Plus, how to find me on Bluesky.

We had a great first EiM hangout/barn dance on Friday with a stellar group of experts. The next one is this Friday November 22nd at 7am PST / 10am EST / 3pm GMT / 11pm SGT. Send us an email if you want to join — ben@everythinginmoderation.co or hi@alicelinks.com — and we'll add you to the calendar invite.

Here we go! — Alice


The letter I'll share with my kid's school about online safety

Why this matters: Conversations about kids and digital safety are often clouded by moral panic and oversimplification, but the underlying issues deserve nuanced, evidence-based discussions. By exploring the risks, benefits, and trade-offs of digital devices and platforms, we can empower parents and schools to make informed decisions that prioritise both safety and connection for young people.

My son is in first grade at a school that just banned phones. A parent of one of his classmates sent out an email asking if anyone wanted to talk about our kids and digital safety.

“I’m not sure if you know this…” I wrote back, “but I work in digital safety for tech companies. I can give an insider’s view on some of this.”

Now, I’m not an expert in youth safety (I’ve spent most of my Trust & Safety career working for 18+ platforms), but I’ve probably got more experience in this than anyone else in our tiny rural school district. That’s partly due to the work I do, but also in large part because I’ve helped to raise my two stepkids through the elementary school years into young adulthood.

Coincidentally, age assurance and youth online safety came up in our Everything in Moderation community call last week (read more about how you can join a future edition). A few folks suggested I write about it, so I thought I’d share what I plan to pass on to my son’s school and the other parents in his class.

Leave your thoughts in the comments or get in touch via hi@alicelinks.com. Here we go…


There is no causal evidence linking social media use to increased anxiety or depression: Dr. Candace L. Odgers studies youth mental health and social media use, and has a balanced, evidence-based viewpoint that I respect. She says: “When associations are found, things seem to work in the opposite direction from what we’ve been told: Recent research among adolescents—including among young-adolescent girls, along with a large review of 24 studies that followed people over time—suggests that early mental-health symptoms may predict later social-media use, but not the other way around.”

So: depressed kids use social media more, rather than social media making kids depressed. The internet is good at amplifying what people seek out. Algorithms can send people down weird rabbit holes, but it’s often (though not always) because those people are seeking out problematic content to begin with. But the internet and social media are not inherently harmful.

We should pressure social media companies to do better when it comes to young people: I’ll quote Dr. Odgers again: “Two things can be true: first, that the online spaces where young people spend so much time require massive reform, and second, that social media is not rewiring our children’s brains or causing an epidemic of mental illness. Focusing solely on social media may mean that the real causes of mental disorder and distress among our children go unaddressed.”

Companies should invest more in Trust & Safety teams and safety-by-design programs, especially with youth in mind. After a lot of pressure, we’re seeing some positive changes from Instagram, Snapchat, and Roblox, and I hope to see more in the future, as there’s certainly more work to do. (Ask any Trust & Safety leader and they’ll probably tell you they’re under-resourced.)

Social media can be a lifeline for some: When parents are scared for their kids, it’s easy to want to find simple answers. Over the years, people have scapegoated many things that I love and have found immense community in: Dungeons & Dragons, heavy metal, the internet. We look back and laugh at the panic over D&D now, and that panic was certainly more baseless than the panic over social media, but there’s a lesson to be learned here.

Digital youth safety expert Vaishnavi J addresses the moral panic, and the common comparison of social media to tobacco, by saying: “it is more appropriate to compare social media to cars than to tobacco. Like cars, social media can be incredibly valuable for youth when designed well. It helps them build community and some of the most powerful youth activism today is fuelled by the scale and visibility social media provides. Tobacco, on the other hand, has no such benefits, and grouping it with social media ignores the value that the latter adds to young people’s lives.”

I’m an example of that. As a seriously depressed, socially awkward, queer teenager experiencing culture shock after moving from Oxford, England to North Carolina, the internet showed me that I wasn’t alone. I found connection in a way that I wasn’t getting in person — and that has held true for me for most of my life (except when I lived in Brooklyn, where there’s a scene for everyone).

For me, the risks were worth the reward, and I’m not the only one with experiences like this online. TrevorSpace connects LGBTQ+ youth to support each other when their families and communities do not. A recent film, The Remarkable Life of Ibelin, shows how important online community can be. Technology can be beneficial for some neurodivergent youth as well. Removing access to beneficial technology and tools would be devastating for some kids.

It’s a child’s right to access the web: Sometimes we forget that children have the same human rights on the internet as adults do. They have the right to access information and express themselves, as well as the right to safety and privacy.

Parents should actively monitor risks and harms and make decisions with their kids: Social media use can be beneficial for some people, and the youth safety changes that platforms are making are a good start, but even so, harms can’t be eliminated completely. Trust & Safety programs can’t fix societal problems, and there will always be some risk on social media (as there is offline). Managing these risks and making informed decisions isn’t one-size-fits-all. Some parents may be right to keep their kids off the internet completely, whereas for others the right decision may be monitored use that tapers into complete freedom.

Even platforms with strict youth controls can be terrible in other ways. We tried YouTube Kids, but the consumerist toy content with fast editing made my son, who is neurodivergent, jumpy and obsessive. There is better kid-appropriate content on regular/adult YouTube, but the only way to limit algorithmic suggestions — which inevitably ended up somewhere violent or inappropriate despite my attempts to control it — is to download shows and put the iPad on Airplane mode. Ultimately, we talked about it with our kid, and together we decided to do away with YouTube for him completely. Because we discussed what he’d been seeing, and made the decision together, the transition off YouTube was pretty painless.

Involving kids in decision-making teaches them about risks and gives them agency over their own experience. Platforms should also involve kids in co-designing age-appropriate online experiences. Resilient, well-educated kids with a good support system are less susceptible to online (and offline) harms. Our kids need to know the warning signs of suspicious behaviour, what common scams look like, how algorithms work and how social media companies use them (and, frankly, most adults need these lessons too). These programs already exist (the FBI even has one for kids), and parents should make sure digital literacy is part of their kids’ education, instead of just outright banning anything to do with the internet.

Almost everyone I know who works in Trust & Safety for platforms is sceptical about age assurance/verification as a solution: It’s not just tech workers who think this. The Australian government did a study of age verification last year, and found that “age assurance technologies are immature, and present privacy, security, implementation and enforcement risks.”

Even if age verification on social media is required in the US (as it will be in Australia and the UK soon enough), it won’t be perfect. To keep minors off social media, everyone will have to verify their age, and that poses real privacy, safety, and security risks. That’s before considering the equity problems for adults without ID, or for whom biometric checks don’t work well. We can’t rely on age-gating to keep smart and curious teenagers away from social media; they will find workarounds, or will move into unregulated spaces that could be far riskier than the major platforms.

Decisions, including phone bans, should be evidence-based and up for review: Our school district is reporting its phone ban as a success so far. Teachers in the middle and high school say kids are interacting more at recess and that they spend less time arguing about phones in the classroom, and some kids say it’s positive too. But as any Trust & Safety practitioner will tell you, enforcing policies consistently is hard, and phone bans won’t always succeed.

Some districts are even reversing their phone bans because parents want a way to keep in touch with their kids during an emergency. My son’s school had a (thankfully false) shooter threat recently and we were terrified. Unfortunately, in the US the fear of violence in schools is real, and phones can be a lifeline. To justify a phone ban, schools should present evidence that the policy is being enforced evenly and that the school environment is significantly better, and should revise and adjust where necessary.

We need to focus on community and connection: Youth mental health is most affected by stressors in the environment, primarily at home and at school, where kids spend the most time and have their closest relationships. Bullying and other harms exist in person as well as online. We need mental health and addiction support in communities, for parents as well as kids.

Many families don’t have insurance that covers mental health adequately, or there’s an unreasonable wait to get psychological services. Many school districts are underfunded and overcrowded and aren’t able to offer individual support. “Third places” where kids can hang out freely are disappearing. Adults have fewer friends and social connections than they used to and aren’t always modelling healthy behaviours. And we live in frightening times, filled with dread about climate change and social unrest. Removing access to phones or social media won’t fix these things; our kids may still be anxious and depressed.

I’m not saying anything revolutionary here, so it’s a less satisfying ending. But the simple truth is that technology can be hugely beneficial for some, and not for others. The ways in which we mitigate risks for some will put others at more risk. We should be able to have these nuanced conversations with each other instead of getting swept up in moral panic. We must look at the evidence and understand trade-offs before making big decisions. This is how we teach our kids to navigate the nuanced, complicated world we live in, so they become resilient young adults.

You ask, I answer

Send me your questions — or things you need help to think through — and I'll answer them in an upcoming edition of T&S Insider, only with Everything in Moderation*

Get in touch

Are you joining the cool kids over at Bluesky?

If you follow the same newsletters that I do, you'll find that everyone is talking about Bluesky. I'm giving it a go after swearing off all social media except LinkedIn. It's going pretty well, but I definitely need to find some cute animal feeds to lower my stress levels after I've read too much about current events.

If you've migrated over there, follow me and Ben, and also check out the Trust & Safety Feed and the Trust & Safety Starter Pack. Let me know if you've found any other good feeds or starter packs for T&S and I'll share them!


Also worth reading

Social media punishment does not need to be a Kafkaesque nightmare (New_ Public)
Why? When I worked at dating apps, one of the most frustrating things was when seemingly otherwise good users would break the rules. It was always really hard to figure out why they'd act against their own interests, and they'd also get really frustrated when there were consequences to their behaviour. New_ Public digs into this phenomenon and shares some research on why good people behave badly and how social media platforms can think about shaping more positive behaviour. (By the way, they're hiring).

Advocates and Researchers Set Up “Global Majority House” in Brussels to Engage on Digital Services Act (Tech Policy Press)
Why? A coalition of civil society and research organisations is setting up a “Global Majority House” in Brussels to ensure the DSA (and other regulations) are equitable and inclusive.

The WIRED Guide to Protecting Yourself From Government Surveillance (Wired)
Why? This would have been good to share in last week's newsletter, but alas, Wired hadn't written it yet. A good precautionary read.