Mind the internet safety gap
I'm Alice Hunsberger. Trust & Safety Insider is my weekly rundown on the topics, industry trends and workplace strategies that trust and safety professionals need to know about to do their job.
This week, I'm thinking about what a new survey on the attitudes of girls and women says about the state of internet safety. And I wonder: do we really understand the delta between what targeted and marginalised groups want from the industry to keep them safe and what they are actually being provided with?
Get in touch if you'd like your questions answered or just want to share your feedback. Here we go! — Alice
Today's edition of T&S Insider is in partnership with Checkstep, the AI content moderation solution for Trust & Safety leaders
Last week we helped one of our oldest clients update their Content Moderation Policy.
Our in-house expert, Sanket, has shared the most common problems that he helps to address:
- Lack of clarity, which can lead to moderation errors
- Policies that lead to under- or over-moderation
- Policy gaps
To help Trust & Safety leaders build or review their policies, we've decided to share our full Content Moderation template with 21 policies that you can customise for your platform. Do not hesitate to contact us through our website if you have any questions or need help!
A survey that shows the work we need to do
New data was published last week by CNN As Equals and Plan International about the online safety experiences of more than 600 young women and girls aged 13-24 across nine countries in Africa, South America, and Asia. Although not representative of the experience of all women and girls online, it paints a stark picture: girls and young women regularly come across harmful content online (75% said as much) but feel alone in navigating it.
As I was reading through it and discussing it with Ben — EiM's founder and editor — something else became clear: not only were these girls and women very likely to encounter unwanted sexual content or hate speech — something we sadly know all too well — but there were giant gaps between the support they wanted and what they were actually being provided with.
Let me explain.
Prepared for abuse
A surprisingly positive statistic: a large majority of the girls and women asked reported feeling at least somewhat prepared to protect themselves online (86%). That's very high, especially when you consider that three quarters of respondents said they had faced online abuse.
Now, people are notorious for over-estimating their preparedness but, even so, this stat raises questions: what type of abuse do they feel prepared for? What got them to the point of feeling prepared? And do they really know the full extent of what can go wrong?
We can assume that their preparedness comes from personal experience because respondents called for better education on digital literacy (70%); this suggests that the majority of those who feel prepared are self-taught. But, as we know, simply having prior experience with certain types of abuse does not necessarily equip someone to handle more complex or severe forms of online harm. So there's a potential gap here.
Responsible for their own safety
When asked who is responsible for their safety online, the majority said that staying safe was their own responsibility (67%). Both Ben and I were surprised by this and the reasons are not immediately clear. One theory we had was that, because user reporting exists, these girls and women see themselves as responsible for shaping their online experience.
However, these reporting buttons and follow-ups are often hard to find in a platform’s UI, offer little explanation of what happens next, and are sometimes only available to users in certain regions, most often Europe. The effect can be a feeling of having to do it all yourself. One quote spoke to this; when asked about platforms’ Trust & Safety teams, a respondent said: "We don’t know if these people are really listening.”
To address this, some regulation, such as the European Commission's Digital Services Act, requires platforms to be transparent with users about the result of their reports. We don't know what the effect will be but, when designed well, communication like this can help victims of online abuse to feel heard, validated, and supported, instead of alone and solely responsible. And there may be other ways we could address this gap too.
Digital literacy
Finally, the survey results were loud and clear about what is lacking when it comes to online protection: there was a strong preference for digital literacy education (70%) but also for different forms of public communication, including awareness campaigns for girls (39%), improved parental guidance and education (39%) and increased awareness campaigns generally (37%).
The need for stronger legal measures was only the fifth most common response, followed by other actions that platforms can take, such as strict enforcement and enhanced privacy settings. Transparent reporting (16%) and comprehensive legislation (19%) — two things that are actually happening — loitered at the bottom.
Once again, this suggests — and because the data is not representative, it can only suggest — a gap between what girls and women want and what they're being afforded.
In other words, all the work being done to improve the lot of women and girls online by platforms, civil-society groups, government and educational institutions is seemingly not known widely enough or not hitting the mark. As the article makes clear, women and girls “feel that parents and schools are too uninformed to help, reports to platforms are sent to bots and go unanswered, and authorities don’t hold perpetrators adequately accountable.”
Where to go from here
If this data indicates a gap, a follow-up question is: how do we close it? A few high-level thoughts but I'm keen to hear from you, T&S Insider readers, too:
- It’s clear that we must prioritise comprehensive digital literacy training and address online harm holistically. Historically, we’ve seen digital literacy training include information about platforms’ online safety tools but, based on this survey, it seems that young women and girls already feel prepared in this area, although they are often teaching themselves or each other.
- We need to ensure that all groups — not just parents — understand online harms and take them seriously. That teachers are a resource for educating about online harm and for supporting victims emotionally. That boys are taught about consent. And that perpetrators causing this harm face consequences, both online and, in serious cases, in the legal system. As PEN America’s online harassment field manual states:
“The reality is that, when it comes to online harassment, the burden of educating local law enforcement about existing cyber laws often lies with the victim. […] Law enforcement officials who are not well versed in technology and online harm may not treat online harassment as significant or urgent.”
What happens if we don't address this gap? The CNN/Plan International research shows examples of participants closing their accounts, hiding their profiles, and taking breaks from social media as a response to the online abuse. They are forced to retreat, step back, hide from view.
In the US, the recent story about young politician Sabrina Javellana is also a warning to us. Javellana chose not to run for public office again after facing AI-generated abuse for representing her community, and she continues to have to weigh the need to participate publicly in online spaces against the potential for harassment.
Certainly there is a need for platforms to invest in Trust & Safety and to make good design choices that protect users. However, it’s just as critical to look at other ways that we can close the gap that this survey demonstrates.
Let me know
Do you know of any studies or surveys that look at the root cause of abuse online? How can platforms prevent sexual abuse and harassment in the first place? What kinds of education help men and boys understand consent and respect? I'd love to know!
Get in touch
Also worth reading
How do we work our way towards utopia? (Untangled)
Why? The always thought-provoking Charley Johnson with an AI history lesson for the ages.
Algorithms should not control what people see, UN chief says, launching Global Principles for Information Integrity (UN)
Why? The Secretary General boldly launches new Global Principles for Information Integrity. Whether it's possible is another thing.
Ding Dong KOSA's Dead (For Now) (Techdirt)
Why? The Kids Online Safety Act has a lot of fans and a lot of detractors, which is why the fact that it is dead in the House of Representatives is such a big story (as covered in the latest episode of Ctrl-Alt-Speech).
Meta Has Run Hundreds of Ads for Cocaine, Opioids and Other Drugs (Wall Street Journal)
Why? The WSJ follows up on its earlier reporting about ads for illegal drugs on Meta's marketplace. Nothing has changed.