'Explorations' interrogate questions about content moderation and online safety that are open-ended or in flux. Each Exploration follows the same structure and is updated regularly as new information emerges. Submit your own question for exploration.
This week: media coverage as a vehicle to understanding and improving content moderation.
The Wall Street Journal published a series of stories under the banner of The Facebook Files, based on a huge tranche of internal documents from Frances Haugen, a former Facebook employee and whistleblower. The documents contained internal research showing that the company knew about harms caused by its platform but did little to mitigate them.
Haugen has subsequently appeared before a Senate committee, done a bunch of press (including 60 Minutes), and released a further trove of documents to a consortium of mainly US outlets including The Atlantic, CNN and Fox Business. This week's coordinated dump of stories (you can read roundups on Tech Policy Press and The Verge) is apparently just the first of a six-week-long avalanche.
Why is it interesting?
There are a few threads here worth teasing out:
1) Facebook's reaction
It's an understatement to say that the company didn't take the mostly negative coverage lying down:
- Haugen's testimony was disputed and her experience minimised on account of her having "no direct knowledge of the topic".
- Documents were published on Facebook's company blog ahead of a planned release to the Wall Street Journal, in contravention of the strict embargoes that most companies and media outlets abide by.
- Mark Zuckerberg made jokes about the criticism and used an earnings call this week to claim that Facebook faced "a coordinated effort to selectively use leaked documents to paint a false picture of our company."
- The company's press officer Andy Stone even became a story in his own right for his "brash and combative Twitter presence", courtesy of Input Magazine.
On the whole, it was a hostile response.
2) New voices in the discussion
A number of new people with experience of working at Facebook have come to the fore during this story cycle:
- Samidh Chakrabarti, the former Civic Integrity lead, has written a number of very good Twitter threads lifting the lid on the trade-offs behind Trust and Safety decisions, while also charting a way forward.
- Nu Wexler, a former spokesperson, has spoken to the media before about his time at the company but chimed in with fresh insights, not least on staff use of the internal messaging tool Workplace.
- Former public policy director Katie Harbath supported Haugen in comments made to mainstream media and wrote in her Substack newsletter about the "emotional rollercoaster" she has been on "seeing the last few years of your work life come out to the public". She was also announced as one of the Integrity Institute's fellows this week.
Until recently, former staff speaking out about their time working for big tech has not been common, not least because of strict non-disclosure agreements. That feels like it's changing.
3) Criticism of the consortium coverage
I've noticed a few examples of criticism from onlookers who aren't typically in Facebook's camp (or at least don't have an obvious axe to grind), including the charge that the reporting wasn't as rigorous as it could have been and was predicated on leaks that had been "optimised for engagement".
Some journalists, perhaps burnt by being left out of the consortium, also characterised the group as a "bit slapdash", while Ben Smith in the New York Times noted that "competitive pressures have remained close to the surface", which is a polite way of saying it didn't work as a consortium should.
Dylan Byers, writing for Puck, even went as far as to suggest that "Facebook has essentially replaced the former president as the preferred piñata du jour for the media, to the point where the coverage has gotten sloppy", a point that was contested by reporters from the New York Times and the Wall Street Journal.
4) The sheer number of stories
I don't know for sure, but I'd bet more than a few dollars that content moderation has never before attracted this level of interest or this volume of stories. With UK parliamentary committees and US Senate subcommittees no doubt reading much of the coverage, it will be interesting to see what real-world impact this tidal wave of stories has over the coming weeks.
What can we extrapolate from this?
While we don't yet know how this reporting cycle will play out, we are starting to build up a better picture of the fractious relationship between the media and platforms that do a lot of content moderation. It boils down, in large part, to a lack of trust on both sides.
We can see that on the platform side too. As part of some research into the challenges that Trust and Safety professionals face in their work, I found that media coverage played an interesting role in their daily work:
- Positive media coverage (or sometimes an absence of negative press) is often used as an indicator of success, sometimes at the expense of other metrics used to quantify their work. Getting a write-up about a new feature or sharing some positive data with a friendly reporter was, naturally, a win for Trust and Safety teams used to being under pressure.
- Negative media coverage about their platform (think takedown errors, AI mistakes, or episodes from a big-money host having to be removed) also came up as a source of frustration in several interviews. These pieces compromised team morale and left practitioners feeling deflated.
This goes some way to explaining Facebook's combative approach to the criticism it has received, even if it doesn't justify it.
What happens next?
I wrote on Twitter this week about how coverage of these issues hasn't changed in almost 15 years, with the same headlines repeating themselves over and over again.
Is it the media's job to fix the problems of platforms? Certainly not. But, as the industry has recently started to realise with the climate crisis, the bare minimum is not always enough.