All kinds of revelations have emerged from the Facebook whistleblower leaks over the past few weeks. Perhaps most disturbing is that Facebook knows its platforms harm teenage girls by increasing anxiety and depression, yet it doesn’t do everything in its power to fight this.
Not only that: the company has ‘blamed’ its own researchers for the negative press, reports the US network NBC.
According to the whistleblower, Facebook conducted internal research on teenage girls and Instagram, which sits in its family of social platforms along with WhatsApp. One slide from the company reads: “We make body-image issues worse for one in three teenage girls”. Facebook’s research found that 32% of teen girls said that when they felt bad about their bodies, Instagram made them feel worse, and that 13% of UK teens traced a desire to kill themselves to Instagram.
Parents whose children spend time on Instagram know firsthand how addictive and consuming it can be for them. And tragically we know how 14-year-old Molly Russell was inundated with graphic images of suicide and self-harm on the Instagram platform. In 2017 she ended her own life.
Dangerous not just for kids
The whistleblower, who we now know is 37-year-old Frances Haugen, is an ex-Facebook product manager who worked on civic integrity. Before she left Facebook, she downloaded tens of thousands of internal communications as evidence that the company isn’t doing all it can to keep its platforms from being dangerous.
It’s not just teenage girls that Facebook and Instagram negatively affect. The platform, which has 3 billion users worldwide (roughly 70% of the world’s Internet users), amplifies hate, misinformation and political unrest.
For Facebook, doomscrolling pays
At the heart of the problem is a news-feed algorithm that prioritises content that gets a strong reaction from readers. Facebook’s own research shows that hateful, divisive and polarising content is what keeps people on the platform and scrolling. This means the company can display more ads and make more money. If it changed the algorithm to show less of this content, it would lose revenue.
If you want to read more about social algorithms (not just Facebook) we suggest you read Sinan Aral’s The Hype Machine: How Social Media Disrupts Our Elections, Our Economy and our Health.
Facebook’s same old response
Facebook says it is doing all that it can to mitigate the negative effects of its platforms. With these new whistleblower revelations, it’s clear it is not. Perhaps it cannot. You don’t have to think Mark Zuckerberg is malevolent to recognise that he has created a monster that he and the tens of thousands of employees who work for him are unable to control.
This insight from an article in Politico really struck us:
It’s time to regulate Facebook
Sorry, Facebook. After years of being told that you could police yourself, and that we should trust you were working hard to keep users safe from your platform’s ill effects, it’s time for someone else to step in. We need an entity whose mandate is to protect society to regulate social media, rather than a company that has its eyes on the bottom line.