Whistleblower says Online Safety Bill must stop “legal but harmful” content


Speaking to UK politicians, Facebook whistleblower Frances Haugen suggests the Online Safety Bill should address “legal but harmful” content – the type that leads to self-harm

Today (25 October) Facebook whistleblower Frances Haugen spoke to the Joint Committee working on the UK’s Online Safety Bill.

The concept of online harm via social media is not new – there are countless studies and examples of people developing eating disorders or adopting extremist ideologies because of the content that algorithms feed them. The Bill, first proposed after the suicide of 14-year-old Molly Russell in 2017, will be the first real legislation to protect children and adults from online harm.

However, it has been waiting in limbo since 2018.

Frances Haugen revealed that Facebook is fully aware of the harm its platform causes to minors and adults, and that its internal incentives push workers towards creating further harm, because divisive, extreme content keeps people hooked on the application. While this was her standout revelation, she also described other crucial privacy and ethics violations.

“Coming of age in an increasingly digital world”

An investigation found that Instagram was aware of its negative effects on teenage girls as early as March 2020.

UK hospital admissions for anorexia among girls under the age of 19 rose by 193% over seven years, from 2010 to 2017. We asked Dr Lynne Green, a consultant clinical psychologist, about the startling rise in figures.

She explained: “Eating disorders do most commonly affect girls between the ages of 13 and 17. This group has several common factors – the most important being: they are going through puberty, so their bodies are changing; they are female, so our culture particularly subjects them to sexualised and unrealistic body images that they may not identify with; and they are coming of age in an increasingly digital world, so it is difficult for them to create a safe space that is reflective of real bodies and lifestyles.”

When speaking to the Joint Committee today, Frances Haugen said clearly that Facebook has been able to operate without meaningful oversight.

Ms Haugen further explained that the company, which also owns Instagram and WhatsApp, has been able to lie to the public about the realities of using the platform. The tech giant has also consistently chosen its own interests when forced to decide between advancing profits and protecting the public good.

Facebook AI continues to amplify risky content

She pointed to the infamous Capitol riots of 6 January 2021. At some point, she explained, the Facebook team should have stopped amplifying content that promoted plans of attack. Instead, its AI kept sharing more content about the impending riot, and then live footage of the ongoing violence. Despite the “conscientious” and good people working there, she said, nothing meaningful was done because of how Facebook intrinsically works.

Growth is rewarded, not restraint – even when that restraint stems from ethical concerns about amplifying content.

When asked about what the UK should consider in finalising a draft of the long-awaited Online Safety Bill, Ms Haugen emphasised that the Committee should not play “whack-a-mole” on issues. Essentially, the Bill should cohesively tackle the risks of Facebook and other social media platforms, as opposed to isolating one or two issues that could later evolve into several others.

the Committee should not play “whack-a-mole”

She also said that “legal but harmful” content is a key part of what hurts people online. Many of the materials in circulation that damage mental health and encourage radicalisation are not technically illegal, giving moderators a loophole when faced with them. The harm done by such content will continue, especially when it arbitrarily passes moderation.

Call 116 123 to speak to a Samaritan
