Hate and rabbit holes: Facebook whistleblower shares "damning" evidence with British panel

London, UK - Former Facebook employee-turned-whistleblower Frances Haugen has made numerous blistering claims about the tech giant, revealing concerns about child safety, the spread of hate speech, and people being pushed towards extreme interests.

Former Facebook employee Frances Haugen has become one of the company's biggest public critics.  © Collage: IMAGO / ZUMA Press & 123RF/gilc

Haugen spoke with British lawmakers and peers on Monday for two and a half hours about the inner workings of Facebook.

The data engineer has risen to prominence after exposing thousands of pages of internal research documents she secretly copied before quitting her job in the company's civic integrity unit.

Haugen began by telling a joint committee that Facebook is "very good at dancing with data" and claimed the firm's own research showed Instagram is more dangerous for teenagers than other social media such as TikTok and Snapchat.

She said Facebook could make a "huge dent" in the problem if they wanted to, but they fail to do so because "young users are the future of the platform and the earlier they get them the more likely they'll get them hooked."

"Facebook's own research says now the bullying follows children home, it goes into their bedrooms," she explained.

"The last thing they see at night is someone being cruel to them."

"The first thing they see in the morning is a hateful statement and that is just so much worse."

"Unquestionably" making hate speech worse

Haugen gave evidence to British lawmakers in London at the Houses of Parliament.  © IMAGO / ZUMA Wire

Giving key evidence to the Joint Committee on the Draft Online Safety Bill, Haugen said there was a "weak spot" in who employees could turn to when escalating concerns within the firm.

She repeatedly said the social network is filled with "kind, conscientious" people, but systems that reward growth make it hard for Facebook to change.

"When I worked on counter espionage, I saw things where I was concerned about national security and I had no idea how to escalate those because I didn't have faith in my chain of command," she claimed.

"I flagged repeatedly when I worked on integrity that I felt that critical teams were understaffed.

"Right now there's no incentives internally, that if you make noise, saying we need more help, people will not get rallied around for help, because everyone is underwater."

Asked about hate speech, Haugen said: "Unquestionably it is making hate worse."

The platform is "hurting the most vulnerable among us" and leading people down "rabbit holes," she added.

Facebook continues to deny claims

A campaigner protesting against Mark Zuckerberg outside the Houses of Parliament ahead of Haugen's testimony.  © IMAGO / ZUMA Press

Elsewhere, the whistleblower claimed that Facebook's artificial intelligence struggles with some languages, suggesting that even content aimed at British audiences might be under-enforced compared with content in American English.

Facebook has repeatedly rejected Haugen's claims, saying her attacks on the company were "misrepresenting" the work it does.

A Facebook spokesperson said that while the firm has rules against harmful content, it agrees that regulation for the whole industry is needed.

"Contrary to what was discussed at the hearing, we've always had the commercial incentive to remove harmful content from our sites.

"People don't want to see it when they use our apps and advertisers don't want their ads next to it."

Reacting to Monday's hearing, Andy Burrows, head of child safety online policy at the National Society for the Prevention of Cruelty to Children, said: "Frances Haugen's damning evidence has highlighted once again that safety is simply not a priority for those at the top of Facebook.

"She was also explicit about the scale of the challenge needed to make the company's services safe for children after years of putting profit and growth first.

"We agree that criminal sanctions would make senior managers take child safety seriously and will be vital to engineer a culture change in which tech firms make their products safe-by-design, not safe after-the-event.

"Transparency is also key and the regulator will need the power to request data, risk assessments and research if it is to successfully hold platforms to their duty of care for young users."

Cover photo: Collage: IMAGO / ZUMA Press
