Social media behemoth Facebook has been accused of “tearing our societies apart and causing ethnic violence around the world” in a bombshell interview with a former employee-turned-whistleblower, fuelling concern about the tech giant's influence in Australia.
Pressure is growing on the Mark Zuckerberg-led firm to do more to counter hate speech and extremism on its platforms after the identity of the whistleblower behind a recent series of Wall Street Journal articles, which exposed details of internal documents dubbed the 'Facebook Files', was publicly revealed for the first time on Sunday.
Whistleblower Frances Haugen said the internal documents she leaked to the Wall Street Journal demonstrate Facebook had embraced algorithms that amplified hate speech in pursuit of profits, claims the company has strongly disputed.
Ms Haugen called for greater regulation of the company, saying "Facebook over and over again has shown it chooses profit over safety".
What do the revelations mean for Australia?
Dr Kaz Ross, an independent researcher of far-right extremism and conspiracy theories in Australia, said Ms Haugen’s allegations highlight the need for more transparency about Facebook’s inner workings.
“The way the Facebook algorithm works is opaque … we only have their word to take for it,” Dr Ross told SBS News.
“For the whistleblower to come out and say, ‘well, it's actually optimised for hate’, that doesn't surprise me at all, because the ultimate basis of the Facebook algorithm is engagement. And we already know that fear and anger are the emotions that drive engagement.
“So you will have much more engagement on posts that are controversial, that stir up dissent and debate, than you will where everybody's just happily holding hands and agreeing.”
Mark Zuckerberg-led tech giant Facebook has been accused of amplifying hate speech in pursuit of profits. Source: Getty
Despite this, Dr Ross said Facebook had made progress in some areas, including tackling COVID-19-related misinformation and racism.
“Last year, Facebook was a major site for the anti-Chinese sentiment that was expressed in our society, people were setting up fake Chinese-sounding profiles ... which were all basically just racial hate aimed at Asian people in general and Chinese people specifically. That seems to have dropped off a bit this year,” she said.
“It is true that they have responded to what people have said, and they've improved their products by putting in blocks and warnings. They've dramatically improved the ability of people who run Facebook pages to moderate by introducing the ability to turn off comments.”
Facebook, Instagram a 'gateway' for extremism?
Features such as Facebook Live have been exploited by those looking to amplify Melbourne’s illegal anti-lockdown protests, Dr Ross warned, with the platform often acting as a “gateway” to encrypted platforms where extremist content can be shared more freely.
“Facebook Live is one of the prime drivers of interest in the Melbourne anti-lockdown protests,” Dr Ross said.
“The so-called citizen journalists that are out there live streaming from the protests … and they can be on there for eight hours a day. And it is absolutely true that there are hundreds and hundreds of people who've joined the anti-lockdown protests in Melbourne because they’ve seen it on Facebook Live.”
Dr Ross also warned that photo-sharing app Instagram, which Facebook bought in 2012 for US$1 billion (A$1.37 billion), has become a hub for extremism.
“When we focus on Facebook, we have to remember they also own and run Instagram, and that tends to fly under the radar,” she said.
“We’ve seen recently some young people who are facing charges for violence and terrorism were very, very active on two platforms. One is an encrypted app, and the other one is Instagram … Instagram is certainly the home of the COVID-denying influencer, the wellness influencers on Instagram, and the stuff they do on Instagram, I'm not sure that would even fly on Facebook.”
Ms Haugen told 60 Minutes that "the version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world".
She blamed the company for failing to do enough to prevent the US Capitol riots on 6 January, saying safety settings activated to prevent the spread of misinformation prior to the 2020 US election were turned off in order to "prioritise growth over safety". She described the act as "a betrayal to democracy".
“It’s paying for its profits with our safety”, she said, adding she didn’t trust that the tech giant was "willing to invest what actually needs to be invested to keep Facebook from being dangerous".
Who is the Facebook whistleblower?
Ms Haugen, 37, outed herself in an interview with CBS’ 60 Minutes, broadcast in the United States on Sunday night.
Ms Haugen, who worked as a product manager on Facebook’s civic integrity unit before quitting earlier this year, accused the firm of putting “growth before safety” in her interview with journalist Scott Pelley.
Former Facebook employee Frances Haugen has accused the company of "tearing societies apart". Source: CBS
Ms Haugen had been in her role at Facebook for nearly two years before departing. She had previously worked at Google and Pinterest.
She said she left Facebook in exasperation and frustration over the company's handling of hate speech.
Ms Haugen took with her internal memos and documents, which have been shared with the Wall Street Journal as part of its coverage of the story over the past three weeks.
How has Facebook responded to the claims?
Ms Haugen said there was a “conflict” between “what was good for the public and what was good for Facebook”, and that the firm “chose over and over again to optimise for its own interests - like making more money”.
The firm’s vice president of policy and global affairs, former United Kingdom deputy prime minister Nick Clegg, slammed claims the company contributed to the 6 January riots as “misleading”, prior to the broadcast of the 60 Minutes episode on Sunday.
Contrary to Ms Haugen’s claims that Facebook’s algorithms boosted hate and anger, Mr Clegg said the firm aimed to “mitigate the bad, reduce it and amplify the good”.
However, an internal document leaked by Ms Haugen showed that the company estimated it “may action as little as 3-5 per cent of hate” and around 0.6 per cent of “violence and incitement” on Facebook, “despite being the best in the world at it”, with misinformation, toxicity, and violent content “inordinately prevalent among reshares”.
“We have evidence from a variety of sources that hate speech, divisive political speech, and misinformation on Facebook and the family of apps are affecting societies around the world,” another leaked document said.
Facebook published a statement disputing the points that Ms Haugen made after the televised interview.
"We continue to make significant improvements to tackle the spread of misinformation and harmful content," said Facebook spokesperson Lena Pietsch.
"To suggest we encourage bad content and do nothing is just not true."
SBS News put questions to Facebook but did not receive a response by deadline.
In what forums is the whistleblower providing evidence?
Documents have been shared with attorneys general from several states, including California, Vermont and Tennessee.
Ms Haugen filed complaints with the US Securities and Exchange Commission on the basis that, as a publicly-traded company, Facebook is required not to lie to its investors or withhold material information.
The complaints compare Facebook's internal research with its public statements on the issues it researched, according to the 60 Minutes interview.
Ms Haugen's lawyer, John Tye, said she had also spoken with politicians in Europe and is scheduled to appear before the British parliament later this month, in hopes of spurring regulatory action.
Mr Tye said he and his client are also interested in speaking with politicians from countries in Asia, since many of the issues that motivated Ms Haugen stem from the region, including the ethnic violence in Myanmar.
Why is Instagram under fire?
Ms Haugen is also due to testify about Facebook’s practices before a US Senate subcommittee on Tuesday.
The hearing titled ‘Protecting Kids Online’ will examine Instagram’s impact on the mental health of younger users.
The hearing follows a recent backdown by Facebook on plans to launch a version of Instagram tailored for kids under the age of 13, following revelations that the company’s own internal research showed the app harmed teenage girls’ mental health.
Last month, head of Instagram Adam Mosseri announced that the company would halt work on its ‘Instagram for kids’ while it worked on building “parental supervision tools”.
“We believe building 'Instagram Kids' is the right thing to do, but we're pausing the work,” Mr Mosseri said.
“This will give us time to work with parents, experts, policymakers and regulators, to listen to their concerns, and to demonstrate the value and importance of this project for younger teens online today.”