There are calls for social media companies to more closely police their platforms after the Christchurch mosque massacre. Videos of the attack were streamed live by the attacker, and viewed and shared online by millions around the world. Social media experts say companies need to learn from the incident.
For Sydney lawyer and devout Muslim, Ahmed Dib, news of the terrorist attack in Christchurch was beyond comprehension. But what added to the horror was seeing it unfold on social media, as the gunman live-streamed the massacre on Facebook.
"It's absolutely heartbreaking when one considers that I take my son to the mosque and, in a place of worship where you're meant to feel safe, you're meant to feel at one with God. One can only think about someone being murdered, but we saw a whole mosque and group of people at a mosque massacred. Nobody wants that to be online."
It's prompting calls for social media companies to monitor their platforms more vigilantly, to ensure videos of events like the Christchurch massacre can't be widely circulated online.
Facebook says it removed 1.5 million videos of the attack in the 24 hours after it, and is working to ensure content it terms "violating" is removed as fast as possible from the platform.
YouTube and Twitter say they have also been active in trying to stop the video circulating.
New Zealand Prime Minister Jacinda Ardern says it's important similar videos don't get a chance to go viral in the future. "I would call on our social media platforms of all variety to demonstrate the kind of responsibility that led to these events, and that includes those who perpetuate the messages in the aftermath. There's a lot of work that needs to be done," she said.
Social media experts agree.
Most social media platforms have complex algorithms able to quickly locate and remove content in violation of copyright law.
Questions are now being asked about why similar safeguards against violent content aren't as robust.
Dr Dinesh Kanchana Thilakarathna, a lecturer in Distributed Computing at the University of Sydney's School of Computer Science, hopes society can learn from the Christchurch attack. "This type of terrorist event will add urgency to the industry and academia to find the right tools to prevent these types of inappropriate content-sharing and come up with the right tools to automatically detect these types of activities," he said.
Dr Emily van der Nagel, a lecturer in Communications at Monash University, says until then, a handful of mitigation strategies could be viable.
But strangely, she says, one possible tactic wasn't deployed. "Shutting down these platforms, or even just, for example, the live-stream component of Facebook or the auto-play video function on Twitter, closing that for a brief period of time, I think, would have been a completely acceptable option. To be honest, I'm surprised that's something these platforms didn’t even seem to consider."
Dr van der Nagel says social media platforms need to be able to manage what people post.
"Social media platforms have an increasing role, not just as neutral facilitators of social media content, but also as publishers. They have a role, not the only say of what goes up, but they certainly have a role to play in what goes on across their platforms and what stays there."
Prime Minister Scott Morrison has added his voice to others calling for social media companies to be more accountable for what is posted on their platforms.
The Prime Minister says it is not enough for these companies just to reap the financial benefits, without also accepting the moral responsibility to ensure their sites aren't used to foster hate.
"If you can write an algorithm to make sure that the ads they want you to see can appear on your mobile phone, then I'm quite confident they can write an algorithm to screen out hate content on the social media platforms. So we have to work with them."
Mr Morrison is not only putting pressure on social media companies in Australia, but wants to see world leaders address the issue collectively.
He says social media companies need clear guidelines on how all nations expect them to monitor hate speech. "Australia can take action in this area and we are looking at some practical proposals in this area right now. But for those actions to have greater impact on social media companies and the technology companies, it has to be done in concert with other big economies around the world."
"These social media companies have built this technology, they've created these capabilities and in the overwhelming majority of cases they're available for peaceful and happy purposes, but we do know that they can be used and weaponised by terrorists," he added.
Scott Morrison has written to Japanese Prime Minister and G20 chairman Shinzo Abe, urging him to make social media governance a priority at the upcoming summit in June.
In the letter Mr Morrison wrote it is "unacceptable to treat the internet as an ungoverned space".
The Prime Minister also highlighted the need to develop clear consequences for technology and social media companies should they facilitate the spreading of hateful acts.
Mr Morrison is hopeful the G20 Summit in Osaka will provide the ideal environment to discuss and create guidelines for social media companies.
"The G20 has worked together to make sure these big companies pay their taxes, (so) I'm sure we can work together to make sure they protect our citizens by ensuring that their tools that they developed are not used by terrorists as weapons to advance their agendas of hate."
The government's criticism of the major social media platforms has received bipartisan support, following the Christchurch terrorist attack.
Opposition leader Bill Shorten says social media companies need to be responsible for what they allow to be published, and have a moral obligation to remove hateful content.
“Social media should not be a hole for the haters to hide in. Social media and those digital keyboard hitmen and the digital underbelly of it, shouldn't be able to carry out their hate speech without accountability."