The pressure is mounting for Facebook to develop more open and responsive ways of dealing with online hate, write Dr Fiona Martin and Dr Jonathon Hutchinson.
It took some persuading, but Facebook has agreed to join an international social media task force to help combat online hate in the wake of anti-refugee xenophobia on its pages.
It's a good outcome for the German Chancellor, Angela Merkel, and her Justice Minister, Heiko Maas, who last week pressed the company to act in line with German law. This came after users complained that Facebook was failing to act on reports of racist abuse and threats.
It's also a relative win for the tech giant, which recently boasted about one billion active users in a single day.
Facebook is keen to avoid any new legislative limits on its operations and to minimise direct censorship. The company has said it prefers to allow debate and discussion, rather than resorting to deletion.
But Germany has now joined a growing list of governments asking Facebook to remove dangerous, offensive or illegal content.
So the pressure is mounting for it to develop more open and responsive ways of dealing with these problems.
The escalating debate about who Facebook should protect, ban or report to local authorities, and how fast it should intervene, is a reaction to the way social media companies are carving out their own transnational, libertarian policies.
To a large extent, this imposes a US free speech paradigm on countries used to more interventionist media regimes, even though the legal limits of that paradigm are being tested by hate speech.
Facebook would much rather that we police the pages and posts we make and read than have to regulate other people's bad behaviour itself. Safety, it argues, is "a conversation and a shared responsibility". Users are advised to keep themselves safe by hiding or deleting offensive comments and blocking abusers.
Where content breaches local laws but not its community standards, Facebook says "we may make it unavailable only in the relevant country or territory".
But neither strategy stops hate posting; both merely reduce its social visibility.
Another way the free speech push plays out is with Facebook's policy on public figures. Its community standards say the company will act on complaints of harassment and direct threats against private individuals, but it allows more critical discussion of public figures.
The company's definition of a public figure is worryingly broad:
We permit open and critical discussion of people who are featured in the news or have a large public audience based on their profession or chosen activities.
This would include academics, journalists and community spokespeople.
The presumption seems to be that people who enter public debate should expect abuse, or that they are better equipped to deal with it than average users. This premise is demolished by the experiences of public figures who have been the targets of sustained abuse on social media.
Facebook's standard partly explains why it didn't immediately act on explicit, sexualised threats made recently against journalist Clementine Ford, after she posted a selfie that included some explicit language written on her bare chest.
Ford claims moderators moved to temporarily ban her because she had breached the community standards. Facebook denies this.
As the company is not publicly accountable for the policing of its standards, we do not have a clear account of what actually happened.
Ford's experience, and that of UNSW, whose Facebook page was targeted twice recently, illustrate the problems Facebook has in managing and accounting for its procedures for tackling online violence.
Facebook's own outline of its reporting process shows how complicated the workflow is for responding to a complaint.
There's frustration among those Facebook business partners who find they can't get a quick resolution to reports of anti-social or illegal activities. The ABC reportedly pressed Facebook for several months to get offensive pages taken down after presenter Jill Meagher's murder.
There's no doubt that Facebook is investing in research, policy and education measures to combat online violence. Its safety centre, anti-bullying resources and other programs demonstrate this.
But promoting self-protection is a small part of a larger equation. Facebook needs more open, collaborative approaches to tackle violence online.
At a recent conference of Australian online community managers, co-founder Venessa Paech noted that Facebook had yet to formally consult members of its network about the efficacy of its universal standards.
She said the community managers were keen to give feedback about the challenges of applying these standards across very different types of communities, many of which are built on Facebook pages or groups.
As one of the world鈥檚 largest digital intermediaries, Facebook is at the vanguard of a new industry sector that is confronted by violent online behaviour every day.
So while the company is rightly wedded to the free and open credo of internet communication, it has to recognise that collaborative policy development, with governments and professionals, is paramount.
It's the principle of working with all your stakeholders, rather than on behalf of them, and it's vital to our mutual investment in social media.