Analysis

Where to next in Australia's battle with Elon Musk and X?

1 May 2024
The Australian government continues to battle with social media platforms
Terry Flew, Professor of Digital Communication and Culture, says the Australian government's battle with Elon Musk and X over violent content appears admirable, but will it change anything for those vulnerable to its harm?

Over three days in April, Australia's delicate social media ecosystem was blown apart.

The thin line between online regulation and the circulation of disinformation unravelled. In its place now sits a new polarised debate between free speech absolutism and the safeguarding of users from violent content.

Two stabbing attacks in Sydney - including one which was live streamed - sharpened the focus on the significant role social media platforms play when public violence occurs. After the attack that killed six people and wounded many others in the Bondi Junction Westfield shopping mall on April 13, social media users wrongly identified the attacker. The false allegation was picked up by one of Australia's leading commercial TV networks before it was corrected, leading to racist and anti-Semitic online commentary about the wrongly named individual.

The stabbing of Bishop Mar Mari Emmanuel two days later at an Assyrian Church of Christ in south-western Sydney was live streamed to the world, sparking demands that the raw, unedited footage be removed from social media platforms altogether.

Australia's eSafety Commissioner ordered social media platform X to remove graphic videos of the church stabbing from its site.

While X complied with the requirement to delete the content from its Australian sites, it rejected the call to apply a global takedown. In doing so, it risks fines of AUD$782,500 per day for failing to comply with the directive by the eSafety Commissioner, Julie Inman Grant, to take down material that would be refused classification under the Classification Act.

Elon Musk vs Australia

X owner Elon Musk mocked the Commissioner as "the Australian Censorship Commissar" and posted an image associating X with a brightly-lit castle proclaiming free speech and truth, and other platforms with a castle beset by dark clouds and lightning, pointing to censorship and propaganda.

Musk's self-proclaimed stance as a "free speech warrior" is part of his global branding of X as "anti-woke".

An unusual degree of political bipartisanship emerged in the responses of Australian political leaders.

Prime Minister Anthony Albanese called Musk an "arrogant billionaire", while Assistant Treasurer Stephen Jones was similarly scathing about X. Opposition Senate leader Simon Birmingham said that "They (social media companies) absolutely should be able to quickly and effectively remove content that is damaging and devastating to the fabric of society," while Greens communications spokesperson Sarah Hanson-Young described Musk as a "cowboy … making money and profiting off outrage and hatred."


eSafety and compliance

The conditions are now in place for a protracted battle between X and the Australian Government about whether compliance can be forced on the company to a directive of the eSafety Commissioner. Three issues can be identified as likely to play out.

First, there is the question of whether Australian Internet laws can be extended internationally. X has argued that the Australian eSafety Commissioner cannot demand a global takedown of content, as such decisions can only be made through international law. In response, the Commissioner argues that the use of VPNs and other devices to evade geo-blocking means that violent content that is clearly illegal under Australian law can still be accessed by Australians.

Second, the case presents a significant challenge to a model of digital platform regulation that combines heavy fines for breaches of guidelines with the expectation that industry self-regulation and corporate social responsibility will mean those fines never need to be enacted in practice.

Such an approach was designed as a way of "ratcheting up" platform conduct without directly regulating online content, by putting in place penalties sufficiently severe to constitute a credible threat to the companies' financial bottom line. As eSafety Commissioner since 2017, Julie Inman Grant has referred to this approach in the Australian context as "safety by design", working with tech companies to incorporate higher regulatory standards into their everyday business practice.

X's experience in Australia draws attention to the limits of this "safety by design" approach. X withdrew from the Australian Code of Practice on Disinformation and Misinformation (ACPDM), administered by the Digital Industry Group Inc. (DIGI), after a complaint against it was upheld in a case brought by an advocacy group. As a result, X effectively sits outside the self-regulatory framework to which, as Twitter, it had originally been a signatory. The company is clearly prepared to contest fines in the courts rather than choose the path of compliance assumed under the co-regulatory, safety-by-design model.

The question arises as to whether the Australian Federal Government can, or should, set in place its own laws to govern X's conduct on issues such as misinformation and content regulation, given the apparent limits of co-regulation in enforcing X's compliance with industry guidelines. The Australian Government is reintroducing its Combating Misinformation and Disinformation Bill to Parliament after an extensive consultation process.

The relationship between the proposed misinformation legislation and existing powers under the Online Safety Act will be the subject of considerable debate, and recent developments have given new impetus to calls for governments to set rules for the conduct of global social media platforms.

Finally, the case of X and the global reach of Australian Internet laws points to a broader set of issues around national governments and global digital platforms. What has been termed the "regulatory turn" in Internet governance has seen governments increasingly seek to regulate platforms in areas such as competition policy, content regulation, dealings with content providers such as news publishers, and ethical issues related to the uses of artificial intelligence (AI).

The impetus for such measures has often been the sense that the global tech giants simply disregard requests to change and use their market power to steamroll governments.

Until now, this has only disempowered citizens seeking some form of agency against these tech giants. A change of posture from governments could help shift that narrative.


Terry Flew is a Professor of Digital Communication and Culture at the University of Sydney's Faculty of Arts and Social Sciences. Professor Flew is leading a team of researchers in developing the International Digital Policy Observatory, an online database to track policies and regulations dealing with misinformation, AI regulation, online harms, cybersecurity and digital identity.

This story was originally published under Creative Commons by 360info™. Top photo of Elon Musk by Jordan Strauss/AP/AAP Photos
