I want to support eSafety, but ...
Should we get behind Australia’s latest attempt to limit the power of digital platforms, or should we be concerned about regulatory overreach?
In the last newsletter I discussed the ways in which social media platform X has rebuffed Australian regulatory authority. I pointed to three separate actions by X, two of which involved the eSafety Commissioner.
What a difference a fortnight makes. In that time, the conflict between eSafety and X has escalated dramatically. On Wednesday afternoon, the two parties faced the Federal Court for the second time. eSafety sought a continuance of its interim injunction, which seeks to give effect to its earlier removal notice ordering X to hide videos of the stabbing of Bishop Mar Mari Emmanuel in suburban Wakeley on April 15. Although X had used geoblocking to restrict access in Australia, eSafety appears to regard this as insufficient because people using VPNs can still access the material.
The removal notice and the application for the injunction prompted Elon Musk to ask: ‘Should the eSafety Commissar (an unelected official) in Australia have authority over all countries on Earth?’ Not surprisingly, this attitude from Musk and X Corp helped to galvanise public opinion in Australia, or at least that of Australia’s main political parties. The Prime Minister said that Musk considers himself above Australian law. The Coalition, the Greens and some independents have all appeared to back the government in standing up to Musk and this international tech company, although Peter Dutton later criticised any attempt to extend Australian law to other jurisdictions.
But is it that clear-cut? Part of the problem in answering this arises from the confusing commentary on the topic and the absence of primary sources such as the original removal notice issued to X by eSafety. For example, on the weekend, multiple news reports relying on a Reuters report appeared to suggest that eSafety was seeking the removal of content without specifying that content. That appears to have been quite wrong: later reports claimed the order concerned 65 tweets, eSafety announced that the order identified ‘specific URLs’, and subsequent discussion turned to the futility of nominating URLs that are replaced by others as the material is reposted.
In trying to make sense of this, there are two legal issues I’ve been grappling with. The first concerns the grounds on which the removal notice was issued – that is, what is eSafety seeking to have removed, and what is it about this content that contravenes the law? We know from a statement issued by eSafety on Tuesday that the removal notice relates to ‘extreme violent video content’ depicting ‘the violent stabbing attack on Bishop Mar Mari Emmanuel’ and that this did not relate to ‘commentary, public debate or other posts about this event, even those which may link to extreme violent content’. And we know from the Federal Court order that the removal notice was issued under s 109 of the Online Safety Act, and that as a result eSafety regards the Wakeley video as ‘class 1’ material. This last aspect means that, if it were classified by the Classification Board, it would fall into the ‘Refused Classification’ category. RC material cannot be sold or distributed in Australia – in other words, it is banned.
While the concept of ‘class 1 material’ comes out of the Online Safety Act and the industry codes of practice that apply to social media and other online services, the RC category itself comes out of the Classification Act, the National Classification Code and three sets of guidelines, including the Guidelines for the Classification of Films (which apply to videos such as the Wakeley stabbing). Content that will be classified RC includes: offensive depictions of violence ‘with a very high degree of impact or which are excessively frequent, prolonged or detailed’ (the Guidelines); videos that ‘promote, incite or instruct in matters of crime or violence’ (the Code); or a film that ‘advocates the doing of a terrorist act’ (the Act). eSafety has not told us which of these or other provisions are engaged here, but in media commentary on Tuesday, the Minister referred to the fact that the video depicts actual violence and that there is a terrorism element. She told Radio National: ‘“Class 1” depicts real violence, it has a very high degree of impact, in a way that's gratuitous and likely to cause offence to a reasonable person. In this case, the very high degree of impact is reached by virtue of the terrorism designation that has been given to this particular event’.
So it could be that eSafety regards the video as class 1 on the basis that it contains actual violence of a very high impact, with the terrorism element contributing to that level of impact. Of course, this is just speculation, and there is a lot of room for argument in classification decisions. Opinions will differ on whether the video alone – in the absence of some other content that might, for example, amount to incitement – is harmful enough to support a blanket prohibition. The content here is not the same as the content we saw in the case of the Christchurch mosque killings – it's not live-streamed content of mass murder, broadcast by the perpetrator himself. If the video were accompanied by comments that, for example, incited racially motivated violence, it’s hard to see many of us objecting to a strong regulatory response. But the video alone?
The second legal issue concerns the scope of the notices issued by eSafety: the ‘all countries on Earth’ point, as Musk puts it. The Online Safety Act certainly supports the removal of material from social media services so that it is not available to people in Australia using the service. The commentary on this matter suggests eSafety is seeking to have the material removed altogether from public access worldwide. Or is it just pushing X to plug the gaps – such as VPN access – that still allow people in Australia to view the video? I don’t know whether the Act would support a more comprehensive, worldwide ban, and in any event there would be all the practical problems of trying to enforce an Australian court order in another jurisdiction. But there is also a question of principle here – whether this is something that an Australian regulator should be seeking to do. When we learn the full story, we might agree that eSafety should take action in relation to an event that occurred in Australia and was streamed from Australia. We might even agree that, as regulators in countries with a similar approach to online safety attempt to work together to address the harms that arise from digital platforms, it’s Australia’s responsibility – in this case – to take action to seek to restrict access more broadly. But a worldwide ban?
This regulator has an outstanding record of explaining to the community the laws under which it works and the reasons for its actions. And I have no hesitation supporting the other action it has taken against X, which is looking more and more like a rogue operator. But on this latest action, if eSafety wants to bring the community with it – including those of us who are inclined to support it in most situations – it will need to do more to explain why the video needs to be banned and why (if this is what it’s seeking) the order needs to be enforced beyond Australia.
This was featured in our fortnightly newsletter.
Derek Wilding, CMT Co-Director