At your discretion
The release of the exposure draft of the Combatting Misinformation and Disinformation bill in July last year was met with widespread criticism – less a chorus than a cacophony. Chief amongst many fears was the potential threat to free speech from new powers the bill would grant to the Australian Communications and Media Authority. More than a year after consultation on the exposure draft closed, the government has finally introduced a revised bill to parliament.
The revised bill includes several changes that should allay some of the concerns over free speech. Most significantly, clause 67 proscribes the imposition of any rule that would require a platform to remove content or user accounts, except where such a rule would address ‘inauthentic behaviour’ – the use of automated systems or coordinated action to deceive users, a category the bill defines quite narrowly. Conversely, nothing in the bill prevents a platform from removing content or user accounts. Content-moderation policies are therefore left to the discretion of digital platforms. This change gives substance to the government’s claims that the legislation is designed to promote transparency and accountability over platform systems and processes.
The revised bill also sets out minimum requirements that apply regardless of whether a code has been registered or a standard imposed by ACMA. These include obligations to conduct risk assessments and publish the results, to publish a policy or provide information on the platform’s approach to misinformation, and to publish a media literacy plan explaining how the platform empowers users to identify and respond to misinformation. ACMA may augment these requirements by imposing rules relating to these areas as well as to user complaints and record keeping. These new provisions significantly improve the legislation’s ability to promote platform accountability and transparency.
Unfortunately, other problems remain unresolved. Like the exposure draft before it, the revised bill undermines its own accountability goals by setting too narrow a scope: its restrictive definitions of misinformation and disinformation require a reasonable likelihood of serious harm, and it excludes professional news and private messaging. A narrow scope protects freedom of expression from government overreach. But it also limits accountability by excluding the majority of platform content-moderation decisions from the bill’s reach. If a platform chooses to remove my post (or, say, a professional news story), but that post does not meet the threshold of serious harm, then this bill will not ensure I have recourse to dispute the platform’s decision. It is outside ACMA’s powers to impose a rule requiring a platform to provide me with such recourse.
Leaving content moderation to the discretion of platforms means that platforms are effectively the arbiters of truth. Better than government, you might say, and I would agree. But platforms undeniably hold power over our freedom of expression, and there is no guarantee that they will act to protect and promote it. Yet protecting freedom of expression is as much an objective of the legislation as mitigating harm.
As I’ve noted before, this might seem like an unfortunate but inescapable dilemma. It isn’t. We can maximise platform accountability while minimising the threat of overreach. Because the discretion to moderate content remains with platforms, there is no need to be unduly restrictive about the scope of content or services to which the bill applies.
Michael Davis, CMT Research Fellow