- Posted on 10 Apr 2025
- 5-minute read
With the election campaign in full swing, the Australian Electoral Commission is taking a proactive approach to tackling the inevitable challenges posed by the digital media environment. It is once again actively countering electoral misinformation on social media, often humorously, despite the seriousness of its task. In January, the AEC announced the Voter’s Guide to Election Communication, a compendium of advice for voters to promote digital literacy ahead of the election. It expands on resources implemented during previous national votes, such as a register of disinformation on electoral processes, and the ‘stop and consider’ advertising campaign. But it has limitations.
A series of videos offers advice about checking for reliable sources, being sceptical about motives, reducing impact by not forwarding misinformation, and being more aware of manipulative communication strategies. Reinforcing this message, the communication tactics catalogue explains common manipulative techniques, while the communication channels catalogue provides advice on common questions that arise during election campaigns, including whether bulk text messages, lies or deepfakes are legal. Another page provides advice on political influence, including transparency requirements for donations and restrictions on foreign influence.
There is also a dedicated webpage on AI and elections and video explainers on AI-generated misinformation and deepfakes. The webpage rightly observes that there is little evidence so far of deepfakes undermining elections, and a recent statement from the AEC noted that focusing on the perceived risks of AI to democracy “could by itself damage public trust in democracy. We need to remember to keep things in perspective.” Nonetheless, the webpage offers visitors advice on how to spot deepfakes, as well as where to report content of concern.
What does concern the AEC is a narrow range of communications that mislead or deceive an elector ‘in relation to casting a vote’, as well as communications that do not include required authorisations. The narrowness of its concern is no fault of the AEC’s, since its powers are circumscribed by the Commonwealth Electoral Act. But it does result in the omission of a whole range of information that could support the public sphere in critical election periods.
For example, the catalogue of communication tactics draws on research in inoculation theory, an experimentally supported method of building resistance against misinformation and manipulation. But, as with the theory itself, the focus on individual psychology leaves a whole lot out of the discussion. Thus, while there is reference to political messaging and online media, the catalogue is essentially a compendium of logical fallacies. It includes little information, for example, about how disinformation campaigns work at scale. Understanding logical fallacies is important to critical thinking, but this is not much of a defence against efforts at mass manipulation using bot networks, sockpuppet accounts or state-backed troll armies, orchestrated attacks on trustworthy media, or more humdrum but disingenuous political communication strategies like astroturfing. Nor is it much of a defence against simple partisanship. And like the idea of a carbon footprint, relying on media literacy can deflect responsibility onto the individual rather than addressing systemic issues.
Information on some of these issues is provided on various government websites, including Home Affairs, ACMA, and the eSafety Commissioner. The government has also provided funding for digital literacy education in schools. But this fragmented approach to the problem won’t count for much if we don’t demand greater efforts from politicians, media and digital platforms to improve the digital public sphere.