Is transparency enough?
It’s Senate estimates week, and two topics on the agenda for the Environment and Communications Committee are the forthcoming legislation on age limits for social media and the misinformation bill currently before the legislation committee. We’ve previously raised concerns about both of these proposed reforms. One of those concerns is the government’s piecemeal approach to platform regulation, which seeks to address particular areas of harm without confronting the fundamental problem: the lack of accountability in how platforms operate.
The misinformation bill, at least, focuses on platform systems and processes. Clauses 67 and 68 of the bill leave discretion for content moderation with platforms; ACMA will have no role in assessing particular content, and under questioning by the committee, ACMA indicated that its main concern is transparency. But is transparency sufficient for accountability? Senator David Sharma pursued this line of questioning, though he was warned by the chair not to stray into territory properly considered in the committee inquiry on the bill. The rub is this: if discretion for content moderation decisions and policies lies entirely with platforms, on what basis will ACMA assess platforms’ compliance with their obligations? In other words, how will they be held accountable?
ACMA’s responses suggest that it considers the bulk of the work will be done by transparency obligations. If platforms do not have systems in place, or if they’re not transparent about those systems, then ACMA would take enforcement action. But in addition to promoting transparency, the purpose of ACMA’s new powers, as expressed in the explanatory memorandum (EM), is to “hold digital communications platform providers to account for the effectiveness of actions taken by them to counter the spread of misinformation and disinformation on their services” (my emphasis). When Sharma pressed on this point, ACMA chair Nerida O’Loughlin said that it is for platforms, not ACMA, to judge the effectiveness of their systems. ACMA would conduct research to see what the community thinks of platform efforts, and it would look at whether complaints were properly handled, but “We won’t make a decision about whether they’ve got it right or not.”
Similarly, with respect to Free TV’s ongoing review of the commercial television code of practice, ACMA indicated that it would ask Free TV to provide all submissions to the review, so it could gauge whether community sentiment was appropriately taken into account. The difference there, of course, is that ACMA has the power to hold broadcasters to account for the outcomes of their complaints-handling processes.
While the approach in the misinformation bill limits the potential for ACMA’s powers to intrude unduly on freedom of expression, there’s a risk that the bill will provide transparency without genuine accountability. For the latter, someone needs to be charged with, inter alia, assessing whether platform complaints are correctly resolved. Under the current voluntary code, this role is performed by an independent complaints committee, and perhaps ACMA contemplates such a body overseeing complaints under the co-regulatory system as well. But to provide genuine accountability, the bill should establish a body, independent of both industry and government, to determine whether complaints are correctly resolved; it should also require platforms to take account of that body’s advice in their content moderation approaches, and ACMA to do so in its compliance assessments.
Still, this would provide accountability only for a small part of digital platform operations. For full accountability, a more comprehensive and holistic approach to platform regulation is needed.
Michael Davis, CMT Research Fellow