The disinformation dilemma

Since public consultation on the exposure draft of the combatting misinformation and disinformation bill closed last August, there have been few clues about what changes the government is considering in response to the feedback. One clue came in April, in the issues paper for the review of the Online Safety Act, which indicated there would be ‘new measures to improve protections for public debate, freedom of speech and religious expression, improve the transparency and accountability of platforms’ decision making, and improve public visibility into the efficacy of platform misinformation and disinformation strategies.’
Another clue came last week in a submission from the Department of Infrastructure, Transport, Regional Development, Communications and the Arts to the ongoing inquiry into Social Media and Australian Society. There, the department states that the bill ‘would not enable ACMA to request [that] specific content or posts be removed from digital platform services and includes strong protections for privacy and freedom of speech. For example, professional news content, online content related to satire, reasonable public debate (i.e. academic, scientific, religious or artistic), and private messages would be exempted entirely.’
This suggests the government has taken account of the free-speech concerns raised by the exposure draft – concerns that, in our view, were not without foundation, if somewhat overblown – and that the limitations on ACMA’s powers will be clarified. And while the final form of the bill remains to be seen, it also suggests that ‘excluded content for misinformation purposes’ will be expanded to cover ‘reasonable public debate’, alongside professional news, satire and private messaging.
These changes will no doubt be welcomed by many critics of the bill. But as we have previously argued, narrowing the scope of ACMA’s powers by excluding certain types of content also narrows the range of content for which platforms can be held accountable. The bill thus fails to acknowledge the restriction that platforms themselves impose on freedom of expression when they moderate user content. Excluding a class of content does not prevent platforms from moderating that content – it only limits ACMA’s power to hold platforms accountable for how they carry out that moderation. ACMA will therefore have no power to request data on how many professional news items a platform has removed, or why. Nor will it be able to ensure that platforms provide news companies with that data, or with access to an appeal mechanism when a news item is removed.
This is not idle speculation. YouTube’s removal of Sky News Australia videos under its Covid misinformation policy is an example of an action that would fall outside the bill. Excluding such actions will not prevent YouTube from removing professional news videos in the future, but it will certainly mean the platform cannot be held accountable under the bill for doing so.
In this way, the bill undermines its own objectives by failing to hold platforms accountable for a wide range of content-moderation measures. This might seem like an unfortunate but inescapable dilemma: surely it is more important to limit government power over online speech than platform power. But, as we argued in our submission on the exposure draft, we can in fact do both. By explicitly limiting ACMA’s powers to functional matters relating to ‘platform systems and processes’, there is no need to be unduly restrictive about the scope of content or services to which the bill, or an industry code, applies. This would increase accountability for platforms while allaying concerns about government overreach.

Michael Davis, CMT Co-Director