EFA has put together a submission to the DBCDE accountability and transparency review on what measures would be required to make mandatory ISP filtering legitimate. Many thanks to Irene Graham and Kylie Pappalardo for their tremendous help in putting this together.
If you want to put in a submission, the deadline is today. You can see our submission for some suggestions.
The point we’re making is that filtering is a terrible idea, but if it absolutely must be introduced, it must be legitimately administered. The only justification for secret censorship that is compatible with liberal democratic values is where the material is illegal to possess and publishing identifying information about censored material would cause direct harm – as may be the case with publishing the URLs or other identifying information of child sexual abuse material. Even in these circumstances, there are stringent requirements for notification, rights of appeal, and regular rigorous oversight of the secret list. In all other circumstances, classification, in order to be compatible with the ideals of liberal democracy, must be carried out in an open and accountable manner.
Our full submission is available here (PDF).
The specific recommendations we are making are over the fold.
- If mandatory ISP filtering is introduced, only the portion of the list of banned URLs that relate to illegal child sexual abuse material should be secretly maintained; the broader category of material that is Refused Classification but not illegal to access or possess should be treated in an identical way to offline classification of publications, films, and computer games;
- The Classification Board is the appropriate body to oversee the classification of all material in Australia (EFA strongly supports ‘Option One’ in part);
- All decisions by the Classification Board should be reviewable by the Classification Review Board on the application of any interested person;
- The ACMA should be obliged to notify Australian and international operators of websites when a URL they are responsible for is added to the list of banned sites (EFA strongly supports ‘Option Two’);
- Any reviews of decisions by the Classification Board and the Classification Review Board should be taxpayer funded, but the cost structure of the Classification Board should be reviewed;
- Any filter should display a page that informs users attempting to access blocked URLs that the page has been blocked and why, links to an online copy of the Classification Board’s decision summary, and provides a means for the user to apply for review of the Classification Board’s decision (EFA strongly supports ‘Option Three’);
- In the interests of legitimacy, transparency and consistency with the NCS, the filter should display a warning page only, rather than completely block access, for the portion of RC URLs that are not child sexual abuse material (i.e. are not illegal to access);
- URLs provided by highly reputable overseas agencies may be added to a mandatory filter where (a) the material is clearly child sexual abuse material; (b) expedited review by the Classification Board is readily available; and (c) ongoing sampling of the integrity and quality of overseas lists against the Australian classification guidelines is undertaken (EFA tentatively supports ‘Option Four’);
- Annual review of the list of blocked URLs and the processes used to generate the list should be conducted by an independent expert and tabled in Parliament (EFA strongly supports ‘Option Five’);
- Annual review should pay particular attention to, and report on, URLs added to the list that fall within a ‘grey area’, i.e. that are not clearly and incontestably illegal child sexual abuse material;
- The Government should investigate the creation of an industry group to consider the administrative arrangements of the bodies responsible for any mandatory filtering scheme (EFA tentatively supports ‘Option Six’).