Wearing my EFA Board Member hat, I spoke today at an event at Parliament House hosted by the Menzies Research Centre in a debate with Tony McLellan of the Australian Christian Lobby. The audience was primarily members of the Australian Liberal Students Federation; young Liberals destined for jobs as political staffers and politicians.
Below is the text of my part of the debate.
Let me begin with a short anecdote.
On Monday night, as we watched Four Corners and Q&A, my not-quite-13-year-old daughter, Hannah, made a particularly interesting observation. “Gee, Dad,” she said, “I think I’ve just seen more rude pictures in that story than I’ve ever seen on the Internet.”
Hannah has been using the Internet since she was four.
Certainly, much of that time it has been under our supervision, but increasingly it’s not. When Hannah uses the Internet, she uses a connection at home that is completely unfiltered: we filter neither at the router nor via the fairly comprehensive parental controls that come as a standard part of modern operating systems. She has administrator access to the machine she uses and she also knows and understands how to access and manage the home network.
Knowing I was coming here today, I conducted something of a straw poll of that observation amongst friends and acquaintances with kids of a similar age. I deliberately avoided asking only “‘Net savvy” parents.
Universally, the experience was the same: none of our children had ever inadvertently encountered pornographic or other offensive material on the Internet, let alone material of the kind that falls under the umbrella the National Classification Code defines as Refused Classification. None of the children had filtered or managed Internet connections. All of them used computers placed in public spaces in their homes, and several had their own computers in their rooms.
The most recent research into public opinion on the filter, carried out by the Safer Internet Group (whose members include Google, the Internet Industry Association, iiNet, the Australian Council of State School Organisations and the Australian Library and Information Association), shows a marked increase in doubts about the filter amongst parents.
There is significant opposition to the government’s filter as proposed. Parents first want greater education options and at-home filtering and, as a next-best option, an opt-in filter. Mandatory filtering runs a long last.
Our friends internationally have also come out publicly against the filter as it stands, most notably the US Ambassador to Australia, Jeff Bleich, speaking on Q&A. Ambassador Bleich, an internationally recognised authority on human rights, was particularly clear when he said:
“We have been able to accomplish the goals that Australia has described, which is to capture and prosecute child pornographers … without having to use internet filters. We have other means and we are willing to share our efforts with [the Australian government].”
The arguments of the government and its supporters in favour of the filter regularly hang on the matter of RC material. On this, I’d first like to highlight two matters of interest that seem to cause some real confusion.
First is the myth that all RC material is illegal. This is simply not true.
The fact is that of all material classified RC, only material depicting the sexual abuse of children is illegal to own. For good reason. No reasonable person in today’s society believes that such material is suitable for adults to access, let alone children.
Material that falls under the RC umbrella is unquestionably sometimes distasteful or controversial, or contains or depicts concepts of an adult nature: drug abuse, explicit material about abortion, guides to assisted suicide, violence. Whether you personally approve of such things or not, none of this material is illegal to possess in this country; it’s perfectly legal for me or you to own a copy of Baise Moi or The Peaceful Pill, just not to make it available for sale.
Yet the filter seeks to change this. Our classification system in Australia largely works and is designed to empower adults and minors alike to make appropriate, relevant choices. When implemented (and have no doubt, the government’s plans for the filter are far from abandoned), it will take away adults’ ability to decide for themselves whether or not to access material that is, by and large, legal in this country.
Second is the fantasy that stumbling across RC material on the public web is something that occurs with frightening regularity. It’s not even easy to stumble across R- or X-rated material, not all of which is pornographic in nature and none of which will be targeted by the filter. You have to go looking for these things very deliberately. Looking for material that is RC is even harder.
The material the government proposes to filter is, in some cases, completely appropriate to access. As for that which is not, child sexual abuse material, it is well known that the criminals who trade in it do so using tools and protocols that will not be touched by this or any other filter. Rather, criminals trade their material on private networks.
Additional dollars and human resources for law enforcement by the Australian Federal Police ought to be supported. It is only through the diligent and successful efforts of the AFP and its overseas collaborators that those people purveying child sexual abuse material are apprehended and put in jail where they belong.
Let’s look in turn at a number of the other issues around the proposed filter.
First, the matters of cyber-safety, education, self-determination and digital citizenship.
There is no question that as adults and particularly as parents, we wish to protect our society and children from danger and from exposure to deeply offensive or inappropriate material. Certainly, as a father, this is paramount in my concerns.
In order to do this, I have a responsibility. As a parent and a member of society, it is incumbent on me to educate myself, my child and those I come into contact with about issues such as good digital citizenship and appropriate online behaviour. Doing so helps us, in particular, to protect ourselves from threats the filter will not even address: cyber-bullying (and bullying in the flesh-and-blood world), online predators and identity theft.
These issues are certainly much higher in the minds of the parents, teachers and students I speak to regularly as a part of my work than are matters like RC.
Despite the marked increase in this country of policy that erodes our freedoms, pushing back against personal determination and our ability to make decisions for ourselves, the fact is that the vast majority of Australians are not complete dullards who need the Nanny State to tell them how to run their lives. Rather, they are perfectly normal, intelligent people who are capable of self-determination, of critical thinking and decision-making.
Australian parents are largely not irresponsible and incompetent at bringing up their kids. Most of them are entirely the opposite, doing a fine job of parenting and making appropriate decisions about child rearing. They are perfectly able, as parents and adults, to decide what is and isn’t appropriate for their children to see online and elsewhere. Equally, they are able to teach their children, with help from educators, law enforcement and others, how to behave as reasonable digital citizens.
The millions of dollars the government proposes to spend on the filter (a technology that will not actually work as advertised and will be easily circumvented) would be far better spent on law enforcement and on thorough programs that help teachers and parents educate themselves on the risks: how to manage their own and their children’s access to the Internet, how to behave appropriately online and, where they wish to, how to filter their own computers directly and by choice. Filtering at home is provably the most effective form of filtering, and it places the power firmly in the hands of individual people rather than in the hands of a government.
Research studies both here and overseas provide strong evidence that the risks to minors of exposure to unwanted material, by which I do not mean only illegal material, are considerably overblown. Children are not irreparably damaged by seeing things that may be distasteful or inappropriate online, particularly if they are surrounded by a framework of parents, mentors, educators and other support services that can help them make sense of these things.
Even if some form of filter is ultimately introduced, it would be far better if it were opt-in rather than mandatory, as it was in Labor’s original pre-election policy. This leaves the decision-making in the hands of parents, where it belongs. Indeed, many opponents of the current filter scheme have stated that their objections would largely be mitigated if opt-in were the choice.
I don’t want to spend a great deal of time on the technology, as the concepts here have been argued at length and in detail by others. Suffice it to say that, in spite of Senator Conroy’s arguments to the contrary, there are major technical issues with the filter that remain unanswered or lacking in enough detail to be satisfying:
- secure web sites, such as those we use for online banking and e-commerce, cannot be filtered without making them less secure
- there remains a risk that if a popular and culturally valuable site such as Wikipedia, the National Gallery of Australia or YouTube were subject to a filtered URL, overall access to that site may be measurably degraded
- filtering has not been tested at all on networks running at the speeds the NBN will introduce
- only material published on the web will be subject to the filter; other distribution methods such as BitTorrent, email and instant messaging, often used by criminal networks to distribute offensive material, will not be
- bypassing the filter is, as admitted by Senator Conroy on more than one occasion, a trivial exercise, even for relatively non-expert users
- mandatory filtering is less flexible and customisable than home-based, on-router or on-computer filtering
All of these issues require evidence-based, thorough answers.
The blacklist itself is problematic on a number of fronts. These too have been discussed at length, but let’s look at them briefly.
The list is secret. In a world where open government in modern democracies is receiving significant attention, this is, at the very least, interesting. We hear arguments that a secret list protects us from exposure to the URLs that contain the offensive material. However, if the URLs are filtered, in what way do we risk exposure? The argument fails its own logic. Beyond that, it’s simply offensive to me to think that any government believes that I am incapable of enough independent thought to determine what URLs I do and do not visit.
By its very secrecy, if my website ends up on the blacklist, I am unable to know how and why it got there. It’s also unclear how I get off the list if I’m there unjustifiably. What happens if someone opposed to your political views or faith manages to get your site on the list?
Secret things have a tendency to leak through the cracks. The blacklist has already been leaked once. It’s not inconceivable that it will happen again. And again. And again.
The list is tiny. In a world where the public web is now in the trillions of pages, a list of something around 10,000 URLs barely scratches the surface of any pool of offensive, let alone illegal, content that may exist.
Which brings us to criminal networks distributing child sexual abuse material – I’ve already mentioned this, but it bears repeating – these networks do not use the public web to distribute their wares. The technologies they do use – private networks and peer-to-peer – will not be filtered.
The only effective way the distribution of this illegal material can be stopped is through active law enforcement. The AFP has a highly competent cybercrime unit that could be more effective if it was the beneficiary of additional funding and resources.
Last, to matters of filtering and free speech.
Senator Conroy, on Monday night’s Four Corners, stated clearly that for the purposes of the filter, his government’s policy was to filter RC content only and that he would be amongst the many voices raised in protest should some subsequent government decide to broaden the scope of the filter.
The filter covers material that is legal in other forms and media. It lacks accountability and appealability, which is at odds with our open democracy and markedly different from equivalent classification decisions in other media, which are open to scrutiny.
While the Senator’s and the government’s hearts may certainly be in the right place, we cannot be so certain about unknown future governments and their thoughts on what could and should be subject to filtering. It is entirely possible that over the long term not only RC material will be subject to it; dissenting political voices, matters of taste or the voices of certain faiths may also be censored.
So, here’s a summary of the issues as I see them:
- there’s no serious Internet content problem to solve – you just can’t inadvertently stumble on RC or child porn on the Internet
- even if there was, few want the government to solve it this way – there are better, more effective, more workable and more societally acceptable options
- the technology presents a real risk – we’ve seen the trial results and the extensive analysis which points out the flaws
- the blacklist itself is a problem – it’s secret, it’s unappealable, it deals with material that remains legal, and it’s already been leaked and will be again (you’ve heard of the Streisand Effect, right?)
- the filter will not address criminal distribution of illegal material – it’s far better to ensure funding and resources for law enforcement, who are the only people equipped to deal with this problem properly
- the filter impinges on the freedom of Australians to determine things for themselves – it represents a real shift in the ability of Australians to decide what is and isn’t appropriate for them to view online, and it significantly reworks a fairly workable classification system, built for other media, to cope with a medium that is changing rapidly
- the filter will be administered by governments ill-equipped to do so – the technology and policy are complicated and problematic. We’ve seen several policy and program stumbles lately; do we want another over this?
- there is no guarantee that future governments will not change the scope of what is filtered – the suppression of material based on moral or political grounds is anathema to what Australia is about
This is far from a simple issue.
I’d like to close with a few words from Will Briggs, an Anglican priest from my wife’s home town of Somerset, Tasmania. Will is a strong voice in the discourse on the filter. He said:
“[This issue] is best [addressed] through clear information, balanced argument, reasoned debate…[on the] multiplicity of issues… [it is] a debate which is not simply about sexual ethics but about freedom of speech, the reductionism of morality, and the role of government in society… by… simplifications in this case [we] look like simpletons.”