First reaction: Discussion paper on Enhancing Online Safety for Children

Source: iriskh | Flickr http://www.flickr.com/photos/irisphotos/4905621374/

A discussion paper on Enhancing Online Safety for Children was released by the Commonwealth Department of Communications on Wednesday.

The paper opens up public consultation on measures to improve the online safety of children in Australia, specifically:

  • the establishment of a Children’s e-Safety Commissioner
  • developing an effective complaints system, backed by legislation, to get harmful material down fast from large social media sites, and
  • examining existing Commonwealth legislation to determine whether to create a new, simplified cyber-bullying offence.

EFA’s initial position on the discussion paper – which is not a policy document and should not be read as such – is that, while we are not endorsing the document at this stage, we are cautiously optimistic that it is asking reasonable questions.

We are concerned about the potential for overreach, about the vagueness of some of the proposed standards (offensive content is not always bullying), and about who may end up making decisions, but we are interested in a serious public discussion about what is already being done (and perhaps not acknowledged) and what could be done.

Organisation and buy-in

The document proposes a range of options for dealing with the problem of cyber-bullying. The proposal of a Children’s e-Safety Commissioner raises the spectre of a powerful ‘tsar’-like position, but Option 4, “designation of a non-government organisation with expertise in online child safety” (p. 7), provides for a body built around an existing organisation with cyber-safety expertise and including representatives from industry, government, and civil society. That kind of multi-stakeholder model is important, and potentially more powerful than legislation alone.

EFA has seen a successful version of a similar NGO ‘clearinghouse’ in Japan’s Internet Content Safety Association (site in Japanese; see this English-language news report from 2011), which was set up to filter and delete child pornography.

I met with Yuta Uemura (ICSA and Yahoo! Japan) and Nao Fukushima (Mitsubishi Research Institute) about the ICSA last year. They reported that they have never received a complaint about repressing freedom of expression, either from civil society or from the corporations involved. Such an NGO is the preferred option both because it involves buy-in from all the stakeholders and because it limits the potential for overreach by any one player.

The ICSA is funded by industry and government, with a dedicated staff member from each of the major social media companies and Internet Service Providers. Its actions sit under government oversight and within a legislative framework, but around 80% of compliance is voluntary.

One of the reasons for working hard to achieve buy-in is that many social media companies already have mechanisms for dealing with problematic content such as abuse and cyber-bullying. Arguably, improving the clarity and publicity of those processes might go a long way towards dealing with this issue without the need for a new NGO or an elaborate new legislative framework. On the other hand, if such companies were part of an NGO, they would have an incentive to help create an excellent clearinghouse for information about all of their mechanisms. It might also limit the need for sanctions against non-compliant companies, especially since many of the companies involved are foreign-based.

Defining platforms

The document also recognises the difficulty of defining the platforms on which cyber-bullying takes place, and uses well-respected academic research (boyd and Ellison, 2007) to frame that definition process. There has been some reported criticism that the document names particular social media organisations, such as Facebook and Yahoo!7, while omitting newer apps such as Snapchat. I think this misreads the document. My reading is that those sites were named because they voluntarily signed up to the 2013 Cooperative Arrangement for Complaints Handling on Social Networking Sites. While we should always be cautious about overreach, the discussion document specifically asks how to define the relevant platforms, and that is a reasonable question.

Filtering?

The document does not contain direct filtering or censorship measures, which shows at least some commitment to separating online freedom of expression issues from safety issues. That being said, one of the responsibilities proposed for the Children’s e-Safety Commissioner is “working with industry to ensure that better options for smartphones and other devices and internet access services are available for parents to protect children from harmful content” (p. 5). Is this another version of the opt-out – opt-in hokey-pokey? Such a provision is broad enough to warrant vigilance in ensuring that it is not a foot in the door for “think of the children” rhetoric to push through over-reaching measures.

A need for nuance

The document proposes a ‘mid-range cyber-bullying offence’ applicable to minors that involves warnings and fines. It is important for governments to develop nuanced legislation that treats minors differently from adults in online contexts, so the proposal is a good place to start considering how to deal with minors in this difficult legal situation. Practically, there might be an issue with how quickly deliberations can be made, but the cyber-safety group named in the document already has many empirical examples on which to base decisions, and as with many legal issues there is likely to be a large number of fairly easy decisions and a small number of difficult cases.

That being said, there is a serious issue in trying to define “harmful” content. There are important differences between what is offensive, what is harmful, and what constitutes bullying, and we must guard against overly broad definitions. Similarly, we must be careful to ensure that the decision-makers include all stakeholders.

Public consultation

EFA appreciates that the government has opened this issue to public consultation. That is the responsible path. It is surprising that the NBN Strategic Review did not use the same process – there is great public appetite for a say on the NBN. We hope that any proposed future NBN reviews might take this on board.

A bill of digital rights

Stepping back, the bigger picture is that we have reached a stage in digital rights and responsibilities where we need a comprehensive review process that sets up all online issues for review in similar ways. It is interesting, for example, that this discussion paper only deals with minors. While there is a need for a nuanced response to protecting children, there is certainly also a need to consider the same issues for adults.

Rather than a piecemeal approach to online issues – privacy here, surveillance there, fair use here, child safety there, freedom of expression here, censorship there – EFA would like to see progress on a comprehensive bill of digital rights that would set the principles for all future relevant reviews, legislation, and organisations.