On Sunday, the ABC posted an article about facial surveillance that, we feel, was poorly written and edited, and did a poor job of informing readers about the issue. EFA and Digital Rights Watch submitted a joint complaint to the ABC about the article, and we are sharing our correspondence publicly so that others can see the details of our concerns.
If you would like to complain to the ABC as well, their complaint form is here: https://www.abc.net.au/contact/complain.htm
ABC Online Facial Surveillance Article Complaint
Dear ABC Editorial Team,
We are writing to submit a complaint regarding the analysis piece titled “That selfie you posted can be used to train machines — which might not be a bad thing”, published on the morning of Sunday 19 June 2022 at: https://www.abc.net.au/news/2022-06-19/why-many-people-arent-comfortable-with-facial-recognition/101157518
While we appreciate the ABC’s commitment to providing a variety of viewpoints, we are concerned that this piece is actively harmful, misleading, and overlooks essential legal and ethical points. Given that this is an analysis piece, rather than an op-ed, we would expect to see a more balanced approach to weighing up the arguments, and a greater use of readily available facts.
Existing practices are unlawful
The piece starts by referring to “tech companies who used [photographs on social media] to train their artificial intelligence systems”. This practice, as used by Clearview AI, Inc., was found to breach the Privacy Act in Australia, with the Office of the Australian Information Commissioner (OAIC) ordering the company to stop collecting facial biometrics and to destroy all existing images and templates that it holds.
“Although Meta, the global behemoth that owns Facebook and Instagram, stopped using the tech last year, that doesn’t mean your pictures aren’t still harvested by companies who build searchable databases of faces.”
Holding up the practices of companies such as Clearview AI, or of similar products that have been found to be in breach of the Privacy Act, as a reason people shouldn’t care about the covert use of facial recognition technology in major retailers is misleading.
It is also disingenuous to suggest that the use of facial recognition technology to surveil shoppers is in any way analogous to using social media.
“Sharing selfies on social media platforms, using a streaming service or loyalty card all divulge more personal information than the facial recognition technology CHOICE was probing.”
Surveillance capitalism is certainly a serious digital rights issue, but an individual does not give up their right to privacy in a physical store because their privacy is already compromised online.
Existing bad practices are not a justification for increased surveillance
The piece repeatedly uses the argument that mass data collection is already happening as a justification for increased surveillance.
“Sharing selfies on social media platforms, using a streaming service or loyalty card all divulge more personal information than the facial recognition technology CHOICE was probing.”
“But the reality is, this kind of data is already available through our online activities, which have been harvested and sold for years.”
This is a lazy, status-quo argument that relies on and encourages disempowerment and apathy in ABC readers. It suggests to readers that “better things aren’t possible, so give up now”. It also overlooks the immense harm that is caused by the current trajectory of ubiquitous use of surveillance-based technologies.
Current harmful—and frequently illegal—practices are not a justification for continued or increased harmful and illegal practices.
Harms are clear
As Commissioner Falk found in their Clearview AI decision, at [176]:
[T]he indiscriminate scraping of facial images may adversely impact all Australians who perceive themselves to be under the respondent’s surveillance, by impacting their personal freedoms.
Further, the activities referred to—harvesting data online, the data broker industry, invasive practices of loyalty cards—are all practices that continue to be subject to both legal and ethical debate. For example, the OAIC and the ACCC have both recently investigated privacy-invasive practices of loyalty card programs. These practices should not be held up as a reason why the use of facial recognition in major Australian retailers is less concerning, and encouraging readers to dismiss their valid concerns about any of these harmful applications of surveillance-based technologies is dangerous.
Motivated sources
The use of a former FBI Agent [update: and also former “Senior Intelligence Officer with the Defense[sic] Intelligence Agency”] to advocate for giving up privacy and accepting increased surveillance is deeply troubling. The source was presented as an independent expert, and their law-enforcement affiliation was disclosed only in the picture caption, not in the text of the article itself. The potential for motivated reasoning is clear.
Their claims were presented as fact, without any supporting evidence. Specifically:
- Retailers rely on it to reduce shoplifting. They can be notified if someone who has stolen from the store before enters it again.
- Law enforcement agencies across Australia use it to disrupt serious and violent crime, as well as identity theft.
These statements were not presented as quotations, so it is difficult to determine if these are the views of a clearly pro-surveillance source, or of the author themselves. No evidence is provided to support the claims made.
“Dr Desmond said many people don’t understand what they can obtain in exchange for giving up a certain level of privacy.”
It was disappointing to see a largely pro-surveillance article that gave little space to the abundance of evidence of known harms. It was particularly disturbing to see no discussion of the harms caused by surveillance technology in law enforcement contexts when the primary source for the article was clearly linked to law enforcement.
For example:
- Facial recognition technology has been shown to exhibit bias and high error rates on people with darker skin tones. This exacerbates disproportionate policing of Indigenous peoples and other People of Colour in Australia.
- There have been numerous examples of people being wrongfully arrested and charged based on inaccurate facial recognition matches used by police.
- The misuse of Covid check-in data by police in Australia.
- The misuse of retained metadata by Australian government agencies, a practice that appears to be widespread.
- Widespread unlawful use of surveillance data by ACT police.
- The covert use of facial surveillance technology by Australian police, which was denied until proof was provided, at which point police changed their story.
It is unacceptable to promote the use of facial recognition technology for the purpose of law enforcement without acknowledging the known harms or risks.
Misleading readers
Overall, we were dismayed to read such an article published by the ABC. Most of these concerns should have been detected and corrected as part of the editing process.
It was disappointing to read an article so dismissive of people’s valid concerns regarding their right to privacy and agency over their own biometric information. It is exceptionally disappointing to see the ABC publish arguments in favour of the collection and use of Australians’ personal and sensitive information through a technology as controversial as facial recognition, on the basis that other harmful, unethical, and in some cases specifically illegal practices already exist.
We understand that the ABC Editorial Guidelines regarding impartiality do not require that every perspective receives equal time, nor that every facet of every argument is presented. However, we are concerned that this piece is likely to mislead readers. Facial recognition surveillance poses a fundamental threat to Australians’ human rights, yet the article inaccurately presents it as legal, benign, and widely accepted. Providing “both sides” on an issue with well-documented human rights violations goes against the ethos of the ABC.
To counter the article, we propose to provide the ABC with a balancing piece that lays out the evidence for the known facial surveillance risks and harms, and better explains the legal context for the technology in Australia.
We look forward to hearing from you,
Justin Warren, Chair, Electronic Frontiers Australia
Samantha Floreani, Program Lead, Digital Rights Watch