FBI HQ, Washington D.C. Image: Aude – CC-BY-SA

I read the news headlines and sometimes it is hard to tell what decade we are in. Just over 20 years ago, the National Security Agency developed, and tried to establish as a standard, an encryption device with a built-in backdoor: the Clipper Chip. At that point it was intended for voice communications, but it would inevitably have been incorporated into Internet communications over time.

Fortunately, due to the efforts of the Electronic Privacy Information Center (EPIC) and the Electronic Frontier Foundation (EFF), the Clipper Chip was defunct by 1996.

Flash forward to today and we seem to be having this discussion all over again. As a result of the investigation stemming from the San Bernardino shootings, the FBI is attempting to compel Apple to develop a backdoor for its iOS operating system. This is to enable investigators to obtain any additional information about the shooters, who they may have been connected to, and any other plans that may have been in place.

While the government’s needs don’t appear to have changed over time, it is fair to say the world certainly has. There is a genuine and sometimes urgent need for law enforcement to be able to undertake investigations in response to terrible acts of violence and terror aimed against a civilised society. This job is made increasingly difficult by the common use of strong encryption technologies. It is the unfortunate nature of the technology that it provides the same protection to the innocent and the guilty alike.

So what is wrong with the government mandating a backdoor into Apple iOS? It would certainly provide less protection to terrorists and criminals who wish to do society harm.

The first problem rests with the fact that as soon as a backdoor is developed, it proves it can be done. At that point, the ‘cybercrime industrial complex’ (not to mention foreign state-based intelligence services) will put considerable resources into attempting to replicate the backdoor. This will include teams of highly skilled specialists poring over iOS code to try to find how it was achieved. It will also likely include attempts at industrial espionage and a significant inducement for any Apple employee to leak the secret. Simply put, it is almost certainly just a matter of time before the backdoor is found. And that is without taking into account the possibility that the backdoor is found by a legitimate security researcher undertaking vulnerability testing on the platform.

The second problem is the legal precedent this will set. The media, as far as I can tell, hasn’t revealed any specific motive the FBI has for gaining access to the device. Any discussion of whether there is actual evidence on the device that investigators require, or whether this is simply being used to push an agenda, would be conjecture. But regardless of the motive, this would potentially give law enforcement an additional power with very little framework to govern it. Future applications for access could be for cases far less serious than a terrorist shooting, or could even be a very clear case of ‘fishing’ for evidence. This would have a dire impact on the privacy and rights of everyone.

Both of these problems make the installation of a backdoor in the iOS encryption mechanism a very bad idea. One additional fact that hasn’t gained coverage is that this would also be a bad idea for the government itself. Government iPhones would be just as affected as the iPhones of ordinary citizens. And in the case of the first problem described above, you could hazard a guess as to who the primary targets of foreign state actors and criminals would be.

So what is the solution? The arguments in this case are essentially the same as with the Clipper Chip all those years ago. This problem is not going away, and if anything it is only going to get worse the more society depends upon electronic devices in daily life.

Unfortunately, I don’t have any magic solutions. This is a difficult problem to solve. But I can say that the current approach is bad for everyone.

It is important to remember that the FBI already has access to a wide range of data (‘metadata’) which will include details of every phone call and text message sent and received by the phone in question, as well as the location of the phone at the time these occurred. You’ll probably also be aware that Australian law enforcement agencies have similar access to this type of information.

The answer, I think, is to look well outside the current security paradigm of ‘lock and key’. We have to acknowledge that electronic privacy is a paramount concern in this day and age. The loss or compromise of an electronic device or communication technology can be devastating to an individual. But at the same time, we live in a civilised society governed by law, and we expect the institutions set up to protect us to do their job well.

Whatever the solution turns out to be, I think it will change the way we think and work a great deal.

This article is by Daryl Sheppard. Daryl is an EFA member.