Apple’s commitment to customer privacy faced its acid test after the San Bernardino shooting. Law enforcement demanded that Apple unlock the shooter’s phone, and Apple refused. Court cases ensued. Some people think the need to protect the public outweighs the need for privacy. Others worry that once one iPhone can be unlocked, it won’t stop there, and that will be bad for everyone. This post isn’t about either of those positions. The FBI dropped their lawsuit against Apple. Why? They found an Israeli firm that would unlock the phone for about $5,000. In addition, Malwarebytes — a company that makes security software — reports that law enforcement can now buy a device that unlocks iPhones from a different company.
Little is known about how the device — from a company called Grayshift — works. However, Malwarebytes has some unverified information from an unnamed source. Of course, the exploit used to break the iPhone’s security is kept secret, because if Apple knew about it, they’d fix it. That’s happened before with a device called IP-box, which was widely used for nefarious purposes.
One of the biggest concerns about devices and services like these is keeping them out of the hands of phone or identity thieves. The other big concern is government abuse. Even if you trust your own government, there are plenty of others out there.
The device apparently comes in two flavors. One seems to act as a proxy for remote access and is locked to the network it uses during setup. That box costs $15,000. The other version (at $30,000) just works with no network connection required, although it takes a two-factor token to access it.
At first glance, the remote unit might seem less scary, as long as you trust your law enforcement agencies. After all, if a box is stolen before setup, it would only work as long as the home office allowed it to. And, presumably, after setup it won’t work at all, because it will know it is on the wrong network. Contrast this with the stand-alone unit. If you steal one of those along with a password written on a sticky note and the two-factor token, you can unlock iPhones all day until Apple fixes whatever makes it work.
However, if you think about it, even the phone home version has a problem. Is the data transmitted securely? Is it stored securely? We don’t know because it is all so secret. Hackaday is full of stories of secure devices that have giant holes in them waiting for someone to exploit. For that matter, the very existence of the Grayshift box and the IP-box are examples of how the super secure iPhone isn’t really. Of course, if you don’t trust any of the law enforcement agencies that can buy these, then the two versions are equally bad.
We’ve reported on anti-iPhone snooping before. Unfortunately, at the core of this lies a philosophical problem, not a technical one. Most of the modern world agrees that humans have at least some right to privacy. However, everyone would also like to be protected from people who want to harm us and our loved ones. It ought to be possible to create totally secure privacy mechanisms. It is certainly possible to deliberately design in backdoors for the police. The unknown is the public’s and government’s willingness to trade some privacy for some security. It is a slippery slope whichever way you go, and not one that will be solved in the pages of Hackaday.