Apple's commitment to customer privacy faced the acid test after the San Bernardino shooting. Law enforcement demanded that Apple unlock the shooter's phone, and Apple refused. Court cases ensued. Some people think the need to protect the public outweighs the need for privacy. Others think that once law enforcement can unlock one iPhone, it won't stop there, and that will be bad for everyone. This post isn't about either of those positions. The FBI dropped their lawsuit against Apple. Why? They found an Israeli firm that would unlock the phone for about $5,000. In addition, Malwarebytes (a company that makes security software) reports that law enforcement can now buy a device that unlocks iPhones from a different company.
Little is known about how the device — from a company called Grayshift — works. However, Malwarebytes has some unverified data from an unnamed source. Of course, the exploit used to break the iPhone security is secret because if Apple knew about it, they’d fix it. That’s happened before with a device called IP-box that was widely used for nefarious purposes.
One of the biggest concerns about devices and services like these is keeping them out of the hands of phone or identity thieves. The other big concern is simply government abuse. Even if you trust your government, there are plenty of other governments out there.
The device apparently comes in two flavors. One seems to act as a proxy for remote access and is locked to the network it uses during setup. That box costs $15,000. The other version (at $30,000) just works with no network connection required, although it takes a two-factor token to access it.
At first glance, the remote unit might seem less scary as long as you trust your law enforcement agencies. After all, if a box is stolen before setup, it would only be good as long as the home office was still allowing it to work. And, presumably, after setup, it won’t work because it will know it is on the wrong network. Contrast this to the stand-alone unit. If you steal one of these with a password written on a sticky note and the two-factor token, you can unlock iPhones all day until Apple fixes whatever allows this to work.
However, if you think about it, even the phone-home version has a problem. Is the data transmitted securely? Is it stored securely? We don't know, because it is all so secret. Hackaday is full of stories of secure devices that have giant holes in them waiting for someone to exploit. For that matter, the very existence of the Grayshift box and the IP-box shows that the supposedly super-secure iPhone isn't really. Of course, if you don't trust any of the law enforcement agencies that can buy these, then the two versions are equally bad.
We've reported on anti-iPhone snooping before. Unfortunately, at the core of this there is a philosophical problem, not a technical one. Most of the modern world agrees that humans have at least some right to privacy. However, everyone would like to be protected from people who want to harm us and our loved ones. It ought to be possible to create totally secure privacy mechanisms. It is certainly possible to design in deliberate backdoors for the police. The unknown is the public and government willingness to trade some privacy for some security. It is a slippery slope whichever way you go, and not one that will be solved in the pages of Hackaday.
A startup named Grayshift based in Atlanta who specializes in unlocking iPhones has released a device called GrayKey …
https://latesthackingnews.com/2018/03/19/an-iphone-unlocker-is-posing-a-serious-threat-to-apple/
The description of this hack is very similar to how you unlock newer Lenovo laptops (>x30 series; previous ones had a trivially bypassable serial EEPROM/TPM chip).
http://www.allservice.ro/forum/viewtopic.php?t=3044
As a bonus, this type of hack is customizable per device; the allservice hack is locked to a particular motherboard device ID.
Damn beaten to it lolll disregard my post
Clearly this calls for a proper hardware implementation of RFC3514.
Totally agree! It should be implemented both on phones and hack boxes.
“However, everyone would like to be protected from people who want to harm us and our loved ones.”
Second Amendment, and don’t give me that look.
You mean this gigantic eye-roll I’m doing right now? I’ll give you that look if I damn well feel like it.
But yeah, law enforcement shouldn’t have this ability. It won’t save lives unless it’s used preemptively. Think about that for a minute. It’s certainly not ideal.
Using our data preemptively, as in other avenues such as predictive policing…
https://www.theverge.com/2018/2/27/17054740/palantir-predictive-policing-tool-new-orleans-nopd
oh boy here comes Minority Report lol
We don’t need freedoms.
They only make our lives more complicated.
We should submit to the powers that be.
Brought to you by Minitrue, of Ingsoc!
I suppose the people of Austin should be shooting packages they find on their front porch?
Yes, but from a safe distance of course.
+1
You seem to either willfully miss the point to further your political ideology, or you simply cannot fathom a point at which guns don’t solve the problem.
Either way, seems sad.
When will the FBI learn, if they can do it so can their adversaries. These things never stay secret long. Security is an arms race and if you don’t patch your armor, weapons will find the holes.
“…might seem less scary as long as you trust your law enforcement agencies”. HA.
maybe I still read too much Slashdot, but I'd think the Venn diagram of tech-savvy nerds and people who trust LE & TLAs is a pretty narrow sliver.
Well, I agree that most people in most places probably don’t. But I was simply saying IF you are a big supporter and think that means it is OK for them to have access, think again….
Yet some people want us to rely completely on the government for protection. But if the government has to investigate our untimely demise, then our trust goes away. How about picking one and sticking with it?
This article is naive in several ways.
It assumes:
First, that the US government works on a system of economics. (It clearly doesn't.)
Second, that when faced with a popular cause in a lawsuit, Apple stood strong on principle and wasn't swayed by politics. (I contend that at best it wasn't that.)
Third, that government employees are both (all three?) capable, able to find, and willing to try alternative methods other than a lawsuit to get what they want. (This all assumes one person has the decision-making power AND they are in "the know.")
What likely happened is at best, Apple pointed them in the direction of this company or Apple unlocked the iPhone.
I suppose it is possible they pointed them to the company, although I have as much evidence for that as I do that the CIA told them where to go to have it done, which is at least as likely.
Actually, I’ve worked with some pretty smart people who worked for the government. Of course, that wasn’t for this kind of stuff, but still.
As for Apple, I’m not a fanboy (in fact, I kind of trend negative to Apple products) so I don’t know what their motivations were. Were they really fighting an ethical issue or just protecting their IP or preserving their stock price? I don’t know. We probably will never know, for that matter.
Apple does not have any "moral principles." It is simply a smart parasite that knows it must not excessively irritate the hosts it feeds off: its customers. This guides its public actions; it is all just smart marketing and PR.
This comment is naive in several ways.
It assumes:
First, that the article assumes the US government works on a system of economics.
Second, that Apple did not stand strong when faced with a lawsuit driven by privacy concerns; regardless of whether that was ideologically or politically motivated, the central issue is still privacy.
Third, that there are not parts of the government that subvert the law to get what they want.
What likely happened is... based on information and not random guesses made by people on the internet.
“After all, if a box is stolen before setup”
I can’t help but suspect the network locking/two factor token thing is as much or more about protecting their own IP, not so much preventing the box from ending up in the wrong hands.
How is it not the same thing? If one of their boxes gets into the wild, the game is up for them, and their profits tank.
It is kind of an interesting business model. You could ransom your findings back to Apple, or you could sell your services as they are doing, so you must be betting that selling the service is more lucrative. Or perhaps you want the "street cred" of being "that company." However, if a box does get out, whoever has it could crack stolen phones with it OR sell it to Apple, and I'm betting the latter is less risky and could bring a great reward. So maybe they really don't want one to escape.
I would think they might scrutinize their buyers a bit. Sell it to police agencies, and try to avoid selling it to anyone likely to take it apart and reverse-engineer it, and then sell their own cheap copy.
lol or even worse, sell it to a shell company that is working for apple so that apple can fix the exploit. (note: this comment assumes that apple has a business case reason to fix the exploit and the company offering the exploit for sale is not a shell company working for apple)
now that’s an interesting point. If apple owned the shell company… they could appease law enforcement AND shareholders (by not telling either one they owned the shell company).
If the non-volatile RAM chips can be removed and imaged, then replaced with a flashable storage that emulates the original RAM, then the RAM image could be loaded into a huge number of phones for brute force PIN cracking in a short time. Any time the phone bricks or otherwise locks down, just reload the image and go again.
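That is essentially the NAND-mirroring idea: because the failed-attempt counter lives in the flash you imaged, restoring the image resets it and you get another batch of guesses. Here is a minimal sketch of that retry loop, assuming hypothetical helpers for the hardware steps (image_nand, restore_nand, and try_pin are placeholders I made up; the real work would be custom hardware and software, not a Python function):

```python
# Hypothetical sketch of the restore-and-retry brute force described above.
# image_nand, restore_nand, and try_pin stand in for hardware steps.
from itertools import product

GUESSES_BEFORE_LOCKOUT = 9   # assumption: roll back before the wipe/lockout triggers

def image_nand():
    """Dump the phone's non-volatile storage (hardware step, not real code)."""
    raise NotImplementedError

def restore_nand(image):
    """Write the saved image back, resetting the failed-attempt counter."""
    raise NotImplementedError

def try_pin(pin):
    """Enter a PIN on the device; return True if it unlocks."""
    raise NotImplementedError

def brute_force_pin(digits=4):
    image = image_nand()
    for attempt, combo in enumerate(product("0123456789", repeat=digits)):
        pin = "".join(combo)
        if attempt and attempt % GUESSES_BEFORE_LOCKOUT == 0:
            restore_nand(image)   # reset the counter before the phone locks down
        if try_pin(pin):
            return pin
    return None
```

Spread across the "huge number of phones" the comment imagines, each unit would only need to cover a slice of the 10,000 four-digit (or 1,000,000 six-digit) combinations.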
Wouldn’t Apple have grounds to sue some of these companies under DMCA?
Not if Apple assisted in its construction. Food for thought.
One of the co-founders of the company is an ex-Apple security engineer.
That smells like a lawsuit in the making..
I think we need a massive refactoring of the legal system worldwide,
where you can only be arrested if whatever you've been doing hurts somebody (physically, financially, or otherwise), and your records are not something people can look through unless they can prove you've committed a major crime.
This would allow us to fully enjoy the fruits of our hacking/making.
I’m going to put a lightning port protocol sniffer into an iPhone and then convince the cops to unlock it. When the iPhone automatically uploads the sniffed data to my online server then I will have this sort of access as well :)
In order to convince the cops to unlock it, you would have to commit a crime. Then you'd be put in prison for some years, and by the time you get out Apple will have fixed the vulnerability you sniffed.
Dear drama queens!
This is not a story about your personal privacy or security. It was only a question of time before this happened.
This is a story about the security of capital and about the market. When people stop buying an overpriced product with stupid "security," you have to give them what they want. So they did what is done for every other phone on the market: simply offer it without this type of "security" if that's what the buyer wants. If I buy something overpriced like an iPhone, I want to choose the level of security. I want it to work on all networks, I want it to be my personal thing, with or without any stupid account or cloud, and I want to be able to put any SIM card inside and hand this overpriced phone to anybody, who can then use it as their own without limitations.
If you get an iPhone for 1 USD, then it is not your phone.
Or simply: if they want to sell more, they have to give us what we want.
I would think Apple would get ahold of these things to fix the bugs. So their lifetime would be quite limited.
Just a thought, Apple could (has?) hire a 3rd party to buy the devices, reverse engineer them to find out how they work, and implement new steps to prevent those devices from working.
I was thinking that this would be a very smart idea on Apple’s part. Of course, due to the “protect their IP” mode mentioned by others, the manufacturer would likely vet potential buyers very thoroughly.
Of course they have, but you never close a backdoor too fast. Wait a little bit until they've sold quite a few devices, wait a year or two for them to slow their R&D, then close it.
On the other side, use only one vulnerability and keep the others in hand, deploying them only once a backdoor gets closed.
Cat&Mouse, since the invention of Cat (because there was no use of them before).
Meh, with a cloud-based world and ecosystem there will always be MITM attacks that can be leveraged now, methinks, especially with phone-home BS to spoof. Kinda similar to the kiosk ATM jackpotting that is going on now (but has apparently never been done before, according to the MSM), even though it was already heavily documented in the 90s; it just has to be worth someone's risk and time. ATMs are known to hold many hollywood n00dz…
Why doesn’t Apple setup a fake but real enough looking company, buy one of these, then see how it happened? Just run it on a phone, and pull logs?
There's also the irony here that these guys are using exactly this kind of encryption to protect their own machines and secrets, while allegedly circumventing it on iPhones.
Law enforcement already had this cracking technology. What they needed was an announcement of some kind that it exists, so that they can begin to use evidence collected in this way in court. This invented machine makes it seem more “limited” and “safe” to non tech savvy people.
So I was thinking… This IS Apple, in another country, with a shell company, unlocking their own phones.
So what would happen if a cop with a sense of civic duty took one of those boxes and delivered it to Apple? I mean, what would happen to that cop? If they fire him, he can sue and take it through various courts, all the way to the Supreme Court perhaps. With a little help from the EFF and Apple, I'd venture to guess.