The decryption key for Apple’s Secure Enclave Processor (SEP) firmware has been posted online by self-described “ARM64 pornstar” [xerub]. The SEP is the security co-processor introduced with the iPhone 5s, the same device that introduced Touch ID. It’s a black box that we’re not supposed to know anything about, but [xerub] has now pulled back the curtain on that.
The Secure Enclave handles the fingerprint data from the Touch ID sensor, determines whether it matches an enrolled finger, and enables access or purchases on behalf of the user. The SEP is a gatekeeper that prevents the main processor from accessing sensitive data. The application processor sends the SEP data that only the SEP can read, authenticated by a session key generated from the device’s shared key. The SEP also runs its own operating system, SEPOS, which has a kernel, services, drivers, and apps. It performs secure services for the rest of the SoC and much more, which you can learn about from the Demystifying the Secure Enclave Processor talk at Black Hat.
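As a rough illustration of that idea (this is not Apple’s actual protocol; the key names and construction below are invented for the sketch), a per-session key derived from a long-lived shared secret lets one side send data that only the holder of the same secret can authenticate:

```python
import hmac
import hashlib
import secrets

def derive_session_key(shared_key: bytes, nonce: bytes) -> bytes:
    # Derive a fresh per-session key from the long-lived shared key.
    return hmac.new(shared_key, b"session" + nonce, hashlib.sha256).digest()

def seal(session_key: bytes, message: bytes) -> bytes:
    # Append a MAC so the receiver can verify the sender holds the session key.
    tag = hmac.new(session_key, message, hashlib.sha256).digest()
    return message + tag

def open_sealed(session_key: bytes, sealed: bytes) -> bytes:
    # Reject anything that wasn't produced under the same session key.
    message, tag = sealed[:-32], sealed[-32:]
    expected = hmac.new(session_key, message, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    return message

# Both sides hold the same device shared key and agree on a nonce.
shared_key = secrets.token_bytes(32)
nonce = secrets.token_bytes(16)
ap_key = derive_session_key(shared_key, nonce)   # application processor side
sep_key = derive_session_key(shared_key, nonce)  # SEP side

sealed = seal(ap_key, b"fingerprint frame")
assert open_sealed(sep_key, sealed) == b"fingerprint frame"
```

A real design would also encrypt the payload; the sketch only shows the authentication half of the tunnel.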
[xerub] published the decryption keys here. To decrypt the firmware, you can process it with img4lib and xerub’s SEP firmware split tool. These tools make it a piece of cake for security researchers to comb through the firmware looking for vulnerabilities.
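A rough sketch of that workflow (the tool names come from the two projects, but the exact flags and filenames may differ, so check each README before relying on this):

```shell
# Decrypt the IMG4-wrapped SEP firmware with the published key
# (placeholder filename and key; substitute the real ones):
img4 -i sep-firmware.im4p -o sep.bin -k <iv+key>

# Split the decrypted blob into its component binaries for analysis:
sepsplit sep.bin
```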
DECYRPTED DECRYPTED
Thanks, fixed.
Would have saved the FBI a lot of money. Now they can go through their backlog.
Apart from the fact that the Secure Enclave is still as secure as it was yesterday.
All this is, as one commenter said: “Imagine the Secure Enclave as a vault. Apple hung a big, dark curtain over it to prevent anyone from even seeing the vault. Now, that curtain has been opened and people can see the vault. The vault, however, is still locked as securely as ever.”
However we don’t know who else has got this far already, and we certainly don’t know if other actors have managed to find vulnerabilities in the code.
Exactly, it’s like saying open source software is not secure because everyone can see the code. Sure, it is easier to find vulnerabilities, but that doesn’t mean anyone will. However, I think someone will be able to pull something off eventually. If you are the one to hack an Apple device these days you can make a lot of cash, so I would imagine there will be a few researchers having a peek.
What I’m curious about is if there really isn’t a handy backdoor put in for ‘selected actors’ .
I’m not sure this will conclusively show that, but it’s at least a test.
I mean they might champion privacy and security and do the thing with the FBI, but if I was a foreign leader I’d still assume there was a backdoor for the spooks, seeing Apple is still a US company.
Who’s there to use it if not government agencies? Do you think Steve had a personal interest in your phone?
Folks are going to have fun with this.
Step 1 towards telling Apple where they can shove their Apple Store Only button replacements.
You can use non-Apple buttons, but Touch ID won’t work. The reason they block it is actually security: what stops a cheap Chinese “sensor” from just replaying the same data for every finger without ever looking at it? If the sensor is not known to be real and valid/authorized, it cannot be trusted to provide accurate data. It’s as simple as that.
Note that the cheap ones still work as buttons (not sure about 3D touch ones though…)
Bullshit. If a scanner would always submit the same result, you wouldn’t be able to add your fingerprint to the phone. Try to add your finger while not moving it; it will force you to move your finger or it won’t continue the process.
There is close to no reason not to allow replacement scanners. It’s mostly been a business decision.
Then how about a dodgy scanner that sends the same manual rotations of a fingerprint as a human would provide? It could even be programmed to still only work with your fingerprint and some other mould that they have ready made. Or just store your full fingerprint for reconstruction (probably even easier and less obvious if you were to disassemble it).
Hi!
TouchID security architect here (no, really). It was most certainly not a business decision. Adding the complex pairing needed to create an encrypted and authenticated tunnel between TouchID and the Secure Enclave was a logistical nightmare that took a while to get right. If Apple just wanted to lock the sensor to the SoC, and not provide any security, that could’ve been done WAAAY simpler than performing an irrevocable pairing between the SEP and TouchID.
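To make the replay concern above concrete, here is a toy challenge-response exchange (purely illustrative, not Apple’s pairing protocol; all names here are invented). Because the verifier picks a fresh nonce for every scan, a captured response from an earlier exchange cannot simply be replayed:

```python
import hmac
import hashlib
import secrets

class PairedSensor:
    """A sensor that proves possession of the pairing key (toy model)."""
    def __init__(self, pairing_key: bytes):
        self._key = pairing_key

    def respond(self, challenge: bytes, scan_data: bytes) -> tuple:
        # Bind this scan to the specific challenge it answers.
        tag = hmac.new(self._key, challenge + scan_data, hashlib.sha256).digest()
        return scan_data, tag

class Verifier:
    """The SEP side: issues fresh challenges and checks responses."""
    def __init__(self, pairing_key: bytes):
        self._key = pairing_key
        self._challenge = None

    def issue_challenge(self) -> bytes:
        self._challenge = secrets.token_bytes(16)  # fresh nonce every time
        return self._challenge

    def verify(self, scan_data: bytes, tag: bytes) -> bool:
        expected = hmac.new(self._key, self._challenge + scan_data,
                            hashlib.sha256).digest()
        return hmac.compare_digest(tag, expected)

key = secrets.token_bytes(32)
sensor, sep = PairedSensor(key), Verifier(key)

# A legitimate exchange succeeds:
challenge = sep.issue_challenge()
data, tag = sensor.respond(challenge, b"scan#1")
assert sep.verify(data, tag)

# Replaying the old response against a new challenge fails:
sep.issue_challenge()
assert not sep.verify(data, tag)
```

An unpaired clone sensor has no pairing key, so it can neither answer fresh challenges nor reuse old answers, which is the core of the argument for the pairing.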
What do Schneier and Krebs say about this?