Hacker News

An Apple nerd will come along with a real answer, but I believe the answer is no: even if they patched the software, the chip involved is not going to (or physically can't) cooperate.



> I believe the answer is no: even if they patched the software, the chip involved is not going to (or physically can't) cooperate

Indeed.

The whole point of the Secure Enclave is that it is the hardware root of trust. See the Apple Platform Security Guide[1].

The Secure Enclave also contains things like a UID (unique root cryptographic key) and GID (Device Group ID), both of which are fused at time of manufacturing and are not externally readable, not even through debugging interfaces such as JTAG.

As hardware root of trust, the Secure Enclave is fundamental to every part of device security, including secure boot and verifying that the system software (sepOS) is signed by Apple.

Apple has put a lot of effort into the Secure Enclave, and hardware revisions have brought improvements as you might expect, so always be wary if you come across old presentations!

[1] https://help.apple.com/pdf/security/en_US/apple-platform-sec...


Even if the chip didn't cooperate, Apple has the key derivation function and presumably everything used to generate your key. While we're on the topic of unlikely first-party attacks, it would be interesting to hear (or see) how Apple limits their ability to create duplicate keys.


> Apple has the key derivation function and presumably everything used to generate your key.

Nope.

The Secure Enclave still contains things like UID and GID which are fused into hardware at manufacturing and are not externally accessible, not even through debugging interfaces such as JTAG.

So Apple will never have all the input parameters for the key derivation functions.
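To make the point concrete, here is a toy sketch (the names, the use of PBKDF2, and the key sizes are illustrative assumptions, not Apple's actual construction) of why knowing the key derivation function alone gets an attacker nowhere when one of its inputs is a fused, non-readable hardware secret:

```python
import hashlib
import secrets

def derive_key(passcode: bytes, hardware_uid: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256 stands in for the enclave's hardware-backed KDF.
    # The fused UID acts as a per-device secret mixed into the derivation.
    return hashlib.pbkdf2_hmac("sha256", passcode, hardware_uid, 100_000)

# Fused at manufacture; never readable from outside the chip.
uid = secrets.token_bytes(32)

key = derive_key(b"123456", uid)

# An attacker who knows the KDF and even the passcode, but lacks the UID,
# derives a different key:
wrong = derive_key(b"123456", secrets.token_bytes(32))
assert key != wrong
```

The function itself is public knowledge; without the fused UID as input, its output is unrecoverable.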

And please, let's not go into tin-foil-hat territory where you somehow think Apple logs every key ever fused during manufacturing and then somehow ties these to you personally.


Unlikely. Having the key generation function is worthless, as you would also need the truly random nonce and salt used in any modern cryptographic construction. There are plenty of ways to make a function's output unknowable even when you know exactly how the function works. That's the whole point of trustless security.


Anyone's free to brute force encrypted data?


Do you need to brute force the data? Can't you simply intercept the data from the fingerprint sensor?


Does the data go over the internal bus unencrypted?


> Does the data go over the internal bus unencrypted?

Of course not.

"The sensor captures the biometric image and securely transmits it to the Secure Enclave"[1]

IIRC the implementation detail is AES-256-GCM with ECDH over P-256, i.e. the biometric sensor and the Secure Enclave derive a unique session key via ECDH each and every time.

[1] https://support.apple.com/guide/security/face-id-and-touch-i...
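The session-key idea can be sketched with classic finite-field Diffie-Hellman (a deliberately simplified stand-in: the real exchange uses ECDH over P-256, and the derived key is then used with AES-256-GCM, neither of which is shown here):

```python
import hashlib
import secrets

# Small demo group: a Mersenne prime keeps the numbers readable.
# NOT a secure parameter choice; real deployments use ECDH P-256.
p = 2**127 - 1
g = 3

# Each side picks a fresh ephemeral secret per session:
sensor_secret  = secrets.randbelow(p - 2) + 1
enclave_secret = secrets.randbelow(p - 2) + 1

# Only the public values cross the bus:
sensor_public  = pow(g, sensor_secret, p)
enclave_public = pow(g, enclave_secret, p)

# Both sides compute the same shared secret independently:
sensor_shared  = pow(enclave_public, sensor_secret, p)
enclave_shared = pow(sensor_public, enclave_secret, p)
assert sensor_shared == enclave_shared

# Hash the shared secret down to a fresh 256-bit session key
# for the AEAD cipher protecting the biometric image in transit:
session_key = hashlib.sha256(sensor_shared.to_bytes(16, "big")).digest()
```

Because the secrets are ephemeral, an eavesdropper on the bus who captures only the public values cannot reconstruct the session key, and each session's key is different from the last.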


But what transfers the data from the fingerprint scanner to the chip?

Software?


> Software?

Clearly some software layer is required to interface with the Secure Enclave, but it's not the app.

The app opens an authentication context through the API and asks the API to perform the authentication. It is the API (through a standardised system UI), not the app, that collects the biometrics. The API then returns yes/no to the app.

There is further a strict separation of duties between the biometric sensor and the Secure Enclave.

Apple puts a significant amount of effort into making that software layer secure, and as this document[1] shows as time progresses the amount of security has only increased with the various chipset revisions.
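That trust boundary can be modelled in a few lines (a toy model with illustrative names, not Apple's actual API: on iOS the real entry point is the LocalAuthentication framework): the system layer owns the sensor, the enclave does the matching, and only a boolean ever reaches the app.

```python
from dataclasses import dataclass

@dataclass
class SecureEnclave:
    enrolled_template: bytes          # never readable from outside the enclave

    def match(self, sample: bytes) -> bool:
        # Real matching compares biometric templates; byte equality
        # stands in for that here.
        return sample == self.enrolled_template

class SystemAuthAPI:
    """Stands in for the OS layer that owns the sensor and the system UI."""

    def __init__(self, enclave: SecureEnclave):
        self._enclave = enclave

    def _read_sensor(self) -> bytes:
        # Captured by the OS via the system UI; app code never sees this.
        return b"alice-fingerprint"

    def evaluate(self, reason: str) -> bool:
        sample = self._read_sensor()  # raw biometric stays on this side
        return self._enclave.match(sample)

# The app only ever receives the yes/no answer:
api = SystemAuthAPI(SecureEnclave(enrolled_template=b"alice-fingerprint"))
ok = api.evaluate("Unlock the vault")
assert ok is True
```

The design choice to surface only a boolean means a compromised app gains no biometric material to exfiltrate.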

The thing I say to all the Apple bashers is this: sure, you might not trust Apple (or Google), but even if you go buy the latest $Cool_Sounding_Open_Phone, you still need to trust someone, and you still need to trust the supply chain.

Sure, $Cool_Sounding_Open_Phone might have open-source firmware, but have you actually read every single line of code, AND do you have the knowledge to do a security review of it? Not many people do. And if you are truly security conscious, you cannot possibly trust "the community" to review it for you.

Unless you're going to start from scratch: build your own PCB, write your own firmware, etc. But even then, you still need to trust the chip manufacturers, unless you open up your own foundry. So let's put our tin foil hats to one side, shall we?

[1] https://help.apple.com/pdf/security/en_US/apple-platform-sec...



