Yes, indeed they went above and beyond - probably because they need to defend not only against external threats, but also against the device's owner, to keep the walled garden intact.
Yes (dunno why all the downvotes) but Apple went even further than the walled garden would require. They could have easily left an Apple backdoor. But they encrypt the protocol going over wires to/from the Enclave. They go insanely far rather than sufficiently far.
Yeah, nation-state-level attacks will still work, especially if they have the phone. But with Android it's not nation-state level; it's corporate level, and maybe less if they have the phone.
I felt that Apple's description of the initial key setup between the enclave and the main processor was hand-wavy at best.
I know of a similar implementation used by Microsemi for their FPGA-based secure boot process[1]. They claim to protect the initial AES key transmission using an "obfuscated" crypto library that is sent to the processor over SPI on boot[2]. Also, I wonder if Apple exchanges a nonce during the setup to prevent replay attacks?
[2]: It's a C/C++ library called WhiteboxCRYPTO. There is a whitepaper (http://soc.microsemi.com/interact/default.aspx?p=E464), but AFAIK the gist of their argument is that the code and keys are sufficiently obfuscated to prevent reverse engineering (typical marketing-speak).
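Apple doesn't document the setup publicly, but the usual reason to exchange nonces here is so the session key depends on fresh randomness from both sides, which defeats replay of a recorded exchange. A minimal sketch (the names and the HMAC-based derivation are my assumptions, not Apple's actual protocol):

```python
import hashlib
import hmac
import os

def derive_session_key(shared_secret: bytes, nonce_a: bytes, nonce_b: bytes) -> bytes:
    # Mix both parties' fresh nonces into the key derivation. An attacker
    # replaying last session's messages doesn't know this session's nonces,
    # so the replayed traffic decrypts to garbage under the new key.
    return hmac.new(shared_secret, nonce_a + nonce_b, hashlib.sha256).digest()

# Hypothetical provisioned shared secret (stand-in for whatever the
# enclave and the application processor actually share at fab time).
secret = os.urandom(32)

# Each side contributes a fresh random nonce per session.
nonce_enclave = os.urandom(16)
nonce_ap = os.urandom(16)

k_enclave = derive_session_key(secret, nonce_enclave, nonce_ap)
k_ap = derive_session_key(secret, nonce_enclave, nonce_ap)
assert k_enclave == k_ap  # both sides agree on the session key

# A nonce captured from an earlier session yields a different key,
# so replaying old traffic doesn't work.
stale_nonce = os.urandom(16)
assert derive_session_key(secret, stale_nonce, nonce_ap) != k_enclave
```

The point is just that freshness on both sides binds the key to this session; the real protocol would also need authentication of the exchange itself.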
There was an article about iOS security where someone argued that Apple controls the enclave for security reasons, to which I answered that this was basically security by obscurity. You can see I was downvoted for this: https://news.ycombinator.com/item?id=13676135
I still downvoted izacus because it read as uncharitable fanboy rambling. The charitable interpretation would be that the walled garden (in regards to the enclave) is a side effect of their implementation, not the intention.