
Devil's advocate: If our threat model includes your laptop being tampered with by an evil maid competent enough to imperceptibly modify the camera LED circuit, couldn't they just install a separate camera elsewhere (maybe in one of the speakers)?


Or they put you to sleep so you have no memory of it and plant the bug inside your body. You can go on like that forever. So should you leave your doors unlocked because you can't ever be 'safe'? Obviously not.

There's a scale from dead easy to more difficult to very difficult. The easier it is to get you, the bigger the problem. If something is cheap and easy to prevent - well, why wouldn't you? It's asymmetric.

Wouldn't you feel hilariously stupid if someone modified your camera circuit while intercepting your laptop and you hadn't stuck a post-it over it to thwart their dastardly plans?

The point here is that making the kind of claims made about LEDs and camera circuits is really, really easy when you're telling other people what is not a risk. When you're the one carrying that risk - i.e. all possible models and other threat vectors - suddenly you shouldn't be so sure anymore. A physical cover is better, easier, cheaper and basically infallible at what it is advertised to do. Asymmetric payoffs are worth noting: a genuine, plausible risk scenario is all you need to justify a /trivial/ mitigation step.

Apple making trivial mitigation steps harder is really, really, really stupid. In fact, beyond merely stupid, it's unwittingly and incompetently user-hostile. (Unless you think their design process has been infiltrated by the NSA or something, which I guess is at least possible, but I think it unlikely in the face of utterly incompetent idiocy - which Apple do display from time to time.)



