Eye tracking data is incredibly sensitive and a serious privacy concern.
HN tends to dislike Microsoft, but they went to great lengths to build a HoloLens system where eye tracking was both useful and safe.
The eye tracking data never left the device and was never directly available to the application. As a developer, you registered targets or gestures you were interested in, and the platform told you when, for example, the user looked at your target to activate it.
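That registration pattern could be sketched roughly like this (a hypothetical illustration, not the actual HoloLens API; all names and the dwell threshold are invented):

```python
# Sketch of the opt-in pattern: the platform layer owns the raw gaze
# stream; apps only register targets and receive discrete activation
# events. Raw samples never cross into app code.

class GazePlatform:
    """Platform side: the only code that ever sees raw gaze samples."""

    def __init__(self):
        self._targets = {}  # target_id -> (bounds, callback)

    def register_target(self, target_id, bounds, callback):
        # The app declares interest in a region; it never reads gaze.
        self._targets[target_id] = (bounds, callback)

    def _on_gaze_sample(self, x, y, dwell_ms):
        # Private to the platform: apps only get a coarse "activated"
        # event once the gaze has dwelled long enough on a target.
        for target_id, (bounds, callback) in self._targets.items():
            x0, y0, x1, y1 = bounds
            if x0 <= x <= x1 and y0 <= y <= y1 and dwell_ms >= 500:
                callback(target_id)


activated = []
platform = GazePlatform()
platform.register_target("ok_button", (0, 0, 100, 40), activated.append)

# Simulated raw sample (platform-internal); the app sees only the event.
platform._on_gaze_sample(50, 20, dwell_ms=600)
print(activated)  # ['ok_button']
```

The key property is that the app learns "the user activated ok_button" and nothing about where the gaze went the rest of the time.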
Lots of subtlety and care went into the design, so yes, the first six things you think of as concerns or exploits or problems were addressed, and a bunch more you haven't thought of yet.
If this is a space you care about, read up on HoloLens eye tracking.
It's pretty inexcusable if Apple is providing raw eye tracking streams to app developers. The exploits are too easy and too prevalent. [EDIT ADDED: the article is behind a paywall but it sounds from comments here like Apple is not providing raw eye tracking streams; this is about 3rd parties watching your eyes to extract your virtual typing while you are on a conference call]
> if Apple is providing raw eye tracking streams to app developers
Apple is not doing that. As the article describes, the issue is that your avatar (during a FaceTime call, for example) accurately reproduces your eye movements.
Isn't that a distinction without a difference? Apple isn't providing your real eye movements, but a 1:1 reproduction of what it tracks as your eye movements.
The exploit requires analysing the avatar's eyes, but since those are replicated rather than natural movements, there should be a lot less noise. And because you need to intentionally focus on specific UI targets, these movements are even less natural and fuzzy than if you were looking at a physical keyboard while typing.
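To make that concrete, here is a toy sketch (invented key coordinates and fixation data, not the real layout or the actual attack code) of why deliberate dwell fixations are easy to classify: just snap each recovered fixation to the nearest key centre.

```python
# Toy decoder: eye-typing forces the gaze to settle on each key, so
# fixations recovered from the avatar's eyes land near key centres.

import math

KEYS = {  # key -> (x, y) centre, in arbitrary screen units
    "p": (10.0, 0.0),
    "i": (7.0, 0.0),
    "n": (6.5, 2.0),
    "1": (0.5, -1.0),
}

def nearest_key(fixation):
    # Classify a fixation as the key whose centre is closest.
    return min(KEYS, key=lambda k: math.dist(KEYS[k], fixation))

# Simulated fixations, each slightly off its key centre.
fixations = [(9.8, 0.1), (6.9, -0.2), (6.4, 2.1)]
print("".join(nearest_key(f) for f in fixations))  # pin
```

With natural reading or glancing, fixations scatter everywhere; with intentional dwell-to-select, nearly every fixation is a keystroke, which is what makes the replicated gaze so decodable.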
The difference is that you can't generalize the attack outside of using Personas, a feature which is specifically supposed to share your gaze with others. Apps on the device still have no access to what you're looking at, and even this attack can only make an educated guess.
This is a great example of why ‘user-spacey’ applications from the OS manufacturer shouldn’t be privileged beyond other applications: Because this bypasses the security layer while lulling devs into a false sense of security.
> ‘user-spacey’ applications from the OS manufacturer shouldn’t be privileged beyond other applications
I don't think that's an accurate description, either. The SharePlay "Persona" avatar is a system service just like the front-facing camera stream. Any app can opt into using either of them.
The technology to reproduce eye movements has been around since motion pictures were invented. I'm sure even a flat video stream of the user's face would leak similar information.
Apple should have been more careful about allowing any eye motion information (including simple video) to flow out of a system where eye movements themselves are used for data input.
"technology to reproduce eye movements has been around since motion pictures were invented"
Sure, but as with everything, it's when it becomes widespread that the impact changes. The technology was around, but now it could be on everyone's face, tracking everything you look at.
If this were added to TVs, so every TV was tracking your eye movements and reporting that back to advertisers, there would be an outcry.
So this is just the slow nudging us in that direction.
To be clear, the issue this article is talking about is essentially "during a video call the other party can see your eyes moving."
I agree that we should be vigilant when big corps are adding more and more sensors into our lives, but Apple is absolutely not reporting tracked eye-movement data to advertisers, nor do they allow third-party apps to do that.
The problem is the edge case where it's used for two different things with different demands at the same time, and the fix is to...not do that.
> Apple fixed the flaw in a Vision Pro software update at the end of July, which stops the sharing of a Persona if someone is using the virtual keyboard.
" lot about someone from their eyes. They can indicate how tired you are, the type of mood you’re in, and potentially provide clues about health problems. But your eyes could also leak more secretive information: your passwords, PINs, and messages you type."
Do you want that shared with advertisers? With your health care provider?
The article isn't about the technology, it is about sharing the data.
Does HoloLens also have a keyboard you can type on with eye movements? If not, this seems unrelated to this attack. If yes, then how would it prevent an attack where the other party can see the person's eyes? It doesn't matter whether the tracking data is on-device only when you're broadcasting an image of the face anyway.
I disagree strongly. I don't want big tech telling me what I can and can't do with the device I paid for and supposedly own "for my protection". The prohibition on users giving apps access to eye tracking data and MR camera data is paternalistic and, frankly, insulting. This attitude is holding the industry back.
This exploit is not some kind of unprecedented new thing only possible with super-sensitive eye tracking data. It is completely analogous to watching/hearing someone type their password on their keyboard, either in person when standing next to them or remotely via their webcam/mic. It is also trivial to fix. Simply obfuscate the gaze data when interacting with sensitive inputs. This is actually much better than you can do when meeting in person. You can't automatically obfuscate your finger movements when someone is standing next to you while you enter your password.
You are an expert user, so of course you will demand extra powers.
The vast majority of people are not expert users, so for them having safe defaults is critical to their safety online.
> It is completely analogous to watching/hearing someone type their password on their keyboard,
Except the eye gaze vector is being delivered in high fidelity to your client so it can render the eyes.
Extracting eye gaze from normal video is exceptionally hard. Even with dedicated gaze cameras, it's pretty difficult to get under 5 degrees of accuracy (without training or optimal lighting).
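To put that 5-degree figure in context, a back-of-the-envelope calculation (assumed key size and viewing distance, purely illustrative):

```python
# How many degrees does one virtual key subtend? Assume a ~3 cm wide
# key rendered at ~60 cm from the eyes.

import math

key_width_cm = 3.0
distance_cm = 60.0
angle_deg = math.degrees(math.atan(key_width_cm / distance_cm))
print(round(angle_deg, 1))  # 2.9
```

Under those assumptions a key spans roughly 3 degrees, so 5 degrees of gaze error from ordinary video can't resolve individual keys, while a faithfully rendered avatar gaze can.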
Apple does not provide eye tracking data. In fact, you can’t even register triggers for eye position; you have to set a HoverEffectComponent and let the OS highlight targets for you.
Video passthrough also isn’t available except to “enterprise” developers, so all you can get back is the position of images or objects that you’re interested in when they come into view.
Even the Apple employee who helped me with setup advised me not to turn my head, but to keep my head static and use the glance-and-tap paradigm for interacting with the virtual keyboard. I don’t think this was directly for security purposes, just for keeping fatigue to a minimum when using the device for a prolonged period of time. But it does still have the effect of making it harder to determine your keystrokes than, say, if you were to pull the virtual keyboard towards you and type on it directly.
EDIT: The edit is correct. The virtual avatar is part of visionOS (it appears as a front camera in legacy VoIP apps) and as such it has privileged access to data collected by the device. Apparently until 1.3 the eye tracking data was used directly for the gaze on the avatar, and I assume Apple has now either obfuscated it or blocks its use during password entry. Presumably this also affects the spatial avatars during shared experiences as well.
Interestingly, I think the front display blanks out your gaze when you’re entering a password (I noticed it when I was in front of a mirror) to prevent this attack from being possible by using the front display’s eye passthrough.
Like checking out how you are zeroing in on the boobs. What would sponsored ads look like once they also know what you are looking at every second?
Even some medical ad, where the eyes check out the actress's body.
"Honey, why am I suddenly getting ads for Granny Porn?"
The article is talking about avatars in conference calls, which accurately mirror your eye position. Someone else on that call could record you and extract your keyboard inputs from your avatar.
Enabling "reader mode" bypasses the paywall in this instance