> My inner VR enthusiast thinks the Apple Vision is cool but my inner realist wonders if anyone is really going to use anything beyond smart glasses.
My personal prediction - smart glasses as a smartphone replacement, AR/VR headsets as a power-user workstation machine replacement.
The market segments map out nicely with your prediction as well - almost everyone has a smartphone of some kind (just as I expect almost everyone to have some sort of smart glasses in the future), while a relative minority (even though it is a large one) has power-user workstation machines (just as with AR/VR headsets in my prediction).
That is, at least until the tech gets insane enough to pack the full functionality of an AR/VR headset into the form factor of glasses, with insane battery life to boot. I don't see that happening in any foreseeable future, sadly, barring some transformational and unexpected battery chemistry breakthroughs.
> My personal prediction - smart glasses as a smartphone replacement
I still struggle to see smart glasses as a viable smartphone replacement unless they're paired with some sort of peripheral for private input. Doing everything by voice or expressive gestures around other people isn't going to work.
That's a really good point I totally forgot about. I would expect it to be controlled by a combo of eye-tracking-driven gestures plus some auxiliary input device, like a ring or a smartwatch or something like that.
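To make that a bit more concrete, here's a rough sketch of how such a gaze-plus-ring interaction loop could work; all the names and structures below are made up for illustration, not any existing headset's API:

```python
# Sketch: the eye tracker picks the on-screen target, and a click from a
# ring/watch commits it, so no voice or big hand gestures are needed in public.
# Everything here is hypothetical and purely illustrative.

from dataclasses import dataclass

@dataclass
class UIElement:
    id: str
    bounds: tuple  # (x, y, w, h) in display coordinates

def element_under_gaze(gaze_xy, elements):
    """Return the element the user is currently looking at, if any."""
    gx, gy = gaze_xy
    for el in elements:
        x, y, w, h = el.bounds
        if x <= gx <= x + w and y <= gy <= y + h:
            return el
    return None

def handle_frame(gaze_xy, ring_clicked, elements):
    """Gaze hovers silently; the auxiliary device's click commits the action."""
    target = element_under_gaze(gaze_xy, elements)
    if target is not None and ring_clicked:
        return f"activate:{target.id}"
    return None

# Example: looking at the "reply" button and clicking the ring.
buttons = [UIElement("reply", (0, 0, 100, 40)), UIElement("share", (120, 0, 100, 40))]
print(handle_frame((50, 20), True, buttons))  # -> activate:reply
```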
I agree, for now we have no good, or even barely established, UX/HCI paradigms for hypothetical standalone AR glasses.
Not that we even have those types of paradigms established for currently existing AR/VR devices, but we are getting there slowly. With each year since I first tried the original HTC Vive, every new device and update has slowly but surely made the interactions better, simpler, and more "worked out".
What gives me hope is seeing how touch-only UIs have changed since the original iPhone release. At first, everyone was scoffing big time at touch-only interfaces ever becoming functional, viable, and widely used. The first third-party apps on the App Store were also extremely disjointed and had almost nothing in common with each other in terms of UI/UX. It felt like everything was just spliced together and stamped with "we think this should work." Not casting shade at the devs of those apps, everyone was in that position back then, as there was no established UI/UX for touch-only smartphones.
In 2024? While things are still changing, the pace has slowed overall as cohesive UI/UX principles for touch-only smartphones have been established. And they have indeed become functional, viable, and widely used devices.
Pupil tracking is already in consumer VR devices; I can see it being further miniaturized, especially with advances in waveguiding.
In fact, this might be a great use for Zeiss's holocam tech [0]: a high-resolution, low-definition, grayscale "window" that waveguides some of the light passing through it down to one of the edges, where an image sensor picks it up and decodes it.
This isn't for in-home use; I believe they were talking about use cases similar to using smartphones outside the home. I am not pulling a Bluetooth keyboard out of my pocket on the street when I need to navigate using GPS or look something up.
Btw, pretty much every AR/VR headset I am aware of these days already supports Bluetooth controllers and keyboards. For some keyboard models, you can even have them visible and physically tracked in your VR space (I tried it with a Quest 2 and Apple's wireless keyboard; it worked like a charm).
I was actually thinking of what people use their phones for the most, by far (in terms of what matters to them, and perhaps also in terms of time for many): text messaging. Having a social life others aren't privy to, because it's silent and easy to hide from their eyes.
Keyboards in VR are really bad, and the AVP doesn't have any new ideas there either (looking at keys one by one and pinching is extreme "hunt and peck").
The closest thing to a magic AR solution I can think of is tracking so obscenely good that you can project a touch keyboard onto a screenless slab without annoyance. Maybe.
Unless you can approximate smartphone typing speeds, silently, it's not going to be a smartphone replacement for the masses.
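To put rough numbers on that (all of them assumed for illustration, not measured): if each look-and-pinch keystroke takes on the order of half a second, you end up around 20 WPM, while ordinary thumb typing at roughly a third of a second per tap lands closer to 36 WPM. A quick sketch of that arithmetic:

```python
# Back-of-envelope only: the per-keystroke times and chars-per-word below are
# assumptions for illustration, not measured values.

def wpm(seconds_per_keystroke, chars_per_word=5):
    """Words per minute given an average time per keystroke."""
    return 60 / (seconds_per_keystroke * chars_per_word)

gaze_and_pinch = wpm(0.6)   # assume ~0.6 s to look at a key and pinch
thumb_typing = wpm(0.33)    # assume ~0.33 s per tap when thumb-typing on a phone

print(f"gaze + pinch: ~{gaze_and_pinch:.0f} WPM")  # ~20 WPM
print(f"thumb typing: ~{thumb_typing:.0f} WPM")    # ~36 WPM
```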
> That is, at least until the tech gets insane enough to pack the full functionality of an AR/VR headset into the form factor of glasses, with insane battery life to boot. I don't see that happening in any foreseeable future, sadly, barring some transformational and unexpected battery chemistry breakthroughs.
It's not that far-fetched if you move most of the hardware to a fanny pack or similar. You can probably get pretty close with current smart glasses (or a Bigscreen Beyond) and a (next-gen) Steam Deck.