Having built similar tech (Meta, YC S13), I'd say it's been a great year, with Vision Pro, Orion, Spectacles and more coming out.
Currently at my co, seeing most day to day use out of XReal, and keen for Visor.
The AR/XR/MR/VR app I'm most looking forward to is a 360 location share, with the sharing user in AR, the receiving user in VR, and additional virtual objects shared between them. Orion would be great for the send side (with a few extra cameras) and Vision Pro on the receive side.
The main thing letting down tech today is how open the platforms are for external developers.
The lack of projecting black I don't see as an issue: clip something on for VR (OK, 70 degrees isn't quite enough, but it's getting fairly close), or just dim and use gradients for day-to-day work.
I think we're still at the most basic level in terms of understanding the optical physics; ultra-high-resolution, much smaller devices will come out, though probably not too soon.
I remember Meta (your old Meta, not Zuck's new Meta) had an amazing section of the site where you could submit and browse Kickstarter-style proposals for use cases, and I always wondered where all that creative devkit-type passion went. Probably on a couple of hard drives in a lockup.
Ah yes! I remember working on this. It was born out of asking all the YC founders what they wanted to see - and anyone else we met as well. It formed the basis of our second Kickstarter, which we self-hosted and which was much more successful.
> The main thing letting down tech today is how open the platforms are for external developers.
You mean how closed they are? Apple was bad about this, but I think Meta is pretty good about helping spatial / game devs? Am I wrong about that? I don't work in the space; it's just my impression.
Yes, correct. As in, their degree of openness (not much) is what's letting them down. I see how you could have read it as meaning they're too open, which I definitely don't think is the case!
Hard-edged, per-pixel light blocking is impossible for the foreseeable future. What's possible today, and what Magic Leap has, is diffuse dimming of large areas of the display.
The problem with light blocking is that when the blocker is millimeters from your eye it is completely out of focus. Unlike for the display, you can't use optics to make it appear farther away and in focus because the direction of the light it needs to attenuate can't be modified (or else your view of the world through the glasses would be warped).
For a near-eye light blocker to work, it would need to be a true holographic element which can selectively block incoming light based not just on its position but also its direction. Each pixel would essentially be an independent display unto itself that selectively blocks or passes incoming light based on its direction, instead of indiscriminately like a normal LCD. I have no idea how such a thing could ever be fabricated.
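A back-of-the-envelope way to see how bad the defocus is (the numbers here are my own assumptions, not specs of any shipping device): with the eye focused far away, a blocker sitting a distance d in front of a pupil of diameter p gets smeared over an angle of roughly p/d radians. A tiny Python sketch of that estimate, with an illustrative helper named blur_angle_deg:

    import math

    # Geometric-optics defocus estimate: an occluding "pixel" at distance d from
    # a pupil of diameter p, with the eye focused at distance D, is blurred over
    # an angle of roughly p * |1/d - 1/D| radians.
    def blur_angle_deg(pupil_m, blocker_m, focus_m=float("inf")):
        theta_rad = pupil_m * abs(1.0 / blocker_m - 1.0 / focus_m)
        return math.degrees(theta_rad)

    # Assumed: ~4 mm pupil, blocker ~20 mm from the eye, eye focused far away.
    print(round(blur_angle_deg(0.004, 0.020), 1))  # ~11.5 degrees of smear

So under those assumptions a single blocked point dims a patch on the order of ten degrees wide, which is why you get diffuse dimming rather than a sharp silhouette.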
Damn! That's a throwback. I remember reading about you guys in an airplane magazine once and getting hooked on the concept. I always wondered where y'all went...
A bunch at Vision Pro, some at Zuck's Meta, some at HoloLens, some doing other things. Meron is doing BCI; I'm doing AI infra - strongcompute.com (YC W22)