Selling one product and using the profits to create a second product is how businesses work. So if/when Tesla (or anybody) is selling their humanoid robot, and those profits go toward feeding the NVIDIA supercomputer they're using to train the models that run on the robots, and toward funding development of robot 2.0, that's exactly where we'll be.
To my limited knowledge it's not even clear what the edges are, but I think it's probably safe to say that the bigger it is, the more complexity you can cram in there.
It works as you described: texturing the mesh "live" as you move through it. It does use Three.js as a base, but needs custom shaders to hit 60 FPS.
Matterport works special magic on the transition between 360 images in their SDK. As far as I understand it, they render a 360 cube camera onto the material of the environment mesh as you move, so it looks like you're moving through the real mesh while only seeing the 360 image. I tried to approximate this myself in Three.js but didn't get anywhere near the quality or performance of their work. Homage.
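For anyone curious, here's a minimal sketch of the projection idea (my guess at the technique, not Matterport's actual SDK code): sample a cube map along the ray from the panorama's capture point to each surface point, so the environment mesh gets "painted" with the 360 image. `envMesh`, `panoCubeMap`, and `showPanorama` are hypothetical names for illustration.

```ts
import * as THREE from 'three';

// ShaderMaterial that projects a 360 cube map onto arbitrary geometry
// from the point where the panorama was captured.
const projectedPanoMaterial = new THREE.ShaderMaterial({
  uniforms: {
    panoCube: { value: null as THREE.CubeTexture | null },
    panoPosition: { value: new THREE.Vector3() }, // capture point of the 360
  },
  vertexShader: /* glsl */ `
    varying vec3 vWorldPos;
    void main() {
      vWorldPos = (modelMatrix * vec4(position, 1.0)).xyz;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform samplerCube panoCube;
    uniform vec3 panoPosition;
    varying vec3 vWorldPos;
    void main() {
      // Sample the cube map along the ray from the capture point to this
      // surface point, so the mesh shows the 360 image from any viewpoint.
      vec3 dir = normalize(vWorldPos - panoPosition);
      gl_FragColor = textureCube(panoCube, dir);
    }
  `,
});

// Assign the material to the environment mesh and update the uniforms when
// switching panoramas; cross-fading two such projections should approximate
// the Matterport-style transition.
function showPanorama(
  envMesh: THREE.Mesh,
  panoCubeMap: THREE.CubeTexture,
  capturePos: THREE.Vector3,
): void {
  projectedPanoMaterial.uniforms.panoCube.value = panoCubeMap;
  projectedPanoMaterial.uniforms.panoPosition.value.copy(capturePos);
  envMesh.material = projectedPanoMaterial;
}
```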