
I'm huge on VR -- especially WebVR -- but this isn't what comes to mind when I think of the metaverse opuses in the post. Telepresent chat rooms were never an important part of these stories. That's not where the magic is.

My take is that the metaverse is simply a software system so deep that it becomes part of everyday reality. And we've already built this, but our human-computer interface technology (mostly the software) sucks too much for us to see it. In the meantime we call it the web.

This looks like really cool tech, but I don't think the path to the matrix involves a new protocol, or a new data layer. I think it will come about by extruding our pre-existing software epic -- the web -- into our virtual and augmented environments, in the same way the iPhone compressed the software and ideas that already existed into our pockets.




The problem is that the web itself just doesn't have a standard realtime component (other than Matrix, or possibly XMPP or Solid). Sure, you can provide better HCI technology to expose the websites and web services of today to users via VR/AR... but they end up just being disconnected silos; the equivalent of different tabs in a browser.

The thesis here is that you need some kind of fabric to weave these apps together if you want them to feel remotely immersive in AR/VR. Rather than each app being its own island, surely you want the ability to move your avatar between them, and collaborate and communicate between services and apps, and have some kind of common virtual physical metaphor that you can literally build on as a platform - whether that's with bots or services or whatever. This is the thought experiment we're doing with the VR side of Matrix.


Are WebSockets and WebRTC not realtime enough? They need to bake more, but they're literally standards.
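For reference, this is roughly what the standard browser WebSocket API looks like in use - a minimal sketch, with a made-up endpoint URL and message shape:

    const socket = new WebSocket("wss://example.org/realtime");

    socket.addEventListener("open", () => {
      // Once the connection is up, either side can push data at any time.
      socket.send(JSON.stringify({ type: "hello", from: "client" }));
    });

    socket.addEventListener("message", (event) => {
      // The payload is just an opaque string (or binary blob);
      // what it means is entirely application-defined.
      console.log("received", JSON.parse(event.data as string));
    });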

I wholeheartedly agree that probably the biggest problem that needs solving is that of siloed experiences. But this is also something the web solved long ago with URLs and files on the front, and webby APIs on the back.

If we invent another protocol to solve this problem, are we not simply creating new silos?


WebSockets and WebRTC really are not enough at all, and this is a very widely held misapprehension. WebSockets (and their HTTP/2 equivalent) are just dumb pipes for shifting data - just like a TCP socket, but tunnelled over HTTP. Meanwhile WebRTC defines how to encode and transport real-time media streams over the 'net... but deliberately avoids specifying how to discover or signal the existence of those streams in the first place. By analogy, WebRTC is like a phone line without a phone exchange.
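To make the "no phone exchange" point concrete, here is a rough sketch of the signalling an application has to supply for itself before WebRTC can connect two peers - in this case relaying offers, answers and ICE candidates over a WebSocket. The signalling URL and message format are invented for illustration:

    const signalling = new WebSocket("wss://example.org/signalling"); // hypothetical
    const pc = new RTCPeerConnection();

    // WebRTC hands us ICE candidates, but getting them to the peer is our problem.
    pc.onicecandidate = (e) => {
      if (e.candidate) {
        signalling.send(JSON.stringify({ kind: "candidate", candidate: e.candidate }));
      }
    };

    signalling.onmessage = async (e) => {
      const msg = JSON.parse(e.data as string);
      if (msg.kind === "offer") {
        await pc.setRemoteDescription(msg.description);
        const answer = await pc.createAnswer();
        await pc.setLocalDescription(answer);
        signalling.send(JSON.stringify({ kind: "answer", description: answer }));
      } else if (msg.kind === "answer") {
        await pc.setRemoteDescription(msg.description);
      } else if (msg.kind === "candidate") {
        await pc.addIceCandidate(msg.candidate);
      }
    };

    // Whichever side initiates calls this: create an offer and push it
    // through the signalling channel.
    async function call() {
      const offer = await pc.createOffer();
      await pc.setLocalDescription(offer);
      signalling.send(JSON.stringify({ kind: "offer", description: offer }));
    }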

I agree that the open Web avoids silos through URLs and ad hoc HTTP APIs. But this is insufficient for a hypothetical VR universe of immersive services: you don't want the only way for a user to navigate between services to be 'clicking a link'. And it's arguably a failure that almost every single web service ends up publishing its data via its own incompatible ad hoc HTTP API - for instance, given that Facebook, Slack, HipChat and Skype all provide supersets of precisely the same basic functionality, how come each has its own proprietary custom HTTP API to do the same thing? Wouldn't it be better if there were a standard open data fabric into which they exposed their data?
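As a sketch of what exposing data into a common fabric can look like, this is roughly how a client sends a plain text message into a room over Matrix's open client-server API - the same call works against any homeserver. The homeserver URL, room ID and access token here are placeholders:

    const homeserver = "https://matrix.example.org";
    const roomId = "!someroom:example.org";
    const accessToken = "<access token>";

    async function sendMessage(body: string): Promise<void> {
      // Client-chosen transaction id, so retries aren't delivered twice.
      const txnId = Date.now().toString();
      const url = `${homeserver}/_matrix/client/v3/rooms/${encodeURIComponent(roomId)}` +
                  `/send/m.room.message/${txnId}`;
      await fetch(url, {
        method: "PUT",
        headers: {
          "Authorization": `Bearer ${accessToken}`,
          "Content-Type": "application/json",
        },
        body: JSON.stringify({ msgtype: "m.text", body }),
      });
    }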

The idea here is that VR will suck if it's just a bunch of disconnected apps, and instead you need decentralised primitives for sharing Identity, Avatar properties (e.g. position & orientation), shared VoIP, shared Video, even shared world geometry/assets/physics in order to stitch it all together into a coherent experience, for both users and developers.
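Nothing like this is specified anywhere yet, but as a thought experiment, a shared avatar-pose primitive on such a fabric could be as mundane as a typed event - the event type and field names below are entirely hypothetical:

    // Invented shape; nothing mandates these names or fields.
    interface AvatarPoseEvent {
      type: "org.example.vr.avatar_pose";             // hypothetical custom event type
      sender: string;                                  // decentralised identity, e.g. "@alice:example.org"
      content: {
        position: [number, number, number];            // metres, world coordinates
        orientation: [number, number, number, number]; // quaternion (x, y, z, w)
        avatar_url: string;                            // pointer to shared avatar assets
      };
    }

    const pose: AvatarPoseEvent = {
      type: "org.example.vr.avatar_pose",
      sender: "@alice:example.org",
      content: {
        position: [1.2, 0.0, -3.4],
        orientation: [0, 0.707, 0, 0.707],
        avatar_url: "mxc://example.org/SomeAvatarAsset",
      },
    };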

The web was successful because HTML is so simple, forgiving and flexible - you end up with essentially the minimum viable hypertext system that could succeed. One could probably limp along with a VR Web which is just a bunch of webapps that happen to have chunks of WebVR UI. But it could be so much more, while still being flexible and forgiving and lightweight. At least this is Matrix's hope :)


Apologies for nitpicking, but I find it rather important: WebSockets are not tunneled over HTTP; they are only negotiated over HTTP. Once you are connected, there is no overhead of sending HTTP headers back and forth.
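Concretely, the negotiation is a single HTTP exchange (abridged from the example handshake in RFC 6455); after the 101 response, the same TCP connection carries raw WebSocket frames and no further HTTP headers. Client request, then server response:

    GET /chat HTTP/1.1
    Host: server.example.com
    Upgrade: websocket
    Connection: Upgrade
    Sec-WebSocket-Key: dGhlIHNhbXBsZSBub25jZQ==
    Sec-WebSocket-Version: 13

    HTTP/1.1 101 Switching Protocols
    Upgrade: websocket
    Connection: Upgrade
    Sec-WebSocket-Accept: s3pPLMBiTxaQ9kYGzzhZRbK+xOo=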


Totally off topic, but misapprehension is such an oddball word. It doesn't mean what I would have thought it meant. It threw me for a loop, and I've added it to my list of words that I think people shouldn't use, because there are better, more common words to express the same idea. Thanks!



