Thanks for the reply. Incidentally, I've worked on a bunch of the problems you guys must have had with UX in VR. For example, I've implemented a few virtual keyboards myself ;-)
Are you using straight CEF or have you improved the compositor to composite directly into a texture? IIRC CEF only provides the composited web page in a bitmap, and then you have to do repeated texture uploads, which is going to be a drag.
Awesome to hear that! Would love to check out your work if you have a link / video or anything like that!
We actually forked CEF - and had to make a few changes to allow for the integration we needed. We do use OSR mode and update a texture that way - although we need this CPU-side buffer anyway, since we're sending video frames across the peer-to-peer mesh - so even if we did go straight to the GPU, we would still have to download the buffer from the texture.
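For anyone following along, here's a rough sketch of what that OSR path looks like - not Dream's code, just an illustration of CEF's CefRenderHandler::OnPaint callback handing you the composited BGRA buffer, which you can both upload to a GL texture and pass to a video encoder. Class and field names are placeholders.

```cpp
// Minimal sketch of CEF off-screen rendering (OSR) feeding an OpenGL texture.
// Not the Dream implementation; names are illustrative, and GL_BGRA may need
// glext.h depending on your GL headers.
#include "include/cef_render_handler.h"
#include <GL/gl.h>

class BrowserTextureHandler : public CefRenderHandler {
 public:
  BrowserTextureHandler(GLuint texture, int width, int height)
      : texture_(texture), width_(width), height_(height) {}

  // CEF asks what size it should render the page at.
  void GetViewRect(CefRefPtr<CefBrowser> browser, CefRect& rect) override {
    rect = CefRect(0, 0, width_, height_);
  }

  // CEF calls this with the full composited page as a 32-bit BGRA buffer.
  void OnPaint(CefRefPtr<CefBrowser> browser, PaintElementType type,
               const RectList& dirtyRects, const void* buffer,
               int width, int height) override {
    if (type != PET_VIEW)
      return;  // ignore popup widgets for brevity
    // Full-frame upload; the same CPU-side buffer can also be handed to a
    // video encoder for streaming across a peer-to-peer mesh.
    glBindTexture(GL_TEXTURE_2D, texture_);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_BGRA, GL_UNSIGNED_BYTE, buffer);
  }

 private:
  GLuint texture_;
  int width_;
  int height_;

  IMPLEMENT_REFCOUNTING(BrowserTextureHandler);
};
```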
It's a drag, but there are a number of techniques to improve performance. Resolution is one: the HMD's resolution makes a high browser resolution kind of useless, so reducing it also reduces pressure on the GPU. We can also limit the frame rate based on the kind of content being shown, and we can leverage dirty rects to avoid re-uploading content that isn't changing. Since we're running multiple browser tabs, that last technique isn't as useful within a single page, but it helps when a user is doing multiple things, like watching a video on the shared screen while scrolling through Wikipedia or a news site like NYT.
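To illustrate the dirty-rect and frame-rate points (again just a sketch, not their implementation): OnPaint reports the regions that changed, so only those sub-rectangles need to be re-uploaded, and CefBrowserHost::SetWindowlessFrameRate lets you cap the paint rate per browser.

```cpp
// Sketch: upload only the regions CEF reports as dirty instead of the whole
// frame. Assumes the GL texture already exists at the full page size and the
// buffer from OnPaint is tightly packed BGRA ('width' pixels per row).
#include "include/cef_browser.h"
#include "include/cef_render_handler.h"
#include <GL/gl.h>
#include <cstdint>

void UploadDirtyRects(GLuint texture, const void* buffer, int width,
                      const CefRenderHandler::RectList& dirtyRects) {
  glBindTexture(GL_TEXTURE_2D, texture);
  glPixelStorei(GL_UNPACK_ROW_LENGTH, width);  // row stride of the full buffer
  for (const CefRect& r : dirtyRects) {
    const uint8_t* src = static_cast<const uint8_t*>(buffer) +
                         (r.y * width + r.x) * 4;  // 4 bytes per BGRA pixel
    glTexSubImage2D(GL_TEXTURE_2D, 0, r.x, r.y, r.width, r.height,
                    GL_BGRA, GL_UNSIGNED_BYTE, src);
  }
  glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
}

// Per-browser frame-rate cap: e.g. keep a video tab at 60fps while a
// mostly-static article tab drops to something much lower.
void CapFrameRate(CefRefPtr<CefBrowser> browser, int fps) {
  browser->GetHost()->SetWindowlessFrameRate(fps);
}
```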
Up until we consolidated the build for the Oculus release, we supported OpenVR, and we still do in our code - just not in the Oculus build. We've gotten a lot of interest in the Vive build with this initial release, so we might look to reintroduce it. Before pushing to Oculus, Dream would just launch off the desktop, detect which HMD you had plugged in, and then launch the appropriate platform. Shouldn't be a ton of work to bring it to Steam!
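That launch-time detection could look roughly like this - a sketch using the public OpenVR calls plus LibOVR's ovr_Detect helper (the header locations and that helper are from memory, so treat them as assumptions), not Dream's actual launcher:

```cpp
// Sketch of desktop-launcher HMD detection: prefer an Oculus headset if one
// is attached, otherwise fall back to OpenVR (Vive). Illustrative only;
// header locations may differ between SDK versions.
#include <OVR_CAPI_Util.h>  // LibOVR: ovr_Detect / ovrDetectResult (assumed location)
#include <openvr.h>         // OpenVR: vr::VR_IsHmdPresent, vr::VR_IsRuntimeInstalled

enum class HmdPlatform { Oculus, OpenVR, None };

HmdPlatform DetectHmd() {
  // LibOVR can report an attached Oculus HMD without creating a session.
  ovrDetectResult oculus = ovr_Detect(/*timeoutMilliseconds=*/0);
  if (oculus.IsOculusHMDConnected)
    return HmdPlatform::Oculus;

  // OpenVR reports whether the SteamVR runtime and an HMD are present.
  if (vr::VR_IsRuntimeInstalled() && vr::VR_IsHmdPresent())
    return HmdPlatform::OpenVR;

  return HmdPlatform::None;
}
```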
Does this support VIVE too, or only Oculus?