Wild speculation: iOS 8 will bring an out-of-process, heavily sandboxed webview, which was the missing piece for enabling WebGL for everyone. The greatly increased attack surface (with almost direct gpu access via shaders etc) was too risky to host in in-process web views, which explains why WebGL was only enabled for iAds (since those would be vetted by a review team).
We were talking about this yesterday. I asked how many of the previous issues we classified as security problems would have been mitigated by a sandbox, and the conclusion was that it wouldn't even cover the majority of them.
Most of the security issues we encounter are bugs in the driver. A common bug with the Intel Mac driver, for example, is that when you allocate a valid large texture, it will sometimes come back filled with old GPU memory instead [1]. You can then glReadPixels the data and reconstruct parts of other desktop windows or tabs. A sandbox isn't going to stop you from exploiting this kind of buggy driver if it starts returning other people's data in response to perfectly valid, unrelated commands.
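To make the shape of that bug concrete, here is a minimal sketch (TypeScript, standard WebGL calls only; the canvas setup and the 4096x4096 size are my own illustrative choices, not from the comment above). It allocates a large texture with no initial data, attaches it to a framebuffer, and reads it back; a conforming implementation must return zeroed memory here, and the driver bug described above is precisely a failure of that guarantee.

    // Minimal sketch: probe whether an uninitialized texture leaks stale GPU memory.
    const canvas = document.createElement("canvas");
    const gl = canvas.getContext("webgl") as WebGLRenderingContext;

    const width = 4096, height = 4096;

    // Allocate a large texture with `null` data, so the driver decides what
    // the backing memory initially contains.
    const tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                  gl.RGBA, gl.UNSIGNED_BYTE, null);

    // Attach the texture to a framebuffer so its contents can be read back.
    const fbo = gl.createFramebuffer();
    gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
    gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                            gl.TEXTURE_2D, tex, 0);

    // Read the pixels back. Per the WebGL spec this must be all zeros;
    // the Intel driver bug described above would surface old data here.
    const pixels = new Uint8Array(width * height * 4);
    gl.readPixels(0, 0, width, height, gl.RGBA, gl.UNSIGNED_BYTE, pixels);
    console.log(pixels.some(v => v !== 0) ? "stale data leaked" : "all zeros");

The point of the sketch is that every call in it is individually valid; the leak comes entirely from the driver, which is why a process sandbox around the renderer doesn't help with this class of bug.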
It is coming soon. Firefox Nightly has it as an option, and according to this it's supposed to be an option in Firefox 30. I'm not sure when it will be turned on by default though:
Going along with your plausible speculation, and assuming XPC is used to communicate with the WebGL view (using UIRemoteView maybe), hopefully that means opening up the XPCKit framework as well in iOS 8.
> The greatly increased attack surface (with almost direct gpu access via shaders etc) was too risky to host in in-process web views
What really?
Were they seriously even pretending to use that excuse?
Everyone I've ever talked to was 100% certain it was simply because they didn't want anyone by-passing the app store for games and other immersive content they would then have no control over.
I'm not denying that WebGL is an attack surface (it totally is), but surely it's a bit rich to say that the WebGL implementation in WebKit has been switched off for the last 3 years because of security concerns.
If that was the only reason for not having feature parity with other OSes, they'd have done something about it... oh, you know, 2-3 years ago.
>Were they seriously even pretending to use that excuse?
> Everyone I've ever talked to was 100% certain it was simply because they didn't want anyone by-passing the app store for games and other immersive content they would then have no control over.
Well, "everyone you ever talked to" was wrong. Apple could not care less about web apps by-passing the app store in this regard.
For one, they initially offered web apps in place of a native API, and developers (and users) hated it.
Second, they had for the longest time (and still do, IIRC) the fastest and most complete web browser experience on mobile, including extra JS hooks for things like the accelerometer. Strange coming from a company concerned about web apps by-passing the app store.
Third, most iOS developers don't particularly want web apps. The App Store has half a billion credit cards at the ready and pays billions of dollars to developers each year; it would be extremely difficult to monetize some web-based game for iOS users. Plus, a native app offers far better performance (and possibilities for integration) than some JavaScript/WebGL game.
Fourth, WebGL is not that big a deal on the desktop web yet (most casual games still prefer the canvas API or Flash), so why would it be on the mobile web?
>If that was the only reason for not having feature parity with other OSes, they'd have done something about it... oh, you know, 2-3 years ago.
Well, those other OSes, namely Android, only enabled WebGL in their Chrome browser 9 months ago. And, oh you know, only in the beta version of the browser at that. In fact, "Can I Use" still shows only partial support for WebGL, available for the first time just in the latest Android Chrome version: http://caniuse.com/webgl
> Everyone I've ever talked to was 100% certain it was simply because they didn't want anyone by-passing the app store for games and other immersive content they would then have no control over.
This never made sense to me as an explanation, if only because for a very long time iOS had a far more sophisticated browser than Android; Android only really reached feature parity with the arrival of Chrome for Android, and for years iOS was a much better platform for webapps. If they were so protective of the App Store, they would hardly have put so much effort into best-in-class modern web support; offline web apps are in principle as much a threat to the App Store as WebGL stuff.
While true, to be fair they've hardly put much effort into the browser since iOS 3 or 4. Still far behind on many important web standards. Still do releases like once a year. Haven't improved homescreen webapps at all (every time you click the icon it starts the app from scratch, even if it was already open).
> While true, to be fair they've hardly put much effort into the browser since iOS 3 or 4. Still far behind on many important web standards.
Well, it's much, much faster, and they've added plenty of stuff since iOS 4 (web sockets, for instance, which it had a fair while before Chrome for Android).
I love this useless outlook that Apple's ecosystem arguments (which may or may not be relevant banter, depending on your point of view; I for one am tired of the fighting) automatically mean that all engineering decisions the company makes are ANTI-FREEDOM!!111!!1
It's not like Apple's been a major engineering company at the center of some meaningful tech discussions (RFCs, open source projects sometimes, etc.) for 30+ years. Not respecting other people's technical decisions really just hinders good collaborative work and conversation (especially about security).
The title is misleading. It should read, "There's a WebGL talk at WWDC 2014."
This blogger spotted a talk at WWDC 2014 on WebGL. He speculates this means there's a forthcoming release of Safari with WebGL enabled by default, thus predicting Apple is embracing WebGL.
This sounds convincing to me, well, somewhere around 75% convincing. :)
This will signal the true start of the WebGL era, because the lack of WebGL on iOS has been a real buzz kill for commercial adoption of WebGL outside of pure enthusiast circles -- I know this firsthand, unfortunately, from trying to do B2B deals with http://Clara.io's interactive embed technology -- it is nearly impossible to sell to major clients once they realize that it doesn't work on iOS:
Perhaps you could use Ejecta when they ask for iOS support? Sure, they'd still be disappointed that it doesn't run directly in Safari Mobile, but alternatives like Unity or AIR don't run in it either.
Although you probably don't need it anymore now that you have the remote rendering embeds :)
So is WebGL[1], but it is possible that Apple has chosen to open this up beyond the native Safari app by implementing some additional security measures (e.g. https://news.ycombinator.com/item?id=7783137). Of course, we won't know until WWDC, but it certainly is a possibility.
And for those already wanting to experiment with WebGL on iOS there's always http://impactjs.com/ejecta, which will likely remain preferable for packaged games even once embedded webviews start supporting WebGL, since it has less overhead than a webview.
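For anyone curious what that looks like in practice, here is a tiny sketch (TypeScript; compile to plain JS before dropping it into an Ejecta project). The getElementById("canvas") lookup is how I understand Ejecta exposes its screen canvas; treat that detail as an assumption and check the Ejecta docs. In a regular browser page the same code just needs a <canvas id="canvas"> element, which is the appeal: the WebGL calls themselves are identical in both environments.

    // The same WebGL bootstrap runs in a browser page with a <canvas id="canvas">
    // element, or inside Ejecta (assumption: Ejecta hands back its screen canvas
    // from this getElementById call; verify against the Ejecta docs).
    const canvas = document.getElementById("canvas") as HTMLCanvasElement;
    const gl = canvas.getContext("webgl") as WebGLRenderingContext;

    // Clear the screen to a solid color; just enough to confirm a working context.
    gl.clearColor(0.2, 0.4, 0.8, 1.0);
    gl.clear(gl.COLOR_BUFFER_BIT);
    console.log("WebGL context:", gl.getParameter(gl.VERSION));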
I don't think they're trolling - I have some similar feelings about this too. I think the main point is that outside of document-centric websites (say, Wikipedia), rich webapps are basically grossly abusing HTML to wrangle it into a semi-sane (but not really sane) way to render desktop-style UIs.
Think GMail, think Pandora - neither of which parses logically as a document, but both are made from HTML: horribly hacky, horribly complex HTML at that.
It would seem to me that we're making HTML do something it was never meant to do, and that there are many applications on the web today that really don't need that stack at all, and really just need code and a renderer (like, say, a desktop OS).
Of course, this is a double-edged sword. Simply tossing a renderer to devs is just going to invite a whole slew of confusing, non-standard UIs. Though, that said, if you look at GMail et al., we're already there.
"you do not want to render text or form controls in WebGL"
You don't. I don't. Our users don't. But trust me, there are plenty of ex-Flash devs out there positively ITCHING for a return to the days of un-navigable custom UI elements.
asm.js and WebGL give hope that one day we will be able to forget most of HTML, CSS, and JS like the bad dream they are. At the very least there should be a real choice of in-browser GUI technology and language (no crappy transpilation apart from asm.js, please).
I think CmonDev is referring to apps made with PhoneGap, Cordova, etc., and I agree that using this type of solution for 3D games might not be the best choice if you want to offer a good user experience.