Oh boy here we go... cue the comments about how Apple hates your cat or whatever ridiculous theory people will invent to explain this.
Apple is testing WebGL. Introducing it in iAds is a pilot and will allow them to shake out bugs and such. This is a good thing as it means that WebGL is coming to iOS soon.
Apple often uses frameworks itself in one generation of the OS (Mac or iOS) and then makes them public in the next release, after they've worked out the kinks. This is similar: they are testing out WebGL in something they control tightly; it just happens to be less private than usual, since 3rd parties will get to bang on it via iAds.
I think if you're worried about having a large unvetted attack surface, simply releasing it en masse to beta testers won't usually result in a very secure product. Sure, you'll get some bugs closed, but plenty of others will simply get ignored, worked around, or, worse, reserved by bad actors for when it's live. It only really makes sense to release something like that to a wide audience after you feel like you've made a solid first attempt.
Are iAds really the only, or best, way to test this out? Imagine ad developers given free rein to animate ads, distracting users and taking up precious phone resources like CPU/GPU/RAM/battery that need to be available for the actual game or app the user wants to run.
Haven't we learned anything from Flash ads on the web, or are we forced to repeat it all over again on mobile?
It's a very safe way to test because if there is a bug in the WebGL implementation, Apple can quickly pull the ads that crash WebKit.
Apple has the huge advantage of being able to pull the content, especially if it's a security bug. If Apple enabled WebGL in Safari, it could not react as quickly, since all users would have to upgrade their iOS version.
I totally agree with you about the unpleasantness of ads in general, but iAds don't work like that - a static banner appears first and then if the user taps it, the interactive portion is started.
Oh boy here we go... cue the apologies for Apple, how this is innovative and how it will benefit the platform.
I'm sorry for the snark (not really), but do you have any citations for any of this? I'm a skeptic, and it doesn't help that they still haven't turned on JavaScript acceleration in web views.
Yes, I saw that. That has nothing to do with mobile, nothing to do with Microsoft pushing their native development over the web, which is sort of my implication here (though I did read that one of the embedded web views is Nitro'd; the jury's still out on the other).
No apologies because none are warranted. Just watch Apple for a few years and you'll see how routine this is.
What if WebGL turns out to be a flop? (I know it's unlikely at this point; this is purely hypothetical.) If that were the case, at least they wouldn't have launched some new thing, gotten a bunch of people using it in their apps or on their websites in production, and thus been obligated to support it. They are cautious about introducing new APIs. That's it.
That's fine. I'm in no rush to accuse them of anything here. WebGL is new enough that I'm shocked to hear they're pushing it at all, but I also know nothing about it, and graphics (especially at that level) are as foreign to me as Photoshop and watercolor painting. I'm looking forward to seeing how it goes and/or is adopted more widely!
Although it's easy to look at this in a sinister light, the simpler explanation IMO is just that Apple doesn't feel entirely comfortable with its implementation of WebGL on iOS and wants to be able to screen apps instead of turning it into iOS's Flash.
I assume there's probably a risk of arbitrary code execution, which is mostly mitigated by vetting each ad. Too bad though, would be nice to see what could be built for iOS web apps.
I'd love to know what the imagined risk would be - as far as I can see the only additional capability from WebGL against regular JavaScript is to send commands and data to and from the GPU. What harm can that do?
There are a few issues here. One is that bugs in the GPU itself can enable memory corruption (and thus code execution), and the other is that GPU drivers are notoriously buggy and huge, making a large attack surface. I can't blame them for being concerned in this case -- opening the GPU to web developers at large opens up a huge can of worms.
I hope that Apple has enough clout with their GPU providers to get them to fix their driver bugs and implement ARB_robustness. Seriously, if anyone can ship a good and safe implementation of WebGL it's Apple. They own the hardware and the software.
It's not really the GPUs themselves that are the issue (once code is on the GPU I don't, as far as I know, see a way to hijack the system); it's the drivers that are the problem. Before your shaders can run on the GPU they have to be compiled by the driver, and that's where the funny stuff can occur, I believe. You could also maybe find a way to interfere with other applications using the GPU for rendering.
Hopefully someone here has a good understanding of this stuff and can tell me if I have the risk wrong.
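To make the attack surface concrete: in WebGL the page hands raw GLSL source straight to gl.shaderSource()/gl.compileShader(), and the compiler that parses it lives in the GPU driver. Here's a purely hypothetical sketch of a pre-validation pass a browser might run before the driver ever sees the source (real browsers do far more, e.g. translating through a validator; the specific limits below are invented for illustration):

```javascript
// Hypothetical sketch: screen untrusted GLSL source before the driver's
// compiler sees it. The limits and patterns are made up for illustration.
function validateShaderSource(src) {
  const MAX_SOURCE_LENGTH = 16384; // cap compiler workload (arbitrary limit)
  if (typeof src !== 'string') return false;
  if (src.length > MAX_SOURCE_LENGTH) return false;
  // Reject preprocessor directives that exercise the driver's front end.
  if (/#\s*(include|pragma)/.test(src)) return false;
  // Reject bytes outside printable ASCII + whitespace, which some
  // driver parsers have historically mishandled.
  if (/[^\x09\x0a\x0d\x20-\x7e]/.test(src)) return false;
  return true;
}
```

The point, per the comment above, is that without some screening layer in between, every web page effectively becomes a fuzzer for the driver's shader compiler.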
once it's on the GPU I don't as far as I know see a way to hijack the system
I think GPUs can do DMA => if you can break into a GPU, you may be able to write into kernel space, just like you could (can?) break into a Mac over its FireWire port (and, possibly, the Thunderbolt port in new MacBooks).
GPUs don't respect CPU memory protection -- they can write to any region of mapped memory. If you happen to have some of your program code living there you can "draw" over it with replacement code.
What you wrote doesn't make any sense to me; you seem to imply there is an inherent problem with this by design, but there is no such thing. There are possible bugs that could lead to arbitrary code execution by exploiting as-yet-undiscovered holes in current drivers. But if you could simply "draw" over program code, then even ordinary browser content could overwrite program code, especially in these days of GPU-accelerated browsers.
It's only a problem if the bits of memory you want to enforce special protection rights on (i.e. no-execute, read-only, etc.) are also mapped into the GPU's address space.
It's not something that "just happens" normally, it most often requires clear intent to do so, but it can be done.
It all depends on the implementation though, right? If Safari just does a simple check to see if the source URL is something like http://adserver.apple.com then it would be trivial to hack DNS and send WebGL to your iOS device.
I can definitely see this being an attack vector used by jailbreakers to jailbreak iOS 5. I think the only way they could prevent this type of DNS-hijacking attack is to use a trusted SSL certificate on their iAd servers. I'm not sure what performance penalty that would add, forcing all advertising traffic over SSL.
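To make the DNS-spoofing point concrete, here's a hedged sketch (the adserver.apple.com hostname comes from the comment above; nothing here is Apple's actual check). A plain string comparison on the URL trusts whatever DNS resolves to, so an attacker who controls resolution passes it trivially; additionally requiring HTTPS means the server must also present a certificate that validates for that hostname at load time:

```javascript
// Naive check: trusts the hostname as written. DNS spoofing defeats this,
// since the name can resolve to any server the attacker likes.
function isTrustedAdSourceNaive(url) {
  return new URL(url).hostname === 'adserver.apple.com';
}

// Slightly better sketch: also require HTTPS, so the TLS handshake
// (certificate validation against the hostname) must succeed as well.
function isTrustedAdSource(url) {
  const u = new URL(url);
  return u.protocol === 'https:' && u.hostname === 'adserver.apple.com';
}
```

Even the second version is only a sketch; in practice you'd pin the certificate or restrict the trusted CA set, which a platform vendor like Apple is well placed to do.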
They don't, but that's a security restriction. They can't grant the dynamic code-signing entitlement (required to enable the JIT) to all apps, or the security protections it disables would then be useless.
Surely the same page running on the same browser on the same device but with 2 different javascript engines depending on where the page was launched from can best be described as a bug. A design bug, perhaps, but a bug.
How does this make any sense? Apps from the iOS App Store can use WebKit views where WebGL could apply, but native apps have had direct access to OpenGL ES for years (only the original iPhone, iPhone 3G, and first-generation iPod touch lack the OpenGL ES 2.0-class GPU that WebGL targets). I was pretty sure iAds only ran in native apps and hence could use OpenGL directly. So why would iAd developers have any interest in using WebGL?
Obviously web site developers and HTML5 app developers care very much about WebGL access in Mobile Safari. If Apple 'artificially' holds back WebGL they will just be shooting themselves in the foot. Imagine the damage if other tablet platforms allow interactive 3D web browsing and the iPad does not. It would be suicidal.
Aside from the security concerns, I suspect Apple would prefer game developers to write native apps rather than find itself constantly being benchmarked against last night's custom build of Android or Chrome or Firefox running a WebGL game no-one actually plays.
I wonder if this means WebGL will be in Android ICS. Sony Ericsson already showed a WebGL demo on a phone earlier this year, and if Apple is starting to implement it in iOS, then Google could do it too. I just wonder whether they consider it a priority, because when asked at I/O about WebGL coming to Android's mobile browser, the WebGL guys weren't too sure that will happen anytime soon. But I sure hope so! Upcoming dual-core and quad-core chips should be able to take advantage of it somewhat.