If anybody's interested in why there's aliasing, it helps to know how CSS transforms work. When you apply a transform, the browser lifts the transformed rectangle into its own layer and renders its content to a texture. Later, that texture is composited with the page by blitting it with whatever transform was applied.
So CSS transforms are an application of texturing, and in texturing the way you avoid aliasing is mipmapping (you might also throw anisotropic filtering on top). However, since the texture holding the rect the browser rendered isn't a square power of two, some devices can't mipmap it. Regenerating the mipmap for a texture every frame is also quite slow.
So, with CSS transforms unable to utilize mipmapping, you get aliasing.
Now, mipmapping text in particular (text that's supposed to be sharp) isn't even the best idea. It just gets blurry and looks unappealing (anisotropic filtering improves that, but then, anisotropic filtering is also expensive and not every device/GPU supports it).
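For the curious, here's roughly what that limitation looks like in raw WebGL 1 (a minimal sketch; the texture size is just illustrative, not what any browser actually allocates):

    // A layer texture like the one the browser renders is usually not a
    // power of two, e.g. 800x213. In WebGL 1 such a texture cannot be
    // mipmapped at all.
    var gl = document.createElement('canvas').getContext('webgl');
    gl.bindTexture(gl.TEXTURE_2D, gl.createTexture());
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 800, 213, 0,
                  gl.RGBA, gl.UNSIGNED_BYTE, null);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER,
                     gl.LINEAR_MIPMAP_LINEAR); // request mipmapped minification
    gl.generateMipmap(gl.TEXTURE_2D); // INVALID_OPERATION: not power-of-two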
Speaking as an old guy who saw Star Wars in the theater, it looked like that in 1977 as well. Not because of aliasing problems, but because they hit the resolution of the 35mm film and transfer process.
It looks fine on a Retina MBP. The farthest text gets a little tiny bit of flickering maybe, but it looks sharp all the way until it fades out. Nothing like that.
The flickering is what kaoD referred to with discretely changing aliasing. It shouldn't happen, ideally. Small pixels make it a little less jarring but still noticeable.
Well, in CSS you can't do anything about it; that's just how the browser works.
If you were doing WebGL, you could do some things, like using square power-of-two textures and enabling mipmapping and, if available, anisotropic filtering.
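Something like this (a sketch; the extension lookup is the real WebGL 1 API, the texture size is just an example):

    // With a square power-of-two texture, mipmapping works...
    var gl = document.createElement('canvas').getContext('webgl');
    gl.bindTexture(gl.TEXTURE_2D, gl.createTexture());
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1024, 1024, 0,
                  gl.RGBA, gl.UNSIGNED_BYTE, null);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER,
                     gl.LINEAR_MIPMAP_LINEAR);
    gl.generateMipmap(gl.TEXTURE_2D); // fine: power-of-two dimensions
    // ...and anisotropic filtering can go on top, where supported.
    var aniso = gl.getExtension('EXT_texture_filter_anisotropic');
    if (aniso) { // not every device/GPU exposes the extension
      gl.texParameterf(gl.TEXTURE_2D, aniso.TEXTURE_MAX_ANISOTROPY_EXT,
                       gl.getParameter(aniso.MAX_TEXTURE_MAX_ANISOTROPY_EXT));
    }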
You can also do things like use signed distance field fonts, or draw fonts by rasterizing (in the fragment shader) the bezier curves, which can be made to anti-alias nicely (using standard derivatives).
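The distance field trick, sketched (u_sdf and v_uv are made-up names, and the texture's alpha channel is assumed to hold distance-to-edge; WebGL 1 needs gl.getExtension('OES_standard_derivatives') for fwidth):

    // Fragment shader for signed-distance-field text: re-threshold the
    // stored distance per pixel, sizing the anti-aliasing band by the
    // screen-space derivative so edges stay crisp at any scale or angle.
    var sdfFragmentShader = [
      '#extension GL_OES_standard_derivatives : enable',
      'precision mediump float;',
      'uniform sampler2D u_sdf; // distance field glyph atlas',
      'varying vec2 v_uv;',
      'void main() {',
      '  float d = texture2D(u_sdf, v_uv).a - 0.5; // signed distance to edge',
      '  float w = fwidth(d);                      // ~one pixel of screen space',
      '  gl_FragColor = vec4(vec3(1.0), smoothstep(-w, w, d));',
      '}'
    ].join('\n');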
Is this to say that CSS 3D transformations on text are bound to be aliased [in similar circumstances]? Is there a suggestion for the spec, such as a "mipmap hint", that could (one day) be added to reduce this flaw without the need for other technologies?
> draw fonts by rasterizing (in the fragment shader) the bezier curves
So, ideally, browsers would be doing this themselves, after applying the transform to any non-text in the texture, and then compositing with the combined texture?
Gotta preload that audio. Even 100ms of lag between the logo showing up and the music starting is noticeable. It's jarring even for (presumably) most of us, who have seen that opening crawl dozens, hundreds, or (for children of the 70s who went on to own laserdisc players in the 80s) possibly thousands of times.
In any other context, you could safely pull off not preloading. But this example is just too ingrained in people's minds.
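One way to do it, sketched (the file name and the startCrawl hook are hypothetical):

    // Buffer the music first, then start the crawl and the audio together.
    var music = new Audio('main-title.mp3'); // hypothetical file name
    music.preload = 'auto';
    music.addEventListener('canplaythrough', function onReady() {
      music.removeEventListener('canplaythrough', onReady); // fire once
      music.play();
      startCrawl(); // hypothetical hook that kicks off the CSS animation
    });
    music.load();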
This sounds like 3D is turned off in your Chrome. Can you please check your settings in chrome://flags? Maybe "GPU compositing on all pages" is turned off; if so, turn it on.
I have the same issue with no text after the logo. My "GPU compositing on all pages" was set to the default, but after changing it to "enabled" I still see no text. Version 31.0.1626.5 dev-m Aura SyzyASan
Back in 1994, my 486 had no issue rendering the Star Wars opening crawl in a sweet game, Star Wars: TIE Fighter. Fast forward 20 years, and thanks to the wonders of the web, my computer can't even render it correctly. I get a blank screen after the Star Wars logo. So much for progress.
To a tech-naive eye, this CSS version looks and sounds much better than the 486 version. If I had to show my granny my first PC (a ~20-kilo 486) playing the game intro and a Retina iPad rendering this CSS effect, she'd probably be amazed by tech progress in these last 20 years. Further, I can't see how the next 20 years could produce a greater improvement in this render than the past 20 did. It's probably just that we can't fully appreciate progress from within...
If I remember correctly, the crawl converged at a point about a third from the top in the original, while in this one it scrolls off the top of the screen (at least in the default view). That's a pretty big oversight.
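If the author wanted to change that, the vanishing point in CSS 3D is controlled by perspective-origin on the perspective container. A hedged sketch (the selector and values are guesses, not taken from the demo's source):

    // Move the vanishing point up so the crawl converges about a third
    // of the way from the top, as in the film. (Older WebKit builds need
    // the -webkit- prefixed properties.)
    var scene = document.querySelector('.crawl-container'); // hypothetical
    scene.style.perspective = '400px';
    scene.style.perspectiveOrigin = '50% 33%'; // x, then y, of the vanishing point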
Ha. This is something I made. It never generated as much traction as my Pure CSS3 Lightsaber Checkboxes [1], but if you wait until the end of the crawl, you can take lightsabers and swing them around and hit HTML elements. Pretty neat demo, but, overall, completely useless.
The code's also on GitHub [2]. It uses HTML5, CSS3, and Box2dJS.
It is SVG. It's just your browser (I'm guessing Chrome) that chooses to render it at a fixed size and then scale it up. It's perfectly crisp in IE and Firefox, for example.
Cool, but sadly, to me this is almost a parody of how far modern web dev (HTML5/JS/CSS) is behind the technology that actually exists in the world today (this was originally created in 1977). Web standards are good, but they could be much faster and far more "open" and expandable.
One suggestion: don't just create a standard for (monopoly) ECMAScript; instead, create a standard for a VM that other languages can compile bytecode to. Create an API in this VM for graphics, video, audio, etc. This may be self-defeating for these clowns, because JavaScript/CSS/HTML may be decimated in a decade if they actually did this, but hey... one can hope.
Once Google firmly owns the browser (like Microsoft in the 2000s), one of two things will happen. One: they say to hell with the standards board and push real innovation and opportunity for developers. Two: they sit back and let it stagnate. Either way, the cycle is coming around again. Android will own the market in a big way soon. Big opportunity is on the horizon, and I'm getting my hopes up that there will be a big shake-up in "browser standards". End the monopoly of lame markup, styling, and scripting (HTML/CSS/JS) in the OS of the internet.
"this is almost a parody of how modern web dev innovation HTML5/JS/CSS are so far behind the technology that actually exists in the world today (this was originally created in 1977)"
That's a weird opinion. The original was analog, created by several people, and shot carefully under specific conditions. This HTML version was made by a guy in his bedroom in a fraction of the time, and it's shareable, editable, and viewable on many devices, including mobile, instead of just in a movie theatre like the original. To me this is innovation across the board (not only in web standards, but in technology overall).
> Don't just create a standard for (monopoly) ECMAScript; instead, create a standard for a VM that other languages can compile bytecode to. Create an API in this VM for graphics, video, audio, etc.
Please no. That's the opposite of where we should be going. Less executable code on the client, more powerful markup.
JavaScript was a huge mistake that we will likely have to live with for decades, let's not compound it.
"Less executable code on the client, more powerful markup."
Well, since when it comes to layout, the union of everybody's desires appears to be simply Every Possible Thing, your more powerful markup is going to end up being executable code anyhow. If you specify a Turing-complete set of needs, you're going to need a Turing-complete language to satisfy them.
You can already see this happening even in CSS, and the more they try to pretend that CSS isn't code (even though it increasingly is), the more frustrating it gets for everyone when you end up with a bug in your CSS that you aren't allowed to fix because we're all pretending it's not code.
> Less executable code on the client, more powerful markup. JavaScript was a huge mistake that we will likely have to live with for decades, let's not compound it.
That ship has well and truly sailed. Mozilla and Google both want to make the web into a general-purpose app platform, and they have veto power over any proposed standards. Admittedly, they each want a slightly different version of the idea (asm.js vs Dart or NaCl), but that gridlock just leaves us investing even more heavily in the current JavaScript ecosystem.
> One suggestion: don't just create a standard for (monopoly) ECMAScript; instead, create a standard for a VM that other languages can compile bytecode to. Create an API in this VM for graphics, video, audio, etc. This may be self-defeating for these clowns, because JavaScript/CSS/HTML may be decimated in a decade if they actually did this, but hey... one can hope.
How would that bytecode be any less of a "monopoly" than ECMAScript is now?
Web standards may be "far behind" native code in what they can do, but they are far, far, far ahead in terms of giving some control back to the user, whether it's enabling browser extensions, letting content be viewed without running arbitrary code, doing presentation transformations (e.g. Readability), having well-defined semantics for sharing content (links), etc.
Your suggestion would grant some more power to developers, at the expense of users. So thanks, but no thanks.
"create a standard for a VM that other languages can compile bytecode to. Create an API in this VM for graphics, video, audio."
There already is a standard VM with an API for graphics, video, and audio: it's called HTML5, and, as a bonus, it also includes an API for text layout. Sure, if you sat down intending to design such a VM you might not come up with HTML5, but so what? What concrete improvement would you get by distributing programs as bytecode rather than JavaScript?