> One person, working mostly alone, produced better quality drivers than a whole team working at Qualcomm
Does anyone have any insight into how this ends up happening? Are Qualcomm just understaffing that team, hiring incompetent people, deprioritizing it (let the intern do it)?
>And because nobody seems to care about these issues
The word "quality" is very subjective in this case. Perhaps the driver already did everything Qualcomm needed it to do. Maybe the code isn't perfect, but if it's working and money is being made... "if it ain't broke, don't fix it". I'm sure a non-trival amount of people on HN have worked in less-than-good codebases, but finite resources & politics kinda prevents you from cleaning it all up with weeks of re-factoring and testing. I know I've seen this.
One reason I've seen these projects (take an existing system and make it work well) fail to gain traction is an unstated lack of faith in the people pushing for refactoring. In my experience this often comes from others having been burned by their prior proposed projects, where they've wasted 6 months doing self-indulgent refactoring that went nowhere or introduced bugs or made it hard for the rest of the team to contribute because of a bunch of new abstractions "to make the system flexible and generalized". The same people who care about code quality enough to argue for a cleanup project sometimes go overboard and burn their credibility. Unfortunately the right mix of judgment is hard to evaluate in any sense that isn't super-context-dependent.
Finite resources? Qualcomm is doing almost $20B in revenue, absolutely owns the market for Android superphones at the moment, and the Adreno GPU is the signature element of their market-leading SoC. Yet they can't make the driver work right per the relevant specs?
As someone who has some experience in big companies like that:
1 - Hardware companies DO NOT KNOW how to develop software. The one that knew best was Nokia, and see what happened to them. That's why mobile vendors are moving to Android or "pre-packaged solutions".
2 - They usually go for the pre-packaged corporate BS "solutions" for their development process, like Java on underpowered hardware, "solutions" that involve shuffling huge amounts of XML, "tools" like ClearCase, etc. Don't think for a second this doesn't affect software quality (for the worse).
3 - The way people end up in companies like that is "I'm focusing on a stable job", so knowledge of the latest technologies (like OpenGL - "latest" in a very wide sense) tends to suffer.
Having worked with a company that ported applications to the BREW platform, I can assure you that Qualcomm's QA process is "Does it work for the applications we ship?".
This leads to fun things like using the threading API causing a device reset, and the device reset API doing nothing.
Hardware companies are usually terrible at software - there just seems to be this incompatible worldview, even when they work very hard on the problems.
Having had some experience fiddling with consumer grade routers, I am convinced that nothing could have produced the resulting mess that is supposed to be the software, other than sheer unadulterated apathy.
After hundreds of different router models, it's vexing that so many of these companies are/were unable to come up with software as decent as DD-WRT/OpenWrt to work on their OWN hardware. It's not even standardized across models. Why the hell would you use completely different stuff on two different but ultimately similar products you have? Why not just license one of the existing open-source solutions and call it a day?
Surprise, surprise: the only people who do are Apple. AirPort Extremes are genuine pleasures to set up; there's literally an app for your iPhone that lets you do most basic tasks.
The way Apple products work makes sense if you imagine they were imported from some alternate world where Apple has monopolies in each category it competes in. If you really do only have Macs, iPhones, Airport routers, AppleTVs, etc. it all works together really well. An all-Mac household has a similar feeling to an all-Windows enterprise office--everything magically talks to everything else and sets itself up and enforces policies it gets politely asked to by the domain.
I kind of wish there was some open standard on this OS interoperation glue stuff -- X11+NFS+Kerberos+MIT client-cert auth was just about right for the early 90s, but we screwed it all up somehow instead of evolving it into something competitive with the two big modern closed-source environments. I mean, Linux got WINS-based zero-config link-local peer name resolution from Microsoft, Bonjour/mDNS-based resolution from Apple, but this was never a problem the FOSS community (or even the BSD folks) tried solving themselves. When there are real open-spec solutions for problems, Microsoft and Apple both seem happy to implement them; Linux just doesn't seem to want to do anything other than playing catch-up when it comes to multi-device experience.
From when I was into building PCs and overclocking, I remember all the horrible value-add tools that came on CDs with motherboards etc. They were sure fond of their homemade UI widgets and non-rectangular masked windows.
That is more of an Asian cultural thing that is immediately clear when you visit a Chinese computer store and see blinking non-rectangular widgets obscuring anything that is not a computer.
It's a bit much to say that one person is beating Qualcomm. Keep in mind that Qualcomm's driver is custom code, while the freedreno code is based on Mesa/Gallium. So, it's already built on thousands of man-hours of work, plus all the engineering effort that Intel and Red Hat have put into the OSS graphics stack.
That's fair, but there's another way to look at it: Mesa and Gallium are MIT licensed, why are they rolling their own inferior solution when they can "steal" a better one? You'd think if your company's bottom line was built even partially on its software, the phrases "permissively licensed" and "better than we can do" alone would be enough to get the (glx)gears turning. Why did a single hobbyist make a smarter technical decision than a team of professionals?
EDIT: From yet another perspective, maybe rolling your own crappy driver infrastructure and OpenGL implementation that only you and your colleagues understand is the smartest decision of all.
This is just speculation, but my biggest guess is that open source moves fast, and you have two choices: keep up with upstream (release your code and let the Mesa community maintain it), or maintain a years-old fork internally.
Building your own might sound like a great idea when you first start out: you can fix any bugs you want without having to trudge through the giant amount of open-source code and keep up with all the architecture changes since.
Also, sometimes companies make dumb decisions. The fact that Mali and Qualcomm share some code smells to me like they've licensed a GL implementation from a third-party, which could be a reason why we aren't seeing any bug fixes: they don't have the source code to their own GL implementation.
Which just demonstrates how stupid it is that all these hardware companies maintain their own proprietary garbage blobs when they could just grow up and use a common implementation that is reviewed and maintained by half a dozen companies.
As others have said, there probably isn't a tangible business justification to go back and fix a problem that isn't materially impacting a large percentage of customers. A lot of managers follow the Pareto (80-20) rule, which just means fix the top 20% of bugs, and 80% of your problems go away.
The cost of actually fixing a bug is proportional to the amount of time which has elapsed since the code was written. The cheapest time to fix a bug, of course, is right when a programmer is initially writing something, and it only gets more expensive as time goes by. This is why writing unit tests is so important; if you can catch bugs up front you'll end up saving a lot of pain and agony of trying to go back and figure out what went wrong.
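For instance, a minimal sketch in C++ of what "catching it up front" means; the helper, the test values, and the clamping rule are all hypothetical, just to illustrate checking a boundary mistake at write time instead of years later:

    #include <cassert>

    // Hypothetical helper: clamp a texture LOD bias to the allowed range.
    float ClampLodBias(float bias, float max_bias) {
        if (bias > max_bias) return max_bias;
        if (bias < -max_bias) return -max_bias;
        return bias;
    }

    int main() {
        // Cheap checks written alongside the code catch sign and boundary
        // mistakes immediately, instead of surfacing as a bug report later.
        assert(ClampLodBias(5.0f, 2.0f) == 2.0f);
        assert(ClampLodBias(-5.0f, 2.0f) == -2.0f);
        assert(ClampLodBias(1.0f, 2.0f) == 1.0f);
        return 0;
    }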
Another important factor is there are a lot of politics in big companies. If there isn't an advocate (usually an engineer) who demands that a bug gets fixed, the bug probably isn't going to be fixed. The problem with advocating for something though means that you need to spend some political capital. When it comes to bugs, unfortunately there isn't a lot of positive political capital that you'll recapture as a result of fixing it. Managers aren't going to get praised for the lack of bugs in a particular piece of software, unless the software was buggy to begin with, someone complained loudly about it, and they went in and cleaned it up. If that happens and they were responsible for the bugs to begin with, they're probably not going to want to highlight the fact that it was their fault.
I hope Dolphin can take a look at iOS 7 and see how complete Apple's implementation of OpenGL ES 3.0 is:
> OpenGL ES 3.0 includes as core functionality the features of many extensions supported in OpenGL ES 2.0 on iOS. But OpenGL ES 3.0 also adds new features to the OpenGL ES shading language and new core functionality that has never been available on mobile processors before, including multiple render targets and transform feedback. You can use OpenGL ES 3 to more easily implement advanced rendering techniques, such as deferred rendering.
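For reference, multiple render targets in ES 3.0 boil down to something like the sketch below; it assumes the two color textures already exist and are complete, and error checking is omitted:

    #include <GLES3/gl3.h>

    // Sketch: attach two color textures to one FBO and render into both at once.
    // Assumes tex0 and tex1 are existing GL_RGBA8 textures of the same size.
    GLuint SetupMrtFramebuffer(GLuint tex0, GLuint tex1) {
        GLuint fbo;
        glGenFramebuffers(1, &fbo);
        glBindFramebuffer(GL_FRAMEBUFFER, fbo);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                               GL_TEXTURE_2D, tex0, 0);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1,
                               GL_TEXTURE_2D, tex1, 0);

        // Route fragment shader outputs 0 and 1 to the two attachments.
        const GLenum bufs[] = { GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1 };
        glDrawBuffers(2, bufs);
        return fbo;
    }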
However, Dolphin is most definitely not allowed on iOS devices, so I don't think they care. I'm sure someone will make a port for jailbreak users eventually, but as far as I can see, that's a lot more difficult these days than when I last had an iOS device.
> However, Dolphin is most definitely not allowed on iOS devices
I assume you're referring to "no interpreted code".
I believe the rules are that you can interpret code, as long as you supply all of the code up-front.
So if you're using Lua you can appear on the App Store, as long as your app contains all the .lua files or strings or whatever in the app bundle that gets reviewed by Apple.
So Dolphin could be allowed on iOS devices if it shipped with the games it was going to emulate, and those games didn't download code.
iOS also does not provide APIs to map executable memory, which is required to run our JIT. You could run Dolphin in interpreter mode, but it's incredibly slow even on a high-end Intel CPU.
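Concretely, a JIT needs memory that is both writable and executable, roughly like the sketch below (not Dolphin's actual code); it's that PROT_EXEC request that iOS refuses for ordinary apps:

    #include <sys/mman.h>
    #include <cstddef>

    // Rough sketch of what a JIT needs: an anonymous mapping it can write
    // machine code into and then execute. iOS rejects this for normal App
    // Store apps, so a JIT can't run there at all.
    void* AllocJitBuffer(std::size_t size) {
        void* mem = mmap(nullptr, size,
                         PROT_READ | PROT_WRITE | PROT_EXEC,
                         MAP_PRIVATE | MAP_ANON, -1, 0);
        return mem == MAP_FAILED ? nullptr : mem;
    }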
Dolphin couldn't possibly ship with those games though, as the project doesn't have any such distribution rights. You're expected to provide your own game dumps.
The iOS 7 GLES implementation is almost certainly written and delivered by Imagination Technologies. Surely Apple has customized it a little, and being (by a huge margin) Imagination's most important customer, Apple has strong influence to drive features where it wants. But Apple isn't the GPU vendor.
In this case it is likely mostly about this being Qualcomm's first version of the GLES 3 driver, and updating a driver for a mobile phone isn't as straightforward as updating Mesa. It is likely to get better.
If I were GabeN and wanted to sell the SteamMachines and distribute SteamOS, I'd offer to help these companies debug their OpenGL drivers so that Linux/SteamOS driver support is top notch by the time SteamOS and SteamMachines launch in 2014.
Helping the FOSS community get better OpenGL driver support would earn Valve Software a lot of goodwill. Projects like Dolphin will benefit from it, and of course those Steam Linux games will as well.
I really want to see the Linux gaming platform take off; many game developers avoid Linux ports because of buggy OpenGL drivers and poor support. I think Microsoft went through this with DirectX: I remember that early on, DirectX drivers were buggy and game companies wanted to stick with DOS-based games rather than port to Windows, because DOS had better video driver support. Once Microsoft worked with video card OEMs to improve their drivers and DirectX support, game companies ported their games to 32-bit Windows (Windows 9x and later NT/2000/XP) because of the DirectX libraries, better drivers, and fewer bugs. It will take a company like Valve to coordinate OpenGL stability between OEMs to get it solid enough to port games to Linux.
I've been gaming on Linux with both native and Wine/CrossOver games (free year thanks to Bush) for over 8 years now. Drivers are the biggest issue. There are 2 main issues here. The first is quality: the video card manufacturers need to dedicate additional resources to bring the drivers up to the quality of the Windows ones.
The second issue is with the driver subsystem. The kernel developers need to either a) stop breaking the existing driver interface every other release, or b) create a new, modern interface that's stable, like MS did with Vista.
Imagine if, with every Windows update, you had to worry about your games not working because MS might break the drivers. This is how Linux is now. The situation is especially bad for users with old and very new cards. For example, my X1900 can't run the Catalyst drivers with any modern kernel, making it useless in Linux as my spare gaming rig. Of course it works perfectly in Win7, running about 2-5x faster than the worthless, buggy, open source drivers.
Fixing the video driver subsystem is exactly the kind of dedication that would prove to me that Valve is serious about making Linux a competitive gaming platform, and would likely convince me to purchase a SteamMachine.
The kernel developers will never stop breaking internal interfaces. Being able to break things internally is how they can move and improve as fast as the kernel does. Programs written against Linux's external interfaces many moons ago will continue to work indefinitely; the issue is that video drivers need to work with the kernel's internals, and arguably even violate the GPL.
AMD have been shit since forever. There should be a message with giant letters printed on each box saying "THIS PRODUCT DOESN'T HAVE LINUX DRIVERS WORTH A DAMN".
Not that it helps your situation now, but maybe if you upgrade in the future, try with an nvidia card?
Qualcomm being at the bottom is counter to my experience in writing GL ES 2.0 apps on Android (since the Motorola Droid!). Adreno generally performs very well, and doesn't have weird performance characteristics like Mali and SGX. I also like Tegra because it has solid predictable performance. A big part of why Qualcomm have been cleaning house in the Android world is because they have a more solid BSP, so compared to the other ARM guys they have a pretty good software group.
I don't use most of the extensions that the Dolphin guys are using, though, and I wouldn't expect anything but the most basic ES 3.0 stuff to work. But maybe spending so many years on mobile has taught me to have low expectations! :).
> But maybe spending so many years on mobile has taught me to have low expectations! :).
This, and the post is not solely about performance. In fact, that's only a minor component; most of the notes and complaints are about outright bugs and broken implementations.
And of course, you "wouldn't expect anything but the most basic ES 3.0 stuff to work" while the Dolphin guys are porting down from desktop-class graphics backends.
This fits with what I've seen in other OSS projects - nVidia provides the most complete and robust implementation of OpenGL, Intel does pretty good but it's too slow and restricted to be useful anyways, and ATI/AMD basically needs you to code specifically against their device.
Isn't there something approaching the Acid2/3 test for graphics drivers? It seems like the popularity of the Acid tests was beneficial for the browser ecosystem...it was an easy and standard thing to point to.
The Khronos Group has OpenGL and OpenGL ES conformance tests. Unfortunately, they are not public and you have to be a member to access them.[1] The only public OpenGL test suite that I know of is piglit.[2]
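To give a rough idea, a piglit-style test usually boils down to rendering something trivial and checking the result against what the spec requires, along these lines (a sketch; context/window setup is omitted and the tolerance is my own assumption):

    #include <GLES3/gl3.h>

    // Sketch of a typical driver regression test: clear to a known color,
    // read back a pixel, and compare against what the spec says should come out.
    bool TestClearColorExact() {
        glClearColor(0.25f, 0.5f, 0.75f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        unsigned char px[4] = {0, 0, 0, 0};
        glReadPixels(0, 0, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, px);

        // 0.25 / 0.5 / 0.75 / 1.0 should come back as roughly 64 / 128 / 191 / 255
        // in an 8-bit framebuffer (allow +/-1 for rounding differences).
        auto close_enough = [](int a, int b) { return (a > b ? a - b : b - a) <= 1; };
        return close_enough(px[0], 64) && close_enough(px[1], 128) &&
               close_enough(px[2], 191) && close_enough(px[3], 255);
    }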
Mali has a different architecture than desktop GPUs, so glClear generally isn't implemented as a clear op (or drawing two triangles that cover the viewport, etc). It clears out all of the commands that were previously scheduled for that target.
But yeah, the Mali driver has various performance quirks and oddities (there used to be a lot of weird stuff around how you updated index buffers too).
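For what it's worth, the frame pattern that tends to suit tile-based drivers like Mali's looks roughly like this (a sketch, assuming an ES 3.0 context with an application-created FBO bound):

    #include <GLES3/gl3.h>

    // Sketch of one frame on a tile-based GPU (ES 3.0, application FBO bound).
    void DrawFrame() {
        // A full-target clear up front lets the driver throw away pending work
        // for this target and skip loading old contents into tile memory.
        glClearColor(0.0f, 0.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        // ... issue the frame's draw calls here ...

        // Tell the driver we don't need depth after the frame, so it can skip
        // writing it back out of tile memory.
        const GLenum discard[] = { GL_DEPTH_ATTACHMENT };
        glInvalidateFramebuffer(GL_FRAMEBUFFER, 1, discard);
    }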
Firefox allows you to switch your WebGL renderer from the default ANGLE to the native OpenGL drivers with the webgl.prefer-native-gl about:config parameter.
You can then run the WebGL Conformance Tests[1]. I found that my integrated Intel did better with ANGLE and my Nvidia card did better with native drivers. Epic Citadel WebGL was also slightly faster with native Nvidia.
This really puts a damper on my excitement for the Nexus 5 (or whatever it will be called). I hear that it will be running an Adreno GPU. On the other hand, I guess it is going to be rough no matter what option you pick in the mobile world.