The reasons for not open-sourcing the drivers don't make sense to me. Sure, the community might not have the knowledge necessary to improve your drivers, but then no harm done. Drivers might be taken in directions you don't want them to go? These companies release new chips multiple times per year. With them come new drivers. Who cares what directions the old drivers take?
Furthermore, chip companies might just get free driver debugging. That makes their chips look better, which makes them look better. More sales.
Is there any possibility that the real reason for the lack of open-source drivers is that in many cases "different" graphics chips are essentially the same, and what you're really paying for with the more expensive ones is a driver that enables high-end functionality? Open-sourcing would wreck that.
I agree with you that GPU drivers should be open sourced. In fact, all driver source code should be released for all operating systems. They also should ship with instruction manuals and data sheets with circuit diagrams.
I write GPU driver software (in userspace) for a living. I'd be much happier if I could push my code to GitHub for everyone to see.
However, I do understand that keeping the drivers secret can be a competitive advantage. GPUs are devices unlike any other: they're all different from vendor to vendor and model to model. The driver source code reveals crucial parts of the hardware design, and letting your adversaries know what's in your chip puts you at a disadvantage. Some reverse engineering probably takes place, but that is at a whole different level from actually seeing the source code.
OpenGL is also an issue. It's a nasty legacy API from Silicon Graphics, dating back to 1992 or so. It's not only a very crappy API for the programmer, it's also hell to implement. If you don't believe me, try looking up the texture completeness rules in the GL spec (the problem doesn't exist in Direct3D, because they don't have backward compatibility or legacy APIs to deal with).
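To give a flavor of what those completeness rules involve, here is a deliberately simplified sketch (my own illustration, not actual spec text or driver code) of one small piece of the checking: verifying that a 2D texture's mipmap chain shrinks correctly down to 1x1. The real rules also cover internal formats, filter modes, cube map faces, and a pile of legacy corner cases.

```python
def mipmap_chain_complete(levels):
    """Check one simplified completeness rule: each mipmap level must be
    half the size of the previous one (rounded down, minimum 1), and the
    chain must end at 1x1.

    `levels` is a list of (width, height) tuples, base level first.
    """
    if not levels or levels[0][0] < 1 or levels[0][1] < 1:
        return False
    w, h = levels[0]
    for lw, lh in levels[1:]:
        # Expected dimensions of the next level in the chain.
        w, h = max(w // 2, 1), max(h // 2, 1)
        if (lw, lh) != (w, h):
            return False
    return (w, h) == (1, 1)

# An 8x4 texture needs levels 8x4, 4x2, 2x1, 1x1 to be mipmap-complete.
print(mipmap_chain_complete([(8, 4), (4, 2), (2, 1), (1, 1)]))  # True
print(mipmap_chain_complete([(8, 4), (4, 2), (1, 1)]))          # False
```

And this is just one rule; multiply it by every texture target, filter combination, and legacy path, and the implementation burden becomes clear.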
So, having a working OpenGL implementation that passes the 30,000+ conformance test cases is also a competitive advantage you don't want to give away. It's worth several years of programmer effort.
However, I still think that we and everyone else should open-source our GPU drivers and OpenGL implementations and compete on who makes the most badass silicon chips.
"The reasons for not open-sourcing the drivers don't make sense to me. "
It makes total sense to me:
1- Every couple of years they can sell you a new card, because the one you have does not work with the new OS.
2- Competitors cannot look at the code, understand it, and copy it, so technology leaders like NVIDIA cannot be caught by followers without significant investment.
3- Software patent holders (patent trolls) cannot look at the code and demand exorbitant fees from them because using a buffer to paint on a computer screen is patented by them. This is very common in the USA, and the main reason companies asked me to sign an NDA before I saw their driver code.
4- People cannot unlock features of cheap cards to make them do the same things expensive cards do (it is very common to manufacture only one chip to get mass-production costs, then use a cheap hardware or software switch to deactivate features on the cheap cards).
5- Video decoding and other functionality may be outsourced to specialist companies whose contracts demand that the code stay secret.
>> 4- People cannot unlock features of cheap cards to make them do the same things expensive cards do (it is very common to manufacture only one chip to get mass-production costs, then use a cheap hardware or software switch to deactivate features on the cheap cards).
This is also sometimes done with slightly defective chips. E.g. AMD has sold some quad-cores as three-core CPUs when one of the cores has failed a quality control check. Sometimes the disabled core can be re-enabled and works fine, but there may be a risk involved.
I know this is also sometimes done for marketing reasons. Some years ago, IBM was selling additional Java or DB "accelerator" chips for high-end servers. They were in fact the same kind of processor the server shipped with, but they were crippled with microcode so they could only run the JVM or a DB2 server or something. I very much dislike this practice; talk about wasted engineering effort.
These companies release new chips multiple times per year. With them come new drivers.
Not really. Because different GPU generations tend to be similar, vendors tend to ship "unified" drivers that support multiple GPU models. This leads to problems if the community has diverged from your last code drop: now merging is really expensive, and you'll get blamed if you don't do it (see Android).
Then there are additional problems because the vendors want to use the same driver core on Windows and Linux, but they definitely don't want to GPL the Windows version so they can't accept any GPL patches from Linux people, etc. (see Broadcom)
they definitely don't want to GPL the Windows version
Why not? Do the drivers provide them with a competitive advantage over anybody else? In the case of a GPU, I imagine they might. In the case of a wireless card, modem or something of that sort it seems pretty unlikely.
An example problem of proprietary drivers: Nvidia doesn't support the latest KMS/GEM APIs which are needed for smooth-looking boot and (I guess) for future stuff like Wayland. If the driver was open, the community would have ported it.
I disagree with "needed". It's entirely possible to have a modern, "smooth-looking boot" now even with the nVidia drivers as they are.
Wayland support is a valid point though.
Regardless, nVidia remains at the top of the pile in terms of performance and support on Linux. Especially for high-end 3D applications like Maya, CUDA, etc.
nVidia also has fundamental disagreements about the right graphics architecture for the system with many of the Linux folk. I personally think nVidia's concerns about that are valid.
It's unclear whether GEM, etc. are the right long-term approach.
Saying that the nVidia drivers are "at the top of the pile" doesn't say much for the competition.
Just a few of the problems that I have had:
1) In a 2-monitor setup there is no way to rotate one of the monitors into portrait mode without rotating the other, unless you create them as 2 separate X sessions.
2) Fullscreen almost any OpenGL app/game and it will span across both monitors, which just looks weird.
3) A very odd issue where, on my second monitor, the fonts will randomly screw up, or parts of a window randomly become transparent and I can see the window behind it (seems to happen mostly with Flash). The only way to fix it is to disable and then re-enable the monitor.
Yes, I used to have an ATI card and I can confirm that it had just as many if not more issues.
>> 2) Fullscreen almost any OpenGL app/game and it will span across both monitors, which just looks weird.
This is because the games often use a library like SDL to do the windowing. Their idea of opening a window is called "SetVideoMode", so no wonder it goes wrong.
If software uses X11 windowing the way most apps do these days, using extended window manager hints for fullscreen and so on, this problem would not exist. By the way, I use my window manager's fullscreen feature to do this. It works a whole lot better for games that allow resizable windows (this often excludes SDL-based games, because you explicitly have to tell SDL you want a resizable window) than letting the game set the fullscreen mode itself (usually via some horrible deprecated X11 legacy API).
The nVidia display driver is not perfect either, it doesn't support Xrandr yet, so expect problems with multiple displays (and rotation) until the situation is corrected.
One of the reasons I always shop for PCs with integrated Intel graphics is because things like multiple-monitors and rotation generally work out of the box, with no configuration needed beyond opening the standard GNOME Monitor Config tool.
Providing documentation sufficient to write an open source driver would not prevent nvidia from also providing a binary driver that does things the "right" way.
Unfortunately I don't think this will ever happen, since from what I know, most of the nvidia binary driver is basically firmware to run the "dumb" multi-core GPU hardware. Giving this away would provide valuable knowledge to their competition.
Hmmm. I'm not shedding any tears over the lack of graphical boot. I like to watch the messages.
It's not just about having or not having boot messages, which are still there in framebuffer mode if you want them. In fact, with a graphical console, you can see lots more messages at once, at the perfectly crisp native resolution of your screen, with large fonts if you like.
A major part of the issue is video mode switches causing ugly flickers, with many LCD monitors taking ages to resync and turn the backlight on after a mode change. You can miss a lot of boot messages in those precious seconds. Sometimes my monitor takes so long to sync that I don't even get to see the BIOS screen or GRUB menu.
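To put rough numbers on "lots more messages at once": the character grid is just the pixel resolution divided by the font's cell size. A quick back-of-the-envelope calculation (the framebuffer font size here is one common choice, not the only one; classic VGA text mode really is 80x25):

```python
def console_grid(screen_w, screen_h, cell_w, cell_h):
    """Columns and rows of text that fit on a character-cell console."""
    return screen_w // cell_w, screen_h // cell_h

# Classic VGA text mode: 720x400 pixels with a 9x16 character cell -> 80x25.
print(console_grid(720, 400, 9, 16))    # (80, 25)

# A 1920x1080 framebuffer console with an 8x16 font shows far more text:
print(console_grid(1920, 1080, 8, 16))  # (240, 67)
```

That's roughly eight times as many characters on screen at once, before you even consider scrollback.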
Meta rant: why do people feel the need to chime in with how much they don't want something when someone else says they do?
What I would love though would be reliable suspend/resume on an OpenGL 4.2 accelerated notebook.
Putting all drivers in the hands of the kernel subsystem maintainers would go a long way toward solving that.
with a graphical console, you can see lots more messages at once, at the perfectly crisp native resolution of your screen
That would be nice.
why do people feel the need to chime in with how much they don't want something when someone else says they do?
In this case, I'll admit I'm not a fan of splash screens in general.
I've been a teeny-tiny bit annoyed over the years at the level of effort I've seen kernel and distro developers put into graphical boot schemes when it seemed to come at the expense of more basic stuff, like, you know, working drivers for things after the boot was complete. But it's their time and energy. If they want to put it into showing a penguin holding a beer, that's their business.
To me it's usually a sign of form-over-function, or functions-I-don't-want-over-functions-I-do.
I started working with the Linux framebuffer a long time ago. The penguin boot logo was cool, but the biggest benefit was being able to code in a 128x43 (or bigger) terminal without running X. I see KMS as the long-overdue fulfillment of the dream I had then of decoupling high-end video support from X.org.
Back then I was using links2 in graphical mode, fbi for viewing images, and fbgs for reading PDFs. Since I'm working with web apps and WebGL now, I'm obviously running X.
Entirely agree. I remember my first Linux install (SuSE 6.x), seeing all of those messages fly onto the screen, first from the BIOS and then from Linux. It would go straight to a text login, then I would type "startx" to get the GUI up.
It also made it much easier to tell WTF was going on with your system, and displayed all those warning messages that I'm guessing are just hidden away now unless you happen to look at dmesg.
I believe a lot of the problem is that the major vendors believe they'll expose themselves to patent lawsuits by making it easier for aggressors to make a case.
Besides any patents, there are issues where Nvidia/AMD do not write every piece of their drivers. They can't unilaterally open-source pieces of code that they don't even own the copyright on.
Patent thickets and minefields are common in the hardware world. Perhaps you remember the lawsuits and "cross-licensing agreements" between Intel and AMD in years past?
Plus, it doesn't even have to be hard to find in the first place. Being able to grep someone's source code tends to make it a bit easier to find that obvious usage of the 'if' statement (or whatever) that you have a patent on.
IMO all drivers need to be Free Software. The problem is that as soon as you introduce loadable modules into a kernel, there's really no way to prevent binary-only ones, because of the precedent set by Linux. One of the reasons the LGPL was developed was the linking issue, but Linux developers seem to be ok with treating binary blob modules as a "gray area": http://kerneltrap.org/node/1735
If Linux developers had from the start demanded Free Software drivers, they'd have had them (please spare me the arguments about "Linux wouldn't have grown if they didn't allow binary drivers"). Letting that precedent pass means that hardware developers don't have many incentives to provide sources for their drivers. This affects not just Linux, but all Free Software operating systems.
What incentive would hardware vendors have to provide open source drivers for a tiny enthusiast operating system? No other OS has had that requirement. It's hard enough to get working binary drivers.
I don't think that all the Linux driver issues would be solved by the release of source code. Driver code already makes up a large majority of the kernel. The sheer amount of code contained in binary drivers would dilute the already relatively small number of volunteers available. The best way to increase the quality of Linux drivers is to get vendors to invest more in Linux driver development. The only way to do that is to vote with your wallets by supporting Linux-friendly hardware and increasing the size of the Linux market.
OpenBSD has always had a policy against binary drivers.
The reason it's "hard to get binary drivers" is because hardware manufacturers have no incentive to release documentation for their hardware, because they know they can get away with a binary driver for Linux. Then they call themselves "Linux friendly" and people like you get the wool pulled over your eyes (until you encounter a bug, or work on the kernel, or want to use FreeBSD, or suddenly find that your hardware isn't supported on Linux 2.6.xyz, etc, etc)
Even if they were to write Free Software drivers themselves, it's likely they wouldn't make it into Linux anyway because of quality issues (most driver code is shit). Community-developed drivers are frequently better quality.
Arguing that Free Software drivers would be "too much code" in Linux is just idiotic. Let's just stop writing Free Software altogether because it's "too much code."
I would guess that it's roughly in the same ballpark as non-GPL software being able to run on Linux. It just happens to be ring 0 rather than ring 3.
I don't think it is realistic to expect all drivers to be Free Software, especially ones distributed specifically under the GPL. A huge number of Linux drivers are GPL anyway.
What makes you so confident that Linux hackers would have managed to reverse engineer all of nVidia's stuff to the point of having a good, stable, performant driver? Especially considering nVidia releases new cards with new capabilities (thus requiring driver modifications) very frequently.
If nVidia had decided that they would release no free driver and therefore no driver at all, I don't see the business case for somebody being paid to develop a free software driver for nVidia cards, and it's certainly not a weekend project.
The problem is that all the ARM-system chipsets are shipping binary-only drivers for Android and OMAP devices, even though Linux is used almost exclusively on those devices.
That's the business case.
nVidia may not care about Linux because it's a tiny portion of their market, but chip manufacturers (like PowerVR, for example) would have had to release Free Software drivers if Linux hadn't set the binary precedent. This would also be true for a lot of other hardware now. But unless there's a radical change in policy, it's too late.
What's the point of having a Free Software OS if you can't run it on your hardware because the closed-source drivers won't work?
A missed bullet point: Ability for FOSS developers to port drivers to obscure platforms. Right now the Haiku OS folks are having a heck of a time trying to get hardware acceleration to work with nVidia hardware.
Maybe it should start with open hardware. The Open Graphics Project seems dead, but maybe there are other alternatives.
There are a lot of open hardware projects lately. Synthesizers, computers, 3D printers. All great examples. But video cards might be too complex to be open.
"...the open source driver is not very fast because there just aren't enough volunteers."
I'm sure that is part of it, but also writing a high performance/low latency GPU driver is really hard. So when you find someone who can write such drivers, they have no time to 'volunteer' because of all the money people are throwing at them.
It would be fabulous also if AMD would dedicate a resource to walking through their specs and then writing a paragraph about each register and how it was used. The original Voodoo library for the 3Dfx cards was done that way and it really helped. We need GPU documentation for someone who hasn't designed a GPU before.
Yes, but to be fair, it is more the hedonic treadmill than goalposts being moved as a bad-faith debating tactic. I'm using the ATI drivers right now, and I'd have loved to have had this quality of driver ten years ago, at the height of the "if only they'd release documentation" phase.
This comment really resonates with me. The drivers that exist today are so much better than the ones we had before. They aren't as good as the ones on Windows or MacOS, but they actually do 3D acceleration and handle multiple monitors, which they did poorly or not at all before.
So in that regard things are much better. But the grass is not only greener on the Windows side of the fence; they have better shader support to make it wave :-) One of the things I had hoped various FOSS organizations might create is the equivalent of a university "chair": a position, paid for by an endowment to the organization, to encourage some expert in the chair's area of interest to work full time on it.
NetApp supported a group at UMich for a long time who were making the NFS drivers in Linux better, with Trond Myklebust as the kernel gatekeeper. Sort of a public/private partnership. That effort worked because everything needed to succeed was pretty much available (docs, sources, examples, etc.).
I had a marketing VP at S3 (back when they were making graphics chips and not an IP holding company) tell me that they didn't release documentation because they didn't know if they were violating someone's patent, and this way even if they were, the patent holder didn't know either; it was better for everyone.
Later events have borne out the notion that patents are the center of the dispute here. Between the SGI patents, the Microsoft patents, the 3dfx and nVidia patents, and various other players, that has clearly had a tremendous chilling effect on open communication.
Of course Voodoo graphics were introduced in 1996, so in 2016 3D video card patents start expiring and by 2020 most of the DirectX 9 pipeline's concepts will be public domain. Sadly we have to wait until 2025 for shader technology to unlock sufficiently.
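The arithmetic behind those dates is just the roughly 20-year US patent term counted from filing. A hedged sketch; the filing years below are assumptions tied to the product eras mentioned above, and real expiry dates depend on exact filing dates, extensions, and pre-1995 rules:

```python
PATENT_TERM_YEARS = 20  # approximate US utility patent term, from filing date

def expiry_year(filing_year):
    """Rough expiry year for a patent filed in `filing_year`."""
    return filing_year + PATENT_TERM_YEARS

# Assumed filing years, keyed to the hardware eras mentioned above:
print(expiry_year(1996))  # 2016 -- Voodoo-era 3D card patents
print(expiry_year(2000))  # 2020 -- roughly the DirectX 9 pipeline era
print(expiry_year(2005))  # 2025 -- roughly the programmable-shader era
```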
"Even with documentation, the open source driver is not very fast because there just aren't enough volunteers."
IMHO we don't need more volunteers. We need more full-time workers here. Graphics hardware is very hard, low-reward, very low-level, and very taxing on your health (it is not healthy to be in front of a screen all day doing nothing but solving errors and watching new errors appear).
Only students are smart enough and willing to do the necessary work because they learn a lot.
But as experienced students get hired, they are not going to continue contributing, as they program enough in their real jobs. If they do, the time they dedicate is nothing compared with a pro's.
I would love to see some kind of partnership where it's not Microsoft or Apple or Oracle engaging in curriculum-capture (cf. regulatory capture) with universities, but open-source software.
I have heard good things about ATI on Linux, but perhaps the problems are only a personnel issue? Suffice it to say that if I'm going to be keeping Linux on my desktop, my next video card will be ATI, which means that my current frustrations are due to Nvidia ownership.
I couldn't complain enough about Linux with Nvidia cards... The worst part is, my card is something like 8 years old now (the old Dell is still going as strong as ever), so it's a legacy card they won't even open source; it's not like they are ever going to make money from GeForce4 drivers again...
I have a 2-year-old laptop with Nvidia. The driver installer is surprisingly smart (and not in a bad way). I have working accelerated OpenGL 3.3. It doesn't suck, as far as tainted kernel modules go.
I wouldn't expect miracles with an 8 year old card though.
Nvidia does a decent job with new GPUs when it comes to 3D and basic graphics. It did a good job with the old GPUs too, but I guess the old drivers aren't maintained much.
However, the Nvidia binary driver still lacks some features when it comes to multiple displays, etc. HDMI connect/disconnect does not work on a laptop. There's no Xrandr support yet, so hotplugging displays, screen rotation, etc. are not quite there. X has to be restarted more often than I'd like.
I've heard they're working on Xrandr and other stuff. I still think that they should be open sourced, though.