We could have had OpenGL, but Microsoft wanted their own so they made Direct3D, and ATI and Intel were unable to make both Direct3D and OpenGL work properly (most people targeted Direct3D anyway, so why bother?). After SGI bankrupted themselves, Khronos was formed and decided to define a new OpenGL from scratch to help ATI and Intel claim they have OpenGL (they pay money, after all), so they built the core profile, but ATI-now-AMD and Intel failed to make that work too (people still targeted Direct3D, so why bother?). To help them again, Khronos decided to ignore OpenGL and make Vulkan, which kinda seems to work for now, but AMD and Intel took their time producing certified drivers and Vulkan doesn't look that popular (though Direct3D 12, which is essentially the same thing, isn't that popular either - the alternative tends to be Direct3D 11, so there's still no reason to bother fixing OpenGL). So who knows how long this will work?
Apple was on the OpenGL train initially, but like their Java and X11 support, that was just so they could get easy ports of important stuff; once they got a sniff of popularity they ditched anything non-Apple, because why bother maintaining something others control?
That is my interpretation of the story so far anyway. And I didn't mention OpenGL ES, which is OpenGL in name only but not in anything that really matters - but thanks to iOS and Android it became popular, even though both of those platforms had more than enough power to handle the proper thing instead of the cut-down version originally designed for J2ME feature phones that couldn't even do floating point math.
On the positive side, OpenGL is still the most widely available API, and of all the APIs it's the one with the most reimplementations on top of other APIs - though in both cases you'll want to stick to older versions. But TBH versions were always a fluid thing with OpenGL anyway; you're not supposed to think in versions but in extensions.
> Microsoft wanted their own so they made Direct3D
Not quite, it was somewhat of a necessity at that point. 3D graphics was in an abysmal state back in the mid to late 90s. Hardware vendors shipped proprietary APIs (Glide), half-assed OGL ports just for Quake (MiniGL), but very few offered full OpenGL implementations.
In order to migrate gaming from DOS to Windows, Microsoft was in dire need of a reliable and widely available API that they could ship with the OS for game developers to use.
OpenGL wasn't exactly great, since it wasn't controlled by MS and the whole extension system was a huge mess and horrible to work with (I don't care what Carmack thought about it!)
Direct3D on the other hand offered a stable API and most importantly a REFERENCE RENDERER! - something that OpenGL was lacking and led to super annoying bugs, since every other driver behaved differently...
The latter is still relevant today - OpenGL lacks a defined reference implementation, so "OpenGL support" on the box means very little in practice. This is why certain software packages require certified drivers: CAD vendors would never be able to ship a finished product if they had to support every single quirk of every vendor, hardware or driver revision...
When Direct3D was introduced there was no Glide nor MiniGL, and OpenGL provided more than enough of the functionality games would need at the time. Microsoft was in control of their OpenGL implementation, which allowed both a full replacement (what drivers do nowadays) and a partial replacement where a driver would only implement a tiny subset of the API and the rest (e.g. transformation, clipping, lighting, etc.) would be handled by Microsoft's code.
> In order to migrate gaming from DOS to Windows, Microsoft was in dire need of a reliable and widely available API that they could ship with the OS for game developers to use.
Yes, the rest of DirectX provided that and OpenGL games used it too.
> OpenGL wasn't exactly great, since it wasn't controlled by MS
Which was the only real problem for Microsoft, not anything else
> and the whole extension system was a huge mess
During the mid to late 90s there were barely any extensions, and OpenGL 1.1 provided more than enough functionality for the games of the time. The main extension that would be needed during the very late 90s was multitexturing, which amounted to importing a few function pointers - nothing "messy".
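For reference, this is roughly all the "mess" amounted to - a sketch of loading GL_ARB_multitexture on Windows (the constant and typedefs are written out by hand here; normally they come from glext.h):

    #include <windows.h>
    #include <GL/gl.h>

    // Value/typedefs per the ARB_multitexture spec (usually provided by glext.h).
    #define GL_TEXTURE0_ARB 0x84C0
    typedef void (APIENTRY *PFNGLACTIVETEXTUREARBPROC)(GLenum texture);
    typedef void (APIENTRY *PFNGLMULTITEXCOORD2FARBPROC)(GLenum target, GLfloat s, GLfloat t);

    PFNGLACTIVETEXTUREARBPROC   glActiveTextureARB   = nullptr;
    PFNGLMULTITEXCOORD2FARBPROC glMultiTexCoord2fARB = nullptr;

    // Call once with a current GL context; returns false if the driver lacks the extension.
    bool LoadMultitexture()
    {
        glActiveTextureARB   = (PFNGLACTIVETEXTUREARBPROC)wglGetProcAddress("glActiveTextureARB");
        glMultiTexCoord2fARB = (PFNGLMULTITEXCOORD2FARBPROC)wglGetProcAddress("glMultiTexCoord2fARB");
        return glActiveTextureARB && glMultiTexCoord2fARB;
    }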
> and horrible to work with
Compared to early Direct3D, OpenGL was much easier to work with - early Direct3D required you to build execute buffers, manage texture loss yourself and deal with other nonsense, whereas OpenGL let you essentially say "use this texture, draw these triangles". This was such a big usability issue for Direct3D that Microsoft eventually added similar functionality in versions 5 and 6 and killed execute buffers pretty much instantly. Even then OpenGL still provided more functionality that drivers could take advantage of as new capabilities appeared in GPUs (e.g. Direct3D 7 introduced hardware transformation and lighting, but OpenGL essentially had this from day one, so all games that used OpenGL got T&L for free when drivers added support, whereas games that used Direct3D had to explicitly enable it).
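To illustrate the "use this texture, draw these triangles" point, a minimal fixed-function sketch (the texture object passed in is assumed to have been created and uploaded elsewhere):

    #include <GL/gl.h>

    void DrawLitTriangle(GLuint tex)
    {
        // Fixed-function T&L: the driver (and later the GPU) handles transform and lighting.
        glEnable(GL_LIGHTING);
        glEnable(GL_LIGHT0);
        const GLfloat lightPos[4] = { 0.0f, 1.0f, 1.0f, 0.0f };
        glLightfv(GL_LIGHT0, GL_POSITION, lightPos);

        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, tex);   // "use this texture"

        glBegin(GL_TRIANGLES);               // "draw these triangles"
        glNormal3f(0.0f, 0.0f, 1.0f);
        glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
        glTexCoord2f(0.5f, 1.0f); glVertex3f( 0.0f,  1.0f, 0.0f);
        glEnd();
    }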
> Direct3D on the other hand offered a stable API and most importantly a REFERENCE RENDERER! - something that OpenGL was lacking and led to super annoying bugs
This is wrong, Microsoft had a software rasterizer for OpenGL 1.1 that behaved very close to the spec and SGI had also released their own software rasterizer.
> since every other driver behaved differently...
This was the case with Direct3D too, and in fact a much more painful experience. Direct3D tried to alleviate it by introducing capability flags, but in practice no game made proper use of them, and games had all sorts of bugs and issues (e.g. DF Retro had a video where they tested a bunch of 90s 3D cards on Direct3D games and pretty much all of them had different visual glitches).
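The mechanism looked roughly like this - I don't remember the 90s-era caps structs exactly, so this sketch uses the later D3D9 names purely to show the kind of branching games were supposed to do (and mostly didn't):

    #include <d3d9.h>

    // Query the HAL device caps and decide which features the renderer can rely on.
    bool CheckTextureCaps()
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return false;

        D3DCAPS9 caps = {};
        bool usable = false;
        if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        {
            // Each flag is another code path the game has to handle (or silently glitch on).
            const bool pow2Only   = (caps.TextureCaps & D3DPTEXTURECAPS_POW2) != 0;
            const bool squareOnly = (caps.TextureCaps & D3DPTEXTURECAPS_SQUAREONLY) != 0;
            usable = !pow2Only && !squareOnly && caps.MaxSimultaneousTextures >= 2;
        }
        d3d->Release();
        return usable;
    }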
> OpenGL lacks a defined reference implementation, so "OpenGL support" on the box means very little in practice
This is an issue indeed, though it is largely a problem with driver developers not caring to provide consistent behavior rather than a problem with the API. If driver developers cared, they'd try to do things the same way as other drivers whenever a difference was spotted between implementations.
Though that is a modern issue, for pretty much the entirety of the 90s and early 2000s there were official software rasterizers from both Microsoft and SGI.
> When Direct3D was introduced there was no Glide nor MiniGL
You must be from a different universe: MiniGL was released in 1996 - the very same year Direct3D 4.0 and Direct3D 5.0 shipped... As for Glide - that started also in 1996 and was commonly used until 3dfx went defunct.
> During the mid to late 90s there were barely any extensions and OpenGL 1.1
Again - in which timeline was that the case? Certainly not in this one: in 1996 (!!!) there were about 90(!!!) vendor-specific extensions [1]. This is not a question of whether you in particular were aware of them or found them useful; they did have use cases and were supported across vendors, sometimes with varying levels of support...
> Microsoft had a software rasterizer for OpenGL 1.1 that behaved very close to the spec and SGI had also released their own software rasterizer.
Neither of those were references that you could reliably run pixel-level A/B tests against to verify your drivers.
There never was an official reference implementation and there probably won't be any either.
> The main extension that would be needed during the very late 90s was multitexturing
Unless you were porting software from other systems like SGI workstations, which I did at the time. And believe me - it wasn't fun, and having half a dozen code paths to work around it depending on the target hardware wasn't "clean" either.
I won't comment on your "which API is better"-drivel since your arguments didn't age well anyway. We're back to execution buffers and manual (texture-) managing for performance reasons so I could just as well argue that early Direct3D was actually ahead of its time... But that's a matter of opinion and not a technical issue.
> You must be from a different universe: MiniGL was released in 1996 - the very same year Direct3D 4.0 and Direct3D 5.0 shipped... As for Glide - that started also in 1996 and was commonly used until 3dfx went defunct.
Only the year is the same, but not the dates. Direct3D was introduced in DirectX 2.0 on June 2[0]. Voodoo 1, for which Glide and MiniGL were made, was released after Direct3D, on October 7[1].
It would be impossible for Microsoft to make Direct3D as an answer to APIs like MiniGL since MiniGL didn't exist at the time the first release of Direct3D was made!
> Again - in which timeline was that the case? Certainly not in this one: in 1996 (!!!) there were about 90(!!!) vendor-specific extensions [1]
I'm not sure what you're referring to with "[1]"; there isn't any date information in there. Regardless, from [2] (which is from 2000, when there were many more extensions than in the mid-90s) you can easily see that the vast majority of extensions are for hardware that is irrelevant to desktop PCs running Windows (e.g. the SGI-specific and GLX stuff).
In addition, new OpenGL versions are essentially bundles of previous extensions, so a lot of these extensions are functionality you got with 1.1 (e.g. GL_EXT_texture_object is basically the ability to create texture objects, which started as an extension for OpenGL 1.0 and was made part of the core API - and available to anyone with OpenGL on Windows - in version 1.1).
Of all the extensions listed even in the 2000 list, only a handful would be relevant to desktop PCs - especially for gaming - and several of them (e.g. Nvidia's extensions) wouldn't have been available in the mid-90s.
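To make the "extensions get folded into core" point concrete, a sketch of how texture object support was (and still is) detected - on 1.1+ it's just there, on a bare 1.0 driver you'd look for the EXT variant instead:

    #include <GL/gl.h>
    #include <cstring>

    bool HasTextureObjects()
    {
        // GL 1.1+ has glGenTextures/glBindTexture in core.
        const char* version = reinterpret_cast<const char*>(glGetString(GL_VERSION));
        if (version && std::strncmp(version, "1.0", 3) != 0)
            return true;

        // On a 1.0 driver, fall back to the extension that 1.1 absorbed.
        const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        return exts && std::strstr(exts, "GL_EXT_texture_object") != nullptr;
    }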
> Neither of those were references that you could reliably run pixel-level A/B tests against to verify your drivers.
At the time that was irrelevant as no GPU was even able to produce the exact same output at a hardware level, let alone via APIs.
Also Direct3D didn't have a reference rasterizer until Direct3D 6, released in August 1998. The Direct3D 5 (released in 1997) software rasterizers were very limited (one didn't even support color) and were meant for performance, not as a reference.
> There never was an official reference implementation and there probably won't be any either.
That doesn't matter, Microsoft's OpenGL software rasterizer was designed to be as close as possible to what the spec described and was much more faithful to it than the software rasterizers available for Direct3D up to and including version 5.
> Unless you were porting software from other systems like SGI workstations, which I did at the time.
Yes, that could have been a problem since 3D GPUs at the time pretty much sucked for anything unrelated to fast paced gaming. But those uses were very rare and didn't affect Direct3D at all - after all, Direct3D was in an even worse state with all the caps and stuff you had to take care of that OpenGL didn't require.
> We're back to execution buffers and manual (texture-) managing for performance reasons so I could just as well argue that early Direct3D was actually ahead of its time
Yeah, and IMO these modern APIs are a PITA to work with, more than anything made before them, with the improvements not justifying the additional complexity - especially when OpenGL could have been extended to offer better performance.
> Microsoft wanted their own so they made Direct3D
Bill Gates "wanted his own" because this would limit software portability, making his near-monopoly in OSes even stronger. Good move for his bottom line, but a dick move for humanity.
The ones that keep mentioning Microsoft alone are the cynical ones, selling their FOSS agenda the best way they see fit, usually with zero experience in the games industry.
As much as games may not be able to be open source, I'm still baffled that the infrastructure still isn't. Open source infrastructure dominates the server space, I don't see why it couldn't dominate the desktop as well.
I see why it doesn't: it's mostly hardware vendors refusing to provide free drivers and refusing to hand over the specs of the hardware they sell (I mean the ISA). There may have been good reason 20 years ago, but it's been some years now that hardware tends to be mostly uniform, and could possibly stabilize its ISA. It has been done for x86 (for better or worse), it could be done for GPUs, printers, web cams, and everything else.
Hardware used to come with a user manual. Then it all stopped, around the time Windows 95 took over. Instead of a manual, they provided opaque software that worked with Windows. That has been the new tradition since, and changing it is hard. For instance it's only very recently that FPGA vendors started to gradually realise that open source toolchains could actually help their bottom line.
My dream, really, would be for hardware vendors to agree on an ISA, so we don't have to put up with drivers any more. https://caseymuratori.com/blog_0031
It dominates the server because even FOSS users who refuse to pay for tooling don't have any option other than paying subscriptions to keep their servers running, or at the very least buying hardware.
FOSS Desktop doesn't scale to keep a company running under such premises, because a large majority refuses to pay, and living off Patreon and donations only goes so far.
Which is why everyone who wants to make money with desktop FOSS software has either moved it behind a paywall served via browsers or to mobile OS stores.
From my point of view FOSS friendliness is a marketing move, where underdog companies play nice and use non-copyleft licenses, and as soon as they get rescued on the back of the positive vibes they hop back into dual licensing to keep their business rolling.
> FOSS Desktop doesn't scale to keep a company running under such premises, because a large majority refuses to pay, and living off Patreon and donations only goes so far.
Okay, how complex does an OS need to be, really? Let's see, it needs to schedule and run your programs, interface to the hardware, manage permissions… that's about it. Why would it need to scale? What's so impossibly complex about an OS that it couldn't be done by 5 highly competent engineers in 2 years?
Oh, right, the hardware. With the exception of CPUs, hardware vendors don't publish their specs, and don't agree on a set of interfaces. So you end up having to write a gazillion drivers, dozens of millions of lines of code, just so you can talk to the hardware.
Solve that problem, and OSes won't need to scale. Perhaps even to the point that game devs will be able to ship their own custom OS with their games. As was done in the 80s and early 90s.
Having a stable driver ABI and being micro-kernel based helps with scaling, which, fun fact, is what the PlayStation does with its heavily customised FreeBSD, and the Switch with its in-house OS.
As for portable specs, if the Open Group and Khronos have taught us anything, it is that there is a big difference between paper and real hardware/platforms.
Yep, we shipped our custom OSes, which also had our custom workarounds for faulty undocumented firmware bugs - those we occasionally took advantage of for demoscene events.
> As for portable specs, if the Open Group and Khronos have taught us anything, it is that there is a big difference between paper and real hardware/platforms.
But… they don't even specify ISAs, they specify APIs. I'd wager the big difference is only natural. Another way would be for a vendor to design their ISA on their own, then make it public. If a public ISA gives them an advantage (and I think it could), others would be forced to follow suit. No more unrealistic consortium. :-)
> faulty undocumented firmware bugs
I hope that today, any hardware bug would trigger an expensive recall, and firmware bugs would just be embarrassing. CPUs today do have bugs, but not that many. We could generalise that to the rest of the hardware.
> That is my interpretation of the story so far anyway. And I didn't mention OpenGL ES, which is OpenGL in name only but not in anything that really matters
You might be mistaking OpenGL ES 1.0 for anything modern.
ES 2.0 and above is a true subset of desktop OpenGL; some limits are lower and support for things like geometry shaders is optional, but that's pretty much it.
OpenGL and OpenGL ES are two completely different APIs with their own specs and implementations. Some versions do have an overlap in functionality in that you can write code that can work with both with minimal (mainly shader) differences, but that's about it.
But IMO OpenGL ES 2.0 was pointless, the devices that were powerful enough to support it were also powerful enough to support the full OpenGL so Khronos should have pushed that instead of fragmenting the driver and API ecosystem.
No, really. 4.3 made ES a true subset, as in you cannot write spec-conformant ES 3.0+ software that would not run on a GL 4.3+ implementation.
This was very intentional by Khronos so they could bring the two closer together. In the ES 2.0 days what you said would have been true, as ES 2.0 had some annoying differences, especially on the shader side, but it's been 8 years since 4.3 came out.
ES3 shaders (with the modern in/out qualifiers) compile as-is on GL 4.3+. It is a true subset. As an example, 4.3 brought precision qualifiers to desktop GL, and now that fp16 is in desktop HW they are actually useful there.
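A quick illustration: a fragment shader written against the ES 3.0 subset like the one below compiles unchanged on a GLES 3.0 context and, thanks to ARB_ES3_compatibility being core in 4.3, on a desktop GL 4.3+ context too (the uniform and varying names here are made up):

    // Passed verbatim to glShaderSource/glCompileShader on either context.
    const char* kFragSrc = R"(#version 300 es
    precision mediump float;      // precision qualifiers, useful again now that fp16 HW exists

    in vec2 vUV;                  // modern in/out qualifiers instead of varying
    out vec4 fragColor;

    uniform sampler2D uColor;

    void main()
    {
        fragColor = texture(uColor, vUV);
    }
    )";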
I also recall the early days of D3D (immediate mode): although it came after OpenGL, immediate mode allowed better integration with the cards at the time (notably 3dfx), and OpenGL did not have hardware drivers, which meant it was limited to software rendering. So if you were into game dev, D3D was your only option early on.
My recollection was that it wasn't until Nvidia started up (and broke 3dfx by poaching engineers) that OpenGL started to become 'better'. Intel was left in the dust until mobo support for DMA (?around DX5?), which allowed cards to gain quick access to RAM - vital for texturing (before that you always had to 'upload' textures to the card itself). It was the final nail in the coffin for 3dfx at that point, who still hadn't released a new card for ages, and OpenGL was finally on par with D3D. D3D had a retained mode which began to be really useful by about that time too.
At the time, many people wanted to use OpenGL because it was loads easier than Immediate Mode and a lot more intuitive to grok. I recall a certain prominent Doom developer (John Carmack) berating D3D loudly on a private email list about this very fact. Ironically, a few months later some guys released a demo for a game called "Unreal" using D3D and everyone was blown away (circa 1995-6). More ironically, it wasn't for another year that GLQuake came to fruition.
Carmack loved OpenGL because GPU vendors could (and would) release proprietary extensions that exposed all the new functionality of new GPUs.
He would rewrite custom rendering paths for various GPUs and common sets of extensions, allowing him to improve performance and/or improve graphics.
With Direct3D, Microsoft defines a common feature set that all GPUs supporting that version of Direct3D are required to support, and any extra functionality that GPUs might provide on top of that is locked away, completely inaccessible.
Checking the Doom 3 source code, he has the main ARB and ARB2 pixel shader paths (equivalent to DX8 shaders). Then for the older GPUs that mostly support pixel shaders, but not in a standards-compliant way, he has an NV10 code path and an NV20 code path.
Then he has an R200 code path, which I think just improves performance on R200 graphics cards over regular ARB2.
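For anyone curious what that looks like in practice, a rough sketch of the selection logic (the enum, the ordering and the helper are mine, but the extension names are the real ones those paths depended on):

    #include <GL/gl.h>
    #include <cstring>

    enum class RenderPath { ARB, NV10, NV20, R200, ARB2 };

    static bool Has(const char* exts, const char* name)
    {
        // Substring match is good enough for a sketch; real code should match whole tokens.
        return exts && std::strstr(exts, name) != nullptr;
    }

    RenderPath PickRenderPath()
    {
        const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));

        if (Has(exts, "GL_ARB_vertex_program") && Has(exts, "GL_ARB_fragment_program"))
            return RenderPath::ARB2;                       // standards-compliant shader path
        if (Has(exts, "GL_ATI_fragment_shader"))
            return RenderPath::R200;                       // vendor-specific path for R200
        if (Has(exts, "GL_NV_register_combiners2"))
            return RenderPath::NV20;                       // NV2x register combiners
        if (Has(exts, "GL_NV_register_combiners"))
            return RenderPath::NV10;
        return RenderPath::ARB;                            // lowest common denominator
    }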
Extensions are a good thing since they allow developers to take advantage of new functionality and provide it to consumers pretty much immediately - this is a win-win for everyone involved: programmers get to use cutting edge functionality and consumers actually get to use the fancy GPUs they paid money for.
Direct3D programmers disliking extensions makes me think of the sour grapes fable.
But OpenGL providing extensions doesn't mean that Direct3D programmers were free of having to implement different code paths - if anything, during the 90s, with the proliferation of 3D cards and different capability flags, programmers had to take a lot of different possibilities into account in their rendering code (and most failed to do that, with games having visual glitches everywhere).
I don't like the term "poaching". It implies that the engineers who took better paying jobs did something wrong, whereas they were just trying to retain a bigger portion of the enormous value that they were creating.