The Evolution of Direct3D (alexstjohn.com)
127 points by mwilcox on July 22, 2013 | 34 comments



Fascinating article, especially all the political reasons why Microsoft introduced Direct3D in the first place.

"Direct3D was not an alternative to OpenGL [...] it was a designed to create a competitive market for 3D hardware"—and how it ultimately failed, with Direct3D becoming like OpenGL, more confusing, and paving way for the alluring simplicity of developing for a single hardware specimen—the Xbox.


Really interesting post which I enjoyed immensely, and a good point about being able to create custom renderers with DirectCompute/OpenCL and skip all the legacy baggage while harnessing the ultimate powah of modern GPUs. (Reckon we'll see a bit of this sort of stuff in some PS4 games later on in the console generation.)

Not a new idea, but one I'm glad to be reminded of.

I'm totally going to put some time into playing with this. Anyone here already doing something cool in the custom GPU renderer space?


I know multiple devs on PS3 were using the SPUs to rasterize low resolution depth buffers, so that they could then do occlusion culling without hitting the GPU at all. I expect we'll see similar approaches using GPU compute in the future, assuming people can get the latency down low enough so that it still beats native GPU occlusion queries.
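
For a rough picture of the query half of that technique, here is a minimal C++ sketch (all names and conventions are invented, not any engine's actual code): the low-resolution depth buffer has already been filled with conservative occluder depths somewhere cheap, and each object's screen-space bounds get tested against it before the real draw call is issued.

    // Hypothetical query side of software occlusion culling: a low-resolution
    // depth buffer has already been rasterized (on SPUs, or in a compute pass),
    // with 0.0 = near plane and 1.0 = far plane. The screen-space rect is
    // assumed to be clamped to the buffer bounds already.
    #include <vector>

    struct DepthBuffer {
        int width, height;
        std::vector<float> depth;                       // one float per texel
        float At(int x, int y) const { return depth[y * width + x]; }
    };

    // Returns true if the object might be visible, false if it is definitely
    // hidden behind the previously rasterized occluders.
    bool MightBeVisible(const DepthBuffer& db,
                        int minX, int minY, int maxX, int maxY,  // screen rect
                        float nearestZ)                          // object's closest depth
    {
        for (int y = minY; y <= maxY; ++y)
            for (int x = minX; x <= maxX; ++x)
                if (nearestZ <= db.At(x, y))    // an occluder texel is at least
                    return true;                // as far away: might be visible
        return false;                           // occluders are closer everywhere: cull
    }

The rasterization side is the interesting part performance-wise, but the point is that the whole test runs without touching the GPU's query machinery.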


Several titles probably did it (I only know of one for sure), but that doesn't mean it's a good technique.

You see, on the PS3 GPU there are no pixel shader constant registers. This means you need to patch the shader code if you want to change anything in your pixel shader. A naive approach, patching on the CPU when submitting every primitive, is horrendously slow because the CPU sucks at moving large amounts of data. A slightly faster approach, using the GPU DMA, is still unbelievably slow, because even though the GPU moves large amounts of data very quickly, it takes a lot of time to switch from pushing triangles to copying memory ranges and back. Luckily, the PS3 also has SPUs, which are very good at moving large amounts of data and pay no penalty for doing so. Patching shaders on the SPUs is so fast you forget about it, and it's also very easy to write.
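
To make "patching" concrete, here is a purely illustrative sketch (the offsets and types are invented; this is not actual RSX code): with no constant registers, a pixel shader constant is just an immediate value embedded in the shader microcode, so "setting" it means writing the new floats into a copy of the program at a known offset before the shader is bound.

    // Hypothetical shader-constant patching. The expensive question on PS3 was
    // not this copy itself but who performs it (CPU, GPU DMA, or SPU) and when.
    #include <cstdint>
    #include <cstring>
    #include <vector>

    struct PatchLocation {
        size_t byteOffset;   // where the four floats live inside the microcode
    };

    void PatchConstant(std::vector<uint8_t>& microcode,
                       const PatchLocation& loc,
                       const float value[4])
    {
        std::memcpy(microcode.data() + loc.byteOffset, value, 4 * sizeof(float));
    }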

So the mind boggles when you see people using GPU patching and then, to somehow save performance, running some complicated occlusion culling scheme on the SPUs.


Wait, really? Isn't the RSX a DX9-class GPU? How does it not have any constant registers?


No version of DX mandates any specific hardware architecture. Otherwise you would not be able to run DX9 games on modern hardware that also does not have constant registers.


It could be that all the Nvidia drivers (for PC, that is) were doing constant patching with some clever caching behind the driver.


Indeed. There is a reason the draw primitive rate is so different on the PS3 and a PC with the same GPU.


AFAIR, "native" occlusion queries and conditional rendering are so prohibitively slow in D3D today that nobody uses them anyway.


I remember reading some references about GPU voxel engines and raymarching in the demoscene, not sure about real games.


As someone who has felt mostly sour grapes toward Microsoft for introducing D3D in the first place instead of adopting OpenGL, it's nice to read a convincing rationalization for why that wasn't done initially, coming from someone who was involved in the process.


The 'caps bits' problem is really the core of it. OpenGL in that era was a nightmare for anything resembling game rendering. The OpenGL we have now is pretty reasonable, but back then, Direct3D was a breath of fresh air and DirectDraw was a much saner way to push pixels around efficiently as well. It was hard for me to ever understand why people preferred OpenGL at the time (other than the obvious benefit of theoretical portability). Vendor-specific shader bytecode, UGH.


It was hard for me to ever understand why people preferred OpenGL at the time

OpenGL was elegant and very simple to use, quickly becoming close to invisible. DirectX, in comparison, was layers upon layers of COM book-keeping code.

Of course OpenGL has become like DirectX in more recent iterations, as in the end immediate satisfaction is less important than flexibility.


For me the definition of 'layers' was having to juggle dozens of interacting state flags and mutually exclusive vendor-specific extensions just to draw a dual-textured triangle or render to a render target.

COM really wasn't that much of a hassle in comparison. A couple of smart pointer templates and you're off to the races. I can see how a C developer would really resent it though - nothing but a pain compared to GL's regular C, 'everything is void*' API.


I guess it depends upon what period we're talking about here. DirectX didn't add multi-texturing until DX 6.1, half a decade after hitting the market.


This is how I got to understand that not all that Microsoft does is evil.

When one works for corporations of a similar size, one learns that there are many more factors at play than just good or evil deeds.


Another interesting read about Direct3D and OpenGL: http://programmers.stackexchange.com/a/88055/51669


Absolutely fantastic article, barring one point:

Dear author, you being the one who chose the coordinate system of D3D: thanks a lot. >:-(


Is it more than a single line of code in a well-designed engine to flip the Z?


It's not a gigantic deal, but it can bubble up and fuck things in your content pipeline, in shaders, in physics, in other places. It's just an annoying arbitrary thing you have to remember and occasionally you run up against it (especially if you're writing engine code instead of just using something off-the-shelf). It also can screw up math and memory layouts when talking with other libraries.
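
As a small illustration of how it bubbles up (names are made up, not from any particular engine): negating Z when importing right-handed content is indeed one line, but mirroring an axis also reverses triangle winding, so the cull mode, and anything else that silently assumes a handedness, has to follow.

    struct Vec3 { float x, y, z; };

    // The "one line" that flips the axis when importing right-handed content...
    Vec3 ToLeftHanded(Vec3 v) {
        v.z = -v.z;
        return v;
    }

    // ...but mirroring one axis turns counter-clockwise triangles into clockwise
    // ones, so either the index order or the culling convention must flip too.
    enum class Winding { ClockWise, CounterClockWise };

    Winding CullWindingAfterFlip(Winding original) {
        return original == Winding::ClockWise ? Winding::CounterClockWise
                                              : Winding::ClockWise;
    }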

At least with endian issues, for example, there was once maybe a compelling reason to do it from a hardware standpoint.


If the handedness decision really was "arbitrary" and OpenGL had set a precedent, why select the exact opposite handedness for a brand new system? Given Microsoft's vicious competitiveness in the 1990s, I can't help but think that they wanted to make code portability much more difficult for software developers. Is it really a surprise that "all other graphics authoring tools adopted the right handed coordinate system standard to OpenGL"?


The article says he chose left-handed out of personal preference. Not everything by Microsoft is a sinister conspiracy, sometimes it's a garden variety screwup. They were probably merely insular enough to ignore the rest of the industry, rather than arrogant enough to deliberately subvert it.

There is some logic to left-handedness, which is in some ways more intuitive for computer graphics. Left-handed means that the Z coordinate increases with depth into the screen: the viewer is somewhere near Z = 0, looking towards positive numbers. And projection space has 0.0 at the near plane of the view frustum and 1.0 at the far end. Right-handed means that either your projection space coordinates go negative or your projection matrix includes a negation for the Z coordinate.

Left-handed does however have the enormous disadvantage of working against almost all (non-computer screen) representations of 3D space. Draw your X and Y axes on a sheet of paper in their customary orientation. It's much more intuitive to interpret positive Z as altitude above the paper rather than as depth into your desk. That's right-handed, and that's why OpenGL and everyone else chose that.

Answering a different parent, it's more than just one line of code in a library to change coordinate systems. The depth buffer check needs to compare in the opposite direction, for one.
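
For a concrete picture, here are just the Z-related terms of a D3D-style perspective projection (depth mapped to [0, 1]) under each convention; this is plain math in C++, no particular API assumed, and the right-handed version is where the sign flips show up.

    struct ProjZTerms {
        float zz;   // scales view-space z into [0, 1] depth
        float zw;   // translation term feeding the depth value
        float w;    // what ends up in clip-space w
    };

    ProjZTerms LeftHandedZ(float zn, float zf) {
        return { zf / (zf - zn), -zn * zf / (zf - zn), 1.0f };
    }

    ProjZTerms RightHandedZ(float zn, float zf) {
        // Same [0, 1] mapping, but view-space z is negative in front of the
        // camera, hence the negated scale and the -1 going into w.
        return { zf / (zn - zf), zn * zf / (zn - zf), -1.0f };
    }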

Incidentally, Microsoft has learned from this mistake: XNA uses right-handed coordinates.


XNA is also dead as of Windows 8.

http://en.wikipedia.org/wiki/Microsoft_XNA


According to Microsoft at BUILD 2013, Unity is the way forward.

XNA and its former incarnation, Managed DirectX, always suffered from the internal political differences between .NET and native tools development groups.


Now, what if Microsoft purchased Unity Technologies?

Wouldn't that be a thing?


How many versions would it take before it targeted Microsoft systems only?


The last sentence in the article:

>> I personally never imagined that my early work on Direct3D would, within a couple decades, contribute to the evolution of a new kind of ubiquitous processor that enabled the kind of incredibly realistic and general modeling of light and physics that I had learned in the 1980s but never believed I would see computers powerful enough to model in real-time during my active career.

I concur.

I'm much older than Alex St John.

Coding for 3D has never ever been as fun as it is now...


So what I'm missing is the Fahrenheit project, which I got into. MS and SGI were cooperating on "Fahrenheit", a high-level scene graph SDK/API. It never got released, and eventually both parties bailed.


Fahrenheit was a horrible Microsoft API, more of a scene graph than a low-level API. There were photos (probably still somewhere on reality.sgiweb.org) of it all being burned. Burn API burn!


The book Renegades of the Empire ( http://amzn.com/0609604163?tag=hn2013-20 [referral]) documents the history of DirectX within Microsoft.

And Wild Tangent is one of the most annoying/useless pre-loaded crapware vendors -- I've never seen their software installed except on a brand new PC.


Honestly, between WildTangent and the sheer hubris and sociopathy on display in most of St. John's interviews (and the fact that he just posts hundreds of private emails on the internet), I don't have a very high opinion of him anymore. It's a shame, because these stories are super interesting - the posts about Talisman describe stuff I didn't even know happened. He talks about mocking colleagues during their presentations and actively doing really showy, rude stuff to execs...


I still remember the Pax Romana party. And the rumors of the Mothership (designed by Giger?)


[1] This October '97 article by Alex St. John in Boot Magazine talks about the origins of DirectX and about the H.R. Giger ship.

[1] http://www.maximumpc.com/article/features/old_school_monday_...


Good times.



