DirectX 12 (msdn.com)
121 points by kanche on March 20, 2014 | 107 comments



It seems SemiAccurate was...accurate - Microsoft just took Mantle and renamed it DX12, to call it their own "innovation":

http://semiaccurate.com/2014/03/18/microsoft-adopts-mantle-c...


And it's rumored that Mantle is heavily "inspired" by the Xbox One and PS4 APIs. So really MS is copying themselves.


They are all inspired by the hardware and by the experience of implementing content-heavy game workloads. The main goal is to reduce CPU usage and improve parallelism across CPU cores. This is achieved in several ways:

- reduce raw CPU work by increasing precompilation of state and data to native formats

- reduce communication from the CPU to the GPU, by increasing the flexibility of API calls

- reduce CPU <-> GPU dependencies by moving some forms of logic and flow into the GPU

All of them conspire to increase parallelism as well since there are in general fewer points where synchronization is needed.

PlayStation developers have a long tradition of doing these things, going back to the first model; Xbox developers not so much, but still a lot more than on PCs. Since the general goal is to reduce API surface, remove abstractions, and match the metal more closely, it's natural for the APIs to converge.
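For a concrete flavor of the first bullet (precompiling state to native formats): in the D3D12 API as it eventually shipped, all render state is declared up front in a pipeline state object, so the driver can translate it to native GPU commands once instead of re-validating it on every draw. A minimal sketch, assuming an existing device, root signature, and compiled shader blobs; error handling elided:

    #include <d3d12.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    // Build a precompiled pipeline: every piece of render state is fixed at
    // creation time, so draws that use it skip per-call driver validation.
    ComPtr<ID3D12PipelineState> BuildPipeline(ID3D12Device* device,
                                              ID3D12RootSignature* rootSig,
                                              D3D12_SHADER_BYTECODE vs,
                                              D3D12_SHADER_BYTECODE ps,
                                              D3D12_INPUT_LAYOUT_DESC layout) {
        D3D12_GRAPHICS_PIPELINE_STATE_DESC desc = {};
        desc.pRootSignature        = rootSig;
        desc.VS                    = vs;      // compiled shader blobs
        desc.PS                    = ps;
        desc.InputLayout           = layout;
        desc.RasterizerState.FillMode = D3D12_FILL_MODE_SOLID;
        desc.RasterizerState.CullMode = D3D12_CULL_MODE_BACK;
        desc.BlendState.RenderTarget[0].RenderTargetWriteMask =
            D3D12_COLOR_WRITE_ENABLE_ALL;
        desc.SampleMask            = 0xFFFFFFFFu;
        desc.PrimitiveTopologyType = D3D12_PRIMITIVE_TOPOLOGY_TYPE_TRIANGLE;
        desc.NumRenderTargets      = 1;
        desc.RTVFormats[0]         = DXGI_FORMAT_R8G8B8A8_UNORM;
        desc.SampleDesc.Count      = 1;

        ComPtr<ID3D12PipelineState> pso;
        device->CreateGraphicsPipelineState(&desc, IID_PPV_ARGS(&pso));
        return pso;  // bind later with SetPipelineState on a command list
    }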


Or maybe AMD is making an end-run by enabling easy porting across Xbox, PS4, and PC (Steam), thus making their wares more desirable?


If true, this is a good move by Microsoft. The whole point of this exercise is to be closer to the metal, so why reinvent the wheel?

It's something of a Direct3D tradition to design the API around one hardware vendor's state of the art. Direct3D 9 was pretty much based on what the ATI Radeon 9700 could do. At the time this was a serious inconvenience to NVIDIA who had taken a different tack with the Geforce FX (I think it was -- it's been a while).


SemiAccurate is always accurate. One of my favorite sources.


According to wikipedia, SemiAccurate is a news/opinion site similar to The Inquirer. Is there enough information about the DX12 API yet to confirm this?


Charlie of SA used to write for The Inquirer. He has some damn good sources. I cannot recall a time when he was ever wrong, and he's broken some big stories (things like the Nvidia 8800/9800 GPU failures).

Out of all such rumor sites (Fudzilla and BSN being two other large ones), SA is the only one worth taking seriously.


Embrace, extend, extinguish.


Mantle is here and now. With the speed MS is moving at, that MO is not so scary anymore.


I am not a graphics / engine programmer, but I know a decent amount about what goes into programming a game, and I have to call "suspect" on the claim that low-level graphics API programming was heretofore only available on game consoles like the Microsoft Xbox, Microsoft Xbox 360, and Microsoft Xbox One!


From Microsoft's warped perspective, where dogfooding is a religion, it's more or less correct.

You've always been able to get a lot more out of consoles considering their specs; the 360 was marginally better than the state of the art of PC hardware, for a few months, but being able to code right to the metal (not as much as on an Amiga or Nintendo, but relative to the PC) gave an efficiency that made the games unmatched for years.

AMD's recently released Mantle is the first exception on the PC, and DirectX 12 is reportedly quite similar. Bing it on your Zune for further reading.


Unmatched? Compared to what? Those games look like shit compared to a PC that came out on the same day, let alone years later. Not to mention things like limitations on map sizes, player numbers, etc.


The issue with PC hardware is that it's just too varied to get really close to the metal. The deeper you get, after all, the more different the various GPU architectures become. Consoles, on the other hand, are all identical, so you can do the most unportable bitfucking to get the absolute most out of the hardware. This is also the reason why it takes years for games to really start to shine on a console: it takes that long for game developers to really get to know all the nitty-gritty details that just do not exist on the PC.

Console games often look mediocre despite the above because console hardware is far cheaper, and thus simply less powerful, than the hardware in high end gaming PCs, despite the fact that consoles benefit from economies of scale, and are priced at a loss to boot. It is not fair to compare the way a game looks on a $2000 PC to the way it looks on a $400 console.


That's what I've always heard too, but it seems that the tide is turning: http://blogs.nvidia.com/blog/2014/03/20/opengl-gdc2014/

(I can't answer for the technical details, though. It reads like hieroglyphics to me.)


> Those games look like shit compared to a PC that came out on the same day, let alone years later.

1. Developers had no experience with the hardware during the initial years of the consoles.

They had to switch from an out-of-order and forgiving x86 to an in-order and unforgiving PowerPC that had substantially less cache (32k/32k vs 64k/2MB/8MB) than its PC counterparts of the day. Just ask any PC-gone-console developer of that age about LHS[1] (a concrete example follows after this comment), or about the off-the-wall Cell Broadband Architecture[2].

2. Developers had to manage 512MiB between the GPU and CPU.

Everything has to fit into that including the operating system... and I found 1GiB uncomfortable for PCs in 2006!

+ It was split on PS3, and you had to DMA into 256k for SPUs.

+ EDRAM was slightly too small to fit 1280x720x32x4 render targets.

3. Developers had no guarantee of permanent storage.

So everything had to be streamed from disk... which is unsavoury for various reasons.

Contortionist programming springs to mind[3].

4. PCs get upgraded.

'Nuff said.

[1]: http://www.gamasutra.com/view/feature/132084/sponsored_featu...

[2]: https://www.youtube.com/watch?v=bR8CVLVmKQs&t=4m

[3]: http://doublebuffered.com/2010/03/17/gdc-2010-streaming-mass...

(I'm assuming Gen 7, i.e., the Xbox 360 and PS3.)
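To make point 1 concrete: the canonical LHS (load-hit-store) trap on those in-order PowerPC cores was float-to-int conversion, which round-trips through memory. A minimal illustration (the exact stall length varied by chip, so treat it as approximate):

    // On Xenon / the Cell PPU there is no direct move between float and
    // integer registers, so the compiler emits: convert in an FPU register,
    // store to the stack, then immediately load back into a GPR. On an
    // in-order core the load stalls for tens of cycles waiting on that
    // store, and none of this is visible in the C source.
    int float_to_int(float f) {
        return (int)f;  // fctiwz; store to stack; load from the same address
    }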


Compared to an equivalently specced PC. There's a huge platform overhead on PC that consoles don't have to bear:

https://twitter.com/ID_AA_Carmack/status/436012673243693056

https://twitter.com/ID_AA_Carmack/status/436012724791681024


Compared to a PC with similar specs and/or a reasonable price point? The $10,000 PC has always been able to outperform the $400 Xbox, but nobody cares about that.


A $500 PC can match/outperform both the Xbone and the PS4, depending on the game. In 2014.


But we're talking about a $400 Xbox in 2005. What PC hardware configuration from 2005 would still run modern games at a playable (not good) framerate on low?


On low and at 30fps? Probably most. You really think console hardware is some magical beast?


(Former AAA game dev, including a stint at Sony)

No, but console OSes and drivers are (well, compared to PC drivers).

There's an enormous amount going on between your code and the metal on a PC, even when writing C++ w/ OpenGL or DirectX. Driver overhead for graphics is HUGE (which is what this is about).

PC hardware comparable to the PS3/Xbox 360 performs significantly worse under real-world conditions due to the way the graphics stack is set up and programmed against. The new Direct3D 12 (as well as AMD's Mantle) is an attempt to tackle this.


> PC hardware comparable to the PS3/Xbox 360 performs significantly worse under real-world conditions due to the way the graphics stack is set up and programmed against.

I doubt the "significantly" part. And even if this was true 7 years ago, it's definitely not true now. Comparable hardware would mean something like a GTX 760 (both the PS4's APU and the 760 are around 1800 GFLOPS). That card can do everything the current consoles can.

Mantle is for lower end cards anyways, mid tier hardware like the GTX 760 and what's inside current consoles won't see a change that dramatic.


No, Mantle is for lower end CPUs. It's all about reducing the CPU bottleneck to feeding a graphics card effectively.

Which, incidentally, is the reason why the initiative came from AMD (also the strong emphasis on multithreading, since AMD sells cheap 8 core CPUs). If you can all of a sudden game on a weak CPU, then Intel chips will look less attractive in comparison.

Where you're seeing small improvements with higher-end cards, it's because they're pumping up the resolution, AA, etc., with the same number of draw calls. If you instead ramp the draw calls up, for example by putting a ton more objects in your scene, you'll be able to get much more out of high-end cards.
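For a sense of where that per-draw-call CPU time goes, here is a hedged sketch of a typical D3D11-era inner loop (the Object struct is illustrative, not from any real engine); every call in it crosses into the driver, which validates state and builds hardware commands on the spot:

    #include <d3d11.h>

    struct Object {  // illustrative per-object data
        ID3D11Buffer*             vb;
        ID3D11ShaderResourceView* texture;
        UINT stride, offset, indexCount;
    };

    void DrawScene(ID3D11DeviceContext* ctx, const Object* scene, size_t n) {
        for (size_t i = 0; i < n; ++i) {
            const Object& obj = scene[i];
            // Each call pays a user-to-driver transition plus validation,
            // so tens of thousands of objects can saturate a CPU core
            // while the GPU sits idle waiting for commands.
            ctx->IASetVertexBuffers(0, 1, &obj.vb, &obj.stride, &obj.offset);
            ctx->PSSetShaderResources(0, 1, &obj.texture);
            ctx->DrawIndexed(obj.indexCount, 0, 0);
        }
    }

Mantle/DX12-style APIs move that validation into pre-built pipeline and command-list objects, which is where the extra draw-call headroom comes from.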

Mantle isn't going to help at all in the move to 4k, but it really will allow for much more complex games on the PC, akin to when Total War was released.

I'd also not take the launch titles as a good indication of what the hardware of these consoles is capable of. They're usually rushed, ported from other platforms, etc. The hardware is capable of much more.

This is a good demo that goes into lots of detail:

http://www.youtube.com/watch?v=QIWyf8Hyjbg


The 360 was pretty close to the X1800 XT as far as I could ever ascertain; that card was released November 2005ish (http://www.bit-tech.net/hardware/graphics/2005/11/11/ati_x18...), and with the 360 released November 22, 2005, we can safely say there was at least parity with PC hardware at release.


You have to take the hardware specs into account. It's easy to overlook the crappiness of the hardware of those consoles.


"Bing it on your Zune for further reading." I lol'd.


I don't believe they are saying it was never available, but that previous versions of DirectX didn't provide it.


True, but still seems like lying by omission, or at least for PR gain...


Xbox APIs have never been as low level as other consoles'.


Isn't it just abstractions?

The reason Xbox APIs haven't given direct access to the GPU is for the same reason you wouldn't do this on Windows. The API gives a safe way, preferably with low overhead, to access resources that might be already being used. With the original Xbox, this was done to keep programming for the Xbox more or less the same as programming in DirectX on a Windows machine. Having comparable APIs makes porting significantly easier. The Xbox 360 maintained this paradigm.

If you consider the PIP (picture-in-picture) type of gaming that the Xbox One supports, there's no way a game can have direct access, because it would be fighting the kernel. Instead you are actually coding against a virtual device, so that the kernel can decide which instructions actually get executed.


I am a graphics engine programmer. This is not an absolute. Rather, a graphics API is an abstraction of a generic GPU, so it is never as low-level as it could be. The more the API provides ways to expose the underlying hardware more directly, the more the API can be said to enable "low-level" programming. It's relative. You can hear Carmack talk about this issue in some of his QuakeCon keynotes.


There are tradeoffs with that, too. The moment you provide low-level access, you make whatever low-level interface is supplied a standard to be supported now and forever. See: the VGA, or any of the Amiga chipsets. One thing that may fall out of this is that future GPU vendors would provide the low-level interface as an abstraction over what's really happening under the hood. And then developers will complain that they can't take advantage of the real chip's theoretical capabilities.


I generally ask graphics programmers this (no offense intended): what will you do once the graphics singularity is reached? People say video game graphics will be photorealistic in 10 years.


Movie CGI is already photorealistic. Minecraft is not. You can add an arbitrary (and exponentially growing) computational cost to your game by making the world more dynamic. The games we look at today as paragons of CG advancement are really just static meshes with a few small entities running around. Any step away from pre-baked, pre-compiled, and pre-made content will easily consume as many years of tech advancement as you let it. (For evidence, look at how long a modern level editor takes to bake in something as relatively simple as lighting.)


Right now, there is no shortage of things to work on, and things that have not been done yet. I am not currently aware of any horizon past which that won't be true. I don't see any problem yet, and at any rate, there are tons of other kinds of programming to do.


Holiday 2015? That's like, 1.5 years from now!? Or 10 internet years!


That's like forever away! Or two E3s away.


Now that sounds like an eternity.


ETERNITY! ETERNITY! ETERNITY!


Vaporware to scare developers off using Mantle?


Until Mantle is as portable as DirectX in terms of graphics card support, only console developers will care about it.

Most studios use engines nowadays, so Mantle's impact on studios, besides the AMD-blessed ones, remains to be seen.


Mantle isn't really usable yet either, but my point is that DX12 was announced as vaporware to distract attention from Mantle. See http://en.wikipedia.org/wiki/Vaporware


I guess this means that if I want to get the most out of my games I'll have to "upgrade" to Windows 8.1. Not super excited about having to configure away all of the metro garbage.


Ask your favourite developers to use OpenGL instead. Even XP users get the latest version of OpenGL.


As a DirectX developer who's tried out OpenGL a few times, I found it horrific and painful to program in. But then again, I'm sure OpenGL devs complain about DX in the same way.

I honestly think DirectX is a great API. And let's be honest: by the time this is released in "Holiday 2015", it'll be at least mid-2016 before anything comes out that uses it. Combine that with the rule about every other MS OS being good, and we should have a fairly decent Windows 9 to run these games on.


OpenGL is less painful in the long run, because you won't be tied to Microsoft's APIs. Those APIs weaken you and your ability to think about graphics programming.

All you need is a good GL wrapper. If writing your own is painful, just use someone else's. But it's important to be able to write your own, because if you can't, then that's what weakens you.


"Those APIs weaken ... your ability to think about graphics programming." - what is the justification for this? I don't really see that it follows, as virtually all of the interesting stuff (CPU code, shaders) is API-agnostic.


A GPU doesn't work in the way that Microsoft would have you believe it works. By restricting your worldview to Microsoft's predefined notion of what it should be, you'll never know the true extent of what you can do as a graphics programmer.

A secondary reason to avoid DX is that gaming on Windows has a high chance of being dead within the next decade. If that seems laughable, think of how laughable it would've sounded for anyone to say that about CDs for music in 2001.


Well, I was hoping for some further explanation, but for what I paid this is fine ;)

I was under the impression that DX11 (and DX10 before it) actually do a decent job of exposing basically everything that's widely supported, and in a manner that can be made to work tolerably efficiently across a range of different hardware. But I was told this by some guy from MS so maybe I shouldn't have expected him to say anything else.

Assuming that's the case, perhaps I'm just cheating by not thinking of the leftovers as useful. Not because the behaviour is useless, I mean, just because if you're targeting PC, you have to accept that the underlying hardware could be anything. OpenGL extensions could provide you with more, but support can be a bit hit or miss. Reminds me too much of MS-DOS.
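On the hit-or-miss extension point: the usual defensive move is to probe the driver at runtime and fall back when an extension is absent. A sketch using the GL 3.0-style query (GL_NV_bindless_texture is just an example name; on Windows the post-1.1 entry points themselves have to be loaded first, e.g. via GLEW):

    #include <GL/glew.h>   // provides glGetStringi and friends
    #include <cstring>

    // Assumes a current GL 3.0+ context with entry points already loaded.
    bool HasExtension(const char* name) {
        GLint count = 0;
        glGetIntegerv(GL_NUM_EXTENSIONS, &count);
        for (GLint i = 0; i < count; ++i) {
            const char* ext = reinterpret_cast<const char*>(
                glGetStringi(GL_EXTENSIONS, static_cast<GLuint>(i)));
            if (ext && std::strcmp(ext, name) == 0) return true;
        }
        return false;  // caller falls back to a plainer code path
    }

    // e.g.: bool bindless = HasExtension("GL_NV_bindless_texture");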

But I could have been misinformed, or my supposition is simply outdated, and the article suggests that could well have been the case, in which case my conclusion would be partly bogus too.


> By restricting your worldview to Microsoft's predefined notion of what it should be, you'll never know the true extent of what you can do as a graphics programmer

Can't you just as well say that by restricting your worldview to OpenGL's predefined notion of what it should be, you'll never know the true extent of what you can do as a graphics programmer? I probably don't have enough experience with it, but to me it seems that in the end both DirectX 11 and the latest OpenGL have pretty much equal capabilities, apart from some details? Not like DirectX 7, for instance, which obviously is capable of less.


OpenGL is driven by the hardware vendors. It's developed a lot like HTML5: extensions get exposed and tested, and then standardised based on that experience.


I definitely see Linux becoming a big platform for gaming, but I think they'll probably coexist.



For posterity:

http://aras-p.info/blog/2014/03/28/cross-platform-shaders-in...

(I've never worked on a cross-platform project where HLSL vs GLSL seemed to be a big deal.)


> All you need is a good GL wrapper. If writing your own is painful, just use someone else's. But it's important to be able to write your own, because if you can't, then that's what weakens you.

I have seen exactly zero good OpenGL wrappers. The stateful API makes it pretty much impossible to wrap in a way that would actually make it better.

I have also wasted a lot of time trying to implement some kind of a GL wrapper. I got something that was fairly comfortable to use for my own needs, but nowhere close to being a universal wrapper of any kind.
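For what it's worth, the typical shape such wrappers take is an RAII "scoped bind"; a minimal sketch is below (headers abbreviated). It also shows why wrapping only papers over the problem: the save/restore round-trip costs an extra driver call, and anything executed inside the scope still sees the mutated global binding.

    #include <GL/gl.h>

    // Save the current 2D texture binding, bind a new one, restore on exit.
    class ScopedTexture2D {
    public:
        explicit ScopedTexture2D(GLuint tex) {
            glGetIntegerv(GL_TEXTURE_BINDING_2D, &prev_);  // extra driver trip
            glBindTexture(GL_TEXTURE_2D, tex);
        }
        ~ScopedTexture2D() {
            glBindTexture(GL_TEXTURE_2D, static_cast<GLuint>(prev_));
        }
    private:
        GLint prev_ = 0;
    };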

Have you got a decent GL wrapper to recommend?


> As a DirectX developer who's tried out OpenGL a few times, I found it horrific and painful to program in. But then again, I'm sure OpenGL devs complain about DX in the same way.

As an OpenGL developer, not a day goes past without me wishing that the API was sensible like the DirectX API and not the state machine mess that it currently is.

I'm pretty sure any OpenGL programmer worth their salt will agree. Only some beginners with no practical experience think that OpenGL is a better API because it requires less boilerplate code to draw a triangle.
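A concrete taste of that state machine, with illustrative names: to change one texture's filtering you must bind it, which also silently changes what the next draw on that unit samples from.

    #include <GL/gl.h>

    void UpdateFilterThenDraw(GLuint texA, GLsizei vertexCount) {
        // Bind-to-edit: mutating texA's sampler state requires binding it...
        glBindTexture(GL_TEXTURE_2D, texA);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        // ...so if the caller expected some other texture to still be bound,
        // this draw now silently reads texA instead.
        glDrawArrays(GL_TRIANGLES, 0, vertexCount);
    }

D3D11, by contrast, hands you discrete objects (textures, sampler states) that you set explicitly on a device context, so there is no hidden ambient binding to trip over.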


> Even XP users get the latest version of OpenGL.

Only as long as nVidia and ATI keep releasing drivers for you. At some point XP will be abandoned.


Like it or not, but there's currently more reason for the manufacturers to support XP than even Linux. (~30% market share, 2nd most popular)


Agreed, but at this point, we should be getting users and things off XP, not on it.


OpenGL support seems rather iffy on Windows. Minecraft uses OpenGL and wouldn't run on my i7. At one point it wouldn't even load; a while later it ran, but at laughable framerates.


Did you have the drivers installed? Minecraft runs just fine on many, many, many windows i7 machines.


Exactly. On Windows, the OpenGL implementation is supplied by the graphics vendor: AMD, Nvidia, or, in the case of the i7, Intel.

Windows comes with its own OpenGL implementation, but it's an ancient version (1.1).
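Specifically, Microsoft's own opengl32.dll stops at OpenGL 1.1; everything newer lives in the vendor's driver and has to be fetched at runtime with a context current, which is what loaders like GLEW automate. A minimal sketch:

    #include <windows.h>
    #include <GL/gl.h>

    // Modern entry points are not exported by opengl32.dll, so they must be
    // looked up in the installed driver. Returns NULL when only the ancient
    // 1.1 fallback is present, i.e. when proper drivers aren't installed.
    typedef GLuint (APIENTRY* PFNGLCREATESHADERPROC)(GLenum type);

    PFNGLCREATESHADERPROC LoadCreateShader() {
        return reinterpret_cast<PFNGLCREATESHADERPROC>(
            wglGetProcAddress("glCreateShader"));
    }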


Well, before, it failed with an error about initializing graphics. When I tried a year later, that error no longer appeared, but the FPS was in the single digits. I didn't dig around for special OpenGL drivers, no; I just let Intel and MS push certified drivers out via the normal update mechanisms.


OpenGL has generally been great on Windows, due to the overwhelming demand for high performance graphics on Windows. While i7 GPUs are good in terms of integrated GPUs, they are still not great compared to discrete GPUs (try a GeForce or Radeon).


If you are serious about graphics, never invest in Intel graphics cards.

They are a running joke among graphics programmers.


They are getting much better, especially Iris.


Or you could just wait till a DX12 game is out and decide if it's worth upgrading. DX11 got a fairly lukewarm response from most studios when it came out.


It's not all that terrible. I've not been poked by metro once for about two months.


Cryengine, Unreal, and Unity are all moving to support Linux.

I wouldn't worry about this.


Yeah, that 1% of gamers will be really happy.


Did you know there's a Linux based console in development?


Steam Box? How many people are using it already?


I am not excited; as MS's track record shows, any of their "improvements" are not backwards compatible with old OSes. For example: IE11 is so awesome! SO AMAZING! Not compatible with Windows XP, yet Chrome and Firefox still are.

MS's desire to ship their latest OS has hurt developers over the years. I hope people still choose OpenGL over DX-whatever.


> Yet Chrome and Firefox still are

Older versions of FF are, yes, but the latest is not supported: https://support.mozilla.org/en-US/kb/firefox-no-longer-works...

There's no reason they should be expected to put dev resources into an OS that's 12 years old.


To clarify, current Firefox does support Windows XP, as long as it is up to date, which means service pack 3. If you are still using an _unpatched_ version of Windows XP, well, you're on your own.


> There's no reason they should be expected to put dev resources into an OS that's 12 years old.

Depending on the prevalence of XP usage among their users, there might be one, actually.


My point is the speed of abandonment. Yes, FF and Chrome are going to drop XP support real soon, at XP's EOL. But the creators of XP are not supporting it; that's my point. MS has a history (not contradicted even once yet) of not supporting operating systems that aren't the latest and greatest. I bet Win 7 won't be supported by newer IE versions in a year.

Which means IE11 may be great, but we're stuck with it forever just like we're stuck with IE8, etc.

And if you code for DX12, you will be abandoning like >50% of your audience, so nobody will code for it. Just like DX11 is only now gaining real steam.


> However the creators of XP are not supporting it.

How long should they be supporting an OS? You can't expect them to support it forever. No one complains that older phones aren't supported anymore, so why is the case of XP so special?


> MS's track record shows, any of their "improvements" are not backwards compatible with old OSes.

What do you mean? All new versions of all commercial software have new improvements/features. If they just released all of them for free on the previous version, they might as well just shut down their company.

> Not compatible with Windows XP, yet Chrome and Firefox still are.

Well, IE since version 7 uses mandatory integrity control, which is a kernel feature of Vista and above; XP only has DACLs. I believe what they did initially was just disable the protected-mode feature when IE ran on XP.

Firefox is woefully behind other browsers in this particular aspect, so it doesn't need to worry about it. Chrome does the same thing as IE on Vista+. They wrote their own sandbox because it's a cross-platform product, which is why it works on XP.

So certainly, Chrome would be your choice if you want a single browser that runs on XP as well as Vista. That would matter only to businesses or computer labs or places where there is a large deployment of software across multiple OS versions. For home users, it doesn't matter. People just use whatever version of windows came with the PC.
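For the curious, the integrity-level mechanism mentioned above is easy to observe. A hedged sketch that prints the current process token's level on Vista+ (error handling elided for brevity):

    #include <windows.h>
    #include <cstdio>

    int main() {
        HANDLE token = nullptr;
        OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &token);

        BYTE buffer[256];
        DWORD len = 0;
        GetTokenInformation(token, TokenIntegrityLevel,
                            buffer, sizeof(buffer), &len);

        auto* label = reinterpret_cast<TOKEN_MANDATORY_LABEL*>(buffer);
        DWORD rid = *GetSidSubAuthority(
            label->Label.Sid,
            *GetSidSubAuthorityCount(label->Label.Sid) - 1);

        // 0x1000 = low (what a sandboxed IE/Chrome tab runs at),
        // 0x2000 = medium (normal user), 0x3000 = high (elevated).
        std::printf("Integrity level RID: 0x%lx\n",
                    static_cast<unsigned long>(rid));
        CloseHandle(token);
        return 0;
    }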


Forza Motorsport 5 on PC (DX12): that's good news.


It's a tech demo only, I'd be very very surprised if we ever see it on PC.


Seems like a lot of work. They had to port, at least partially, some of the engine's core capabilities (graphics, physics, etc.). They could have just enhanced some existing PC game or demo instead. Maybe Forza will be something like a launch title for DX12?


Yeah right. When hell freezes over.


No, I am not excited. I expect this will be (again) used as an excuse to force me to upgrade either Windows, my video card, or both.


So don't upgrade? How are you being forced?


Inevitably a game will come out that is DX12-only, at which point you have to incur those costs if you want to play it.


Sure, and that would suck. There isn't any aspect of force that I can make out. I'm seeing it as 'minimum system requirements will change and I don't want them to.'

We've seen studios release DX11/Vista+ games with DX9 fallbacks so they still work on XP.


One API for every device is great for developers targeting the Xbox and other windows devices.


It makes the job of engine developers somewhat easier in the long run, but very few game developers are writing raw DirectX these days.

And (almost) nobody's targeting just Microsoft devices, even developers who are paid for temporary exclusives. It's all about the cross-platform engines.


Why should I be excited?


50% less CPU use for the main thread.

Mantle has the same CPU savings, and in discussions of it, game developers have talked about how in many (most?) games up to 50% of your CPU time is spent in the drivers. Considering how many games I've run into lately where single-core performance is the bottleneck, this doesn't surprise me at all.

Mantle and DX12 strip out a lot of layers, optimize a lot of what they keep, and spread a good chunk of the remaining load over multiple threads, meaning that instead of trying to use 120% of core 0 and 15% of cores 1-3, you can use 60% of core 0 and 25% of cores 1-3 (just as a hypothetical, made-up example).
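A sketch of what that multi-core spread looks like in code, using the D3D12 API as it eventually shipped (RecordDraws is a hypothetical stand-in for per-chunk recording; error handling elided). Each worker gets its own allocator and command list, since those are the pieces that must not be shared across threads; only submission is serialized:

    #include <d3d12.h>
    #include <wrl/client.h>
    #include <thread>
    #include <vector>
    using Microsoft::WRL::ComPtr;

    void RecordFrame(ID3D12Device* device, ID3D12CommandQueue* queue,
                     unsigned workers) {
        std::vector<ComPtr<ID3D12CommandAllocator>>    allocs(workers);
        std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workers);
        std::vector<std::thread> pool;

        for (unsigned i = 0; i < workers; ++i) {
            device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                           IID_PPV_ARGS(&allocs[i]));
            device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                      allocs[i].Get(), nullptr,
                                      IID_PPV_ARGS(&lists[i]));
            pool.emplace_back([&lists, i] {
                // RecordDraws(lists[i].Get(), i);  // hypothetical chunk work
                lists[i]->Close();  // finish recording on this thread
            });
        }
        for (auto& t : pool) t.join();

        // The only serialized step: hand all lists to the queue at once.
        std::vector<ID3D12CommandList*> raw(workers);
        for (unsigned i = 0; i < workers; ++i) raw[i] = lists[i].Get();
        queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    }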


Greater efficiency, more ways for devs to optimize their games. We should see some FPS improvement with DX12.


Better multi-core utilization. Less CPU use. Less dependency on single-threaded performance.


I somehow cannot access this page with IE 8. WTH, Microsoft?


You haven't heard? Microsoft has dropped support for XP.


I'm on Windows 7, just stuck with IE 8 here at work.


IE8 is a pre-HTML 5 browser. Glad to see more and more sites are dropping support for it. Install Chrome. You don't even need admin rights. Your IT dept. never has to know.


Closed source APIs like DirectX won't ever be as fast as an open source API standard like OpenGL.

This is why games like Left 4 Dead 2, using OpenGL, run substantially faster (more FPS): http://www.extremetech.com/gaming/133824-valve-opengl-is-fas...


Your first assertion has no basis in reality. There's nothing prohibiting a closed-source API from being as fast as, or faster than, an open-source one. And your second assertion does not imply that your first is correct.

It may be that the current examples of open source are faster than the current examples of closed source, and there may be very real advantages to the open-source approach that push the status quo in the direction you describe, but there's nothing mandating that. I'm not sure why you'd assert that there is.


I certainly stand corrected; I made the assumption that OpenGL was open source (which was incorrect).

The assertion I was intending to make was that widely used open source applications tend to be analyzed and improved by a larger community, and thus are more likely to be streamlined/efficient ("faster" was also not the correct term to describe efficient pipelines).

Although as someone stated further down, OpenGL is not actually open source, it's an open standard. So my original point is null and void as is.

I believe my hate for Microsoft had blinded me, forgive me.



But it's also not closed source, because OpenGL is just a specification, like Wayland.


The correct term is open standard.


> Closed source APIs like DirectX won't ever be as fast as an open source API standard like OpenGL.

Why are you comparing implementations and standards on a metric that's only applicable to implementations?


What is an API speed? The hardware that makes it happen determines the speed.


I confessed my idiocy earlier above, although I do believe OpenGL provides a more efficient pipeline to the hardware itself.



