DirectX 12: Its future in the console gaming space (specifically the XB1)

There are still a handful of fixed-function hardware units that handle clipping, scan conversion, depth testing, and spinning up pixel shaders.
OK, thanks. So even if they could be bothered to implement it as some hack, that would probably be S L O W.


Personally I'm totally fine with feature levels, but I suppose in practice they have a hard time fitting a wide array of hardware into such coarse buckets of functionality.
Since they strictly require D3D11-class hardware for Direct3D 12, they no longer have a "wide array of hardware", and the 9_x, 10_x and 11_x feature levels would not represent real hardware. These "down-level" feature levels are only required for compatibility with your legacy game assets such as HLSL shader code, but they would all be running on D3D11 hardware.

XB1 is not running DX11; it uses a bespoke API (commonly referred to as the XB1 API). The XB1 API is much closer to the metal than DX11.
Please explain how this is possible when it retains the same Direct3D 11 API?

Microsoft and AMD might have a very optimized "driver" and runtime that is tailored to the GCN architecture, and they might have extended the API with bundles or other improvements, but from all we know the resource-management and memory-management aspects of the D3D11 API have not gone anywhere on the Xbox One, whatever level of optimization they were able to apply.
On the other hand, Direct3D 12 does away with runtime resource and memory management entirely.
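
To make the contrast concrete, here's a minimal sketch, assuming the D3D12 API as it later shipped publicly (the preview builds may differ); the helper names are illustrative. In D3D11 the runtime/driver decides placement and tracks hazards; in D3D12 the application picks the heap type and owns synchronization itself:

[code]
#include <d3d11.h>
#include <d3d12.h>

// D3D11: the runtime/driver owns placement, renaming and hazard tracking.
ID3D11Buffer* CreateD3D11Buffer(ID3D11Device* dev, UINT bytes)
{
    D3D11_BUFFER_DESC desc = {};
    desc.ByteWidth = bytes;
    desc.Usage     = D3D11_USAGE_DEFAULT;        // driver decides where it lives
    desc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
    ID3D11Buffer* buf = nullptr;
    dev->CreateBuffer(&desc, nullptr, &buf);     // runtime tracks hazards for us
    return buf;
}

// D3D12: the application chooses the memory type and owns lifetime/sync.
ID3D12Resource* CreateD3D12Buffer(ID3D12Device* dev, UINT64 bytes)
{
    D3D12_HEAP_PROPERTIES heap = {};
    heap.Type = D3D12_HEAP_TYPE_DEFAULT;         // we pick GPU-local memory

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width            = bytes;
    desc.Height           = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    ID3D12Resource* buf = nullptr;
    dev->CreateCommittedResource(&heap, D3D12_HEAP_FLAG_NONE, &desc,
                                 D3D12_RESOURCE_STATE_COMMON, nullptr,
                                 IID_PPV_ARGS(&buf));
    return buf;                                  // no runtime hazard tracking here
}
[/code]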


From the Titanfall tech talk, there is evidence that DX11.X does not have any of the multithreaded draw-call performance of Mantle or D3D12.
Yep, that's why they had to completely throw away any resource management in the driver, which would have required insane levels of inter-process synchronization, and instead devised a new model where resources are static and immutable, so access to them can easily be multithreaded.
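
As a rough illustration of what that model enables, here's a minimal sketch of multithreaded command-list recording, using the names from the public d3d12.h that later shipped (not necessarily the 2014 preview); the function and its parameters are hypothetical:

[code]
#include <d3d12.h>
#include <thread>
#include <vector>

// Each thread owns its own allocator + command list, so no locks are needed;
// the resources referenced by the draws are immutable while recording.
void RecordInParallel(ID3D12PipelineState* pso,
                      std::vector<ID3D12CommandAllocator*>& allocs,
                      std::vector<ID3D12GraphicsCommandList*>& lists)
{
    std::vector<std::thread> workers;
    for (size_t i = 0; i < lists.size(); ++i)
    {
        workers.emplace_back([&, i] {
            lists[i]->Reset(allocs[i], pso);
            // ... record this thread's slice of the scene's draws ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();
    // Then submit all lists in one call on a single queue:
    // queue->ExecuteCommandLists((UINT)lists.size(),
    //     reinterpret_cast<ID3D12CommandList* const*>(lists.data()));
}
[/code]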
 
I think we had part of the answer in our hands for the last 3-4 days. We had some slides about DX12 from GDC 2014 and Build 2014 in this short period of time. Take a closer look at this slide:

[slide image: Forza DX12]


Some features are already on XB1:

1) Bundles, which are part of "CPU Overhead: Redundant Render Commands" (Page 26).

2) Nearly zero D3D resource overhead, which should be part of "Direct3D 12 – Command Creation Parallelism" (Page 33).
Great find, thanks! Completely missed this slide from the GDC DX12 session.


Please also note how on Slide 34 of the Build 2014 session "Direct3D 12 API Preview", they say "Command List submission cost reduced by WDDM 2.0" - which means it should also benefit the D3D11 runtime when they port the improvements from D3D11.X for Xbox One.

According to the presenter, "Descriptor Heaps & Tables" are essential for using Bundles on DX12.
If you check Slide 55 from "Direct3D 12 API Preview", they say that bundles are supported on all D3D12 hardware, but multiple draw calls (from NVidia OpenGL extensions?) aren't.
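
For reference, a minimal sketch of what a descriptor heap plus a descriptor-table bind looks like in the d3d12.h that eventually shipped (the preview API may have used different names); it assumes a root signature whose parameter 0 was declared as a descriptor table of SRVs:

[code]
#include <d3d12.h>

void BindViaDescriptorTable(ID3D12Device* dev, ID3D12GraphicsCommandList* cl)
{
    // One shader-visible heap holds all the CBV/SRV/UAV descriptors.
    D3D12_DESCRIPTOR_HEAP_DESC hd = {};
    hd.Type           = D3D12_DESCRIPTOR_HEAP_TYPE_CBV_SRV_UAV;
    hd.NumDescriptors = 256;
    hd.Flags          = D3D12_DESCRIPTOR_HEAP_FLAG_SHADER_VISIBLE;

    ID3D12DescriptorHeap* heap = nullptr;
    dev->CreateDescriptorHeap(&hd, IID_PPV_ARGS(&heap));

    // Binding becomes one pointer-sized update instead of many per-slot calls.
    cl->SetDescriptorHeaps(1, &heap);
    cl->SetGraphicsRootDescriptorTable(0, heap->GetGPUDescriptorHandleForHeapStart());
}
[/code]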

According to Microsoft, DX12 will have two levels of functionality: one low-level, and the other a superset of DX11 rendering functionality.
I'd think they were rather talking about expansion of D3D11 API with some features from D3D12 and D3D11.X from Xbox One - so D3D11 essentially becomes a "resource-managed" version of D3D12, probably with additional up-level "feature level 12_0".

That would require [post=1836199]an entirely new driver model on the PC [/post] that serves both D3D11.X and D3D12 runtimes, which is consistent with remarks cited above.
 
I don't know whether what I'm going to say is true or not (please correct me if I'm wrong). I think we had part of the answer in our hands for the last 3-4 days. We had some slides about DX12 from GDC 2014 and Build 2014 in this short period of time. Take a closer look at this slide:

[slide image: Forza DX12]


Some features are already on XB1:

1) Bundles, which are part of "CPU Overhead: Redundant Render Commands" (Page 26) - see the code sketch below.
[slide image]


2) Nearly zero D3D resource overhead, which should be part of "Direct3D 12 – Command Creation Parallelism" (Page 33).

The GDC slide implies that these are parts of DX12 which can be found on XB1 today.
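
For those following along, a minimal sketch of recording and replaying a bundle, using the names from the public d3d12.h that later shipped (the build shown at GDC may differ); the allocator is assumed to have been created with D3D12_COMMAND_LIST_TYPE_BUNDLE:

[code]
#include <d3d12.h>

ID3D12GraphicsCommandList* RecordBundle(ID3D12Device* dev,
                                        ID3D12CommandAllocator* bundleAlloc,
                                        ID3D12PipelineState* pso)
{
    ID3D12GraphicsCommandList* bundle = nullptr;
    dev->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_BUNDLE,
                           bundleAlloc, pso, IID_PPV_ARGS(&bundle));

    // The "redundant render commands" are recorded once...
    bundle->IASetPrimitiveTopology(D3D_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
    bundle->DrawInstanced(36, 1, 0, 0);
    bundle->Close();
    return bundle;
}

// ...and replaying them each frame costs a single call on the direct list:
//   directList->ExecuteBundle(bundle);
[/code]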

Features that aren't on XB1:

1) Pipeline State Objects (PSOs) - see the sketch after this list.
[slide image]


2) Resource Binding.

[slide image]
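
To give a feel for item 1, here's a minimal PSO-creation sketch using the public d3d12.h names (which may not match the preview builds); vs/ps are assumed to be precompiled shader blobs and rootSig an existing root signature:

[code]
#include <d3d12.h>

ID3D12PipelineState* CreatePso(ID3D12Device* dev, ID3D12RootSignature* rootSig,
                               D3D12_SHADER_BYTECODE vs, D3D12_SHADER_BYTECODE ps)
{
    // Everything D3D11 spread across many small state objects is baked into
    // one immutable object, created up front instead of at draw time.
    D3D12_GRAPHICS_PIPELINE_STATE_DESC d = {};
    d.pRootSignature           = rootSig;
    d.VS                       = vs;
    d.PS                       = ps;
    d.RasterizerState.FillMode = D3D12_FILL_MODE_SOLID;
    d.RasterizerState.CullMode = D3D12_CULL_MODE_BACK;
    d.BlendState.RenderTarget[0].RenderTargetWriteMask = D3D12_COLOR_WRITE_ENABLE_ALL;
    d.SampleMask               = 0xFFFFFFFFu;
    d.PrimitiveTopologyType    = D3D12_PRIMITIVE_TOPOLOGY_TYPE_TRIANGLE;
    d.NumRenderTargets         = 1;
    d.RTVFormats[0]            = DXGI_FORMAT_R8G8B8A8_UNORM;
    d.SampleDesc.Count         = 1;

    ID3D12PipelineState* pso = nullptr;
    dev->CreateGraphicsPipelineState(&d, IID_PPV_ARGS(&pso));
    return pso;   // at draw time, switching is one SetPipelineState() call
}
[/code]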


These features will be available on XB1 later. Also, "Descriptor Heaps & Tables", which are a sort of bindless rendering (page 19, under "CPU Overhead: Redundant Resource Binding"), would be possible only on GPUs that are fully DX11.2 capable (Tier 2) and beyond. Considering that both DX11.2 and DX12 were announced for XB1, and the DX team is prototyping DX12 on XB1 HW right now, it's likely that Descriptor Heaps & Tables will be available on XB1, too.

[slide image]

So based on these findings (correct/forgive me if I'm wrong), I think the impact of DX12 on XB1 would be considerable. Everyone can download/view the PPT/presentation video files from here.

Edit: According to the presenter, "Descriptor Heaps & Tables" are essential for using Bundles on DX12. The presenter at GDC 2014 called them resource tables:

http://forum.beyond3d.com/showpost.php?p=1838040&postcount=7747

So can I say that XB1 is a "Tier 2" DX11.2 capable GPU?
I think Dave hinted at that before, that the Xbox One's GPU is a Tier 2 GPU.

I don't know what Tier 2 is, but it should have some advantages.
 
Seems to be a lot of discussion around DX12 feature-set-to-HW mapping on existing "DX11+" cards (i.e. we need new hardware to take direct advantage of new features). What I'm gathering from the presentations I've seen is that while from a "feature set" perspective DX11 maps to current card capabilities, DX11 software state representations and processing elements don't directly map to hardware state representations and processing elements, and as such incur significant performance penalties in managing and translating those mappings.

DX12 seems mostly about removing some of that abstraction so that the API more closely maps to how the hardware actually works, from GCN and Fermi on. So in essence it's not about having to have hardware follow the API, but rather the API following the hardware.
 
Please explain how this is possible when it retains the same Direct3D 11 API?

Microsoft and AMD might have a very optimized "driver" and runtime that is tailored to the GCN architecture, and they might have extended the API with bundles or other improvements, but from all we know the resource-management and memory-management aspects of the D3D11 API have not gone anywhere on the Xbox One, whatever level of optimization they were able to apply.
On the other hand, Direct3D 12 does away with runtime resource and memory management entirely.

Perhaps I'm overstating the case, but both 'bundles make rendering nearly free' and 'nearly zero D3D resource overhead' sound very different from vanilla DX11. Resource binding and PSOs are unique to DX12, and as you say they are rewriting whole chunks of the library and drivers to account for these and the other changes, making it radically different from DX11 - but perhaps not so much from DX11.X?

I guess I have a hard time believing that DX11.X is such a bad fit for its underlying architecture that DX12 will unlock radically more performance. The resource binding stuff sounds very interesting in the context of the ESRAM setup, in that it would seem to allow developers to explicitly decide what goes where, with less API overhead/meddling.
 
Seems to be a lot of discussion around DX12 feature-set-to-HW mapping on existing "DX11+" cards (i.e. we need new hardware to take direct advantage of new features). What I'm gathering from the presentations I've seen is that while from a "feature set" perspective DX11 maps to current card capabilities, DX11 software state representations and processing elements don't directly map to hardware state representations and processing elements, and as such incur significant performance penalties in managing and translating those mappings.
True, but Direct3D 12 doesn't directly map them to hardware elements either. I get the impression that draw calls and resource/pipeline views mostly use the very same high-level D3D11 structures, though reorganized in a more efficient pattern to allow multithreading. They are not directly bound to low-level architecture details such as the actual instruction set or the number/width of actual processor registers, though of course they depend on the actual capabilities of the underlying hardware for things like register swizzling. On the other hand, resource management seems to have been completely reworked, to the point of getting rid of surface format descriptors and the resource creation/destruction methods known since at least D3D8.
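
A minimal sketch of that reworked model, again assuming the resource API from the d3d12.h that later shipped (not confirmed for the 2014 preview): the application allocates a raw heap itself and places resources into it at explicit offsets.

[code]
#include <d3d12.h>

ID3D12Resource* PlaceBufferInHeap(ID3D12Device* dev, UINT64 heapBytes)
{
    // The app owns the raw allocation; nothing like D3D11's automatic
    // renaming or eviction happens behind its back.
    D3D12_HEAP_DESC hd = {};
    hd.SizeInBytes     = heapBytes;
    hd.Properties.Type = D3D12_HEAP_TYPE_DEFAULT;
    hd.Flags           = D3D12_HEAP_FLAG_ALLOW_ONLY_BUFFERS;

    ID3D12Heap* heap = nullptr;
    dev->CreateHeap(&hd, IID_PPV_ARGS(&heap));

    D3D12_RESOURCE_DESC rd = {};
    rd.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    rd.Width            = heapBytes;
    rd.Height           = 1;
    rd.DepthOrArraySize = 1;
    rd.MipLevels        = 1;
    rd.SampleDesc.Count = 1;
    rd.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    ID3D12Resource* buf = nullptr;
    dev->CreatePlacedResource(heap, 0 /*offset*/, &rd,
                              D3D12_RESOURCE_STATE_COMMON, nullptr,
                              IID_PPV_ARGS(&buf));
    return buf;
}
[/code]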


They also promised new hardware features, some of which are detailed in a "hidden" section at the end of the PowerPoint file from the Build session ("conservative rasterization" and "pixel shader UAV ordering" on slide 56), but these could just as well be included in the D3D11-like API layer as "feature level 12_0".
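
For what it's worth, in the d3d12.h that eventually shipped, conservative rasterization is exposed as a plain rasterizer-state field in the PSO description (support is queried first; the snippet below assumes the feature check passed):

[code]
// Assumes D3D12_FEATURE_DATA_D3D12_OPTIONS reported a conservative
// rasterization tier above D3D12_CONSERVATIVE_RASTERIZATION_TIER_NOT_SUPPORTED.
D3D12_GRAPHICS_PIPELINE_STATE_DESC desc = {};
desc.RasterizerState.ConservativeRaster = D3D12_CONSERVATIVE_RASTERIZATION_MODE_ON;
[/code]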

both 'bundles make rendering nearly free' and 'nearly zero D3D resource overhead' sound very different from vanilla DX11.
Microsoft kind of confirmed that both of these features are present in D3D11 for the Xbox One [post=1840169] in the Forza DX12 slide discussed above[/post]. This was a Microsoft GDC talk and the live blog of the session is here: http://www.pcper.com/news/Graphics-Cards/Microsoft-DirectX-12-Live-Blog-Recap


[slide image: Forza DX12]


I think Dave hinted at that before, that the Xbox One's GPU is a Tier 2 GPU.
[post=1780981]If you follow past discussions[/post], only GCN 1.1 GPUs - that includes Xbox One, R9 290 and R7 260 - support Tier 2, which offers additional texture filtering modes and other minor features comparing to Tier 1.
 
PS4 doesn't leverage OpenGL. It uses GNM.

More here. It's worth noting that the easier API, GNMX, has a 'significant CPU overhead', so it's true that games on PS4 could be quite hampered by API choice. I suppose that in theory, if DX12-whatever on XB1 is far more efficient than GNMX on PS4, that could make up a performance deficit, but GNM-based titles will see little overhead on PS4.

Thanks for the article btw Shifty; a little off topic, but it gave a better view of the developer landscape in general. I was responsible for porting a game from OpenGL to PSN@Home's Lua SDK, and the reality is, I had to use the tools that were provided. Even if somehow I had been granted access to lower-level hardware (via Lua script? lol, doubtful), I couldn't have done anything with it [I'm not educated in GPU]; I was struggling to deliver that project as it was (I already had to learn Lua, and I did not know how to make a scripting language run fast or which calls were taking performance hits, so I just did what I could).

So I think, seeing this article and the talk about different APIs: if you don't have the talent on your team to take advantage of these lower-level APIs in the time frame you need them, and you don't have time to learn, it's quite possible you're stuck with just using the shitty tools that are provided to you, branching out where required. It's nice to know I'm not the only guy learning on the job.

Now that I've completed that thought: even if DX11.X granted low-level access, if you didn't know how to leverage it and DX12 was just better at that type of thing, I guess you could see an improvement from talentless guys like myself.

The long-haul story is that in 2-3 years' time we should actually be expecting titles that perform where people expect them to.
 
Now that I've completed that thought: even if DX11.X granted low-level access, if you didn't know how to leverage it and DX12 was just better at that type of thing, I guess you could see an improvement from talentless guys like myself.
Development covers a wide range of abilities and targets nowadays, from the AAA hardcore first-party to the casual/indie developer using middleware to the scripted expansion type such as Home developers. That's where performance needs to cover two different requirements -

1) high level, abstracted, 'sloppy' code full of ifs and human logic, with the system needing to optimise out-of-order instructions etc. to get reasonable throughput.

2) low level, high performance code in engines and middleware to extract the best from the system.

I'm not sure DX12 targets either particularly; both will see benefits. After all, the high-level languages sit on top of low-level frameworks. In any console we should see the option for devs to go low-level and extract high performance from the hardware, though, which is why those devs/engines shouldn't see high gains from an API change.
 
Forgive me if this is a stupid question, but... doesn't PS4 run DirectX as well? Would it not be able to take the same advantage of DX12 as any other computer?
 
Forgive me if this is a stupid question, but... doesn't PS4 run DirectX as well? Would it not be able to take the same advantage of DX12 as any other computer?

No, it doesn't. The APU is obviously capable of supporting DirectX, but the OS isn't.
 
Forgive me if this is a stupid question, but... doesn't PS4 run DirectX as well?

No. To be pedantic, DirectX is a software layer (API and library) that is only available on Microsoft ecosystems.

However, when people say DX12 Hardware, they typically mean the hardware capability levels that match up to the software, but not specifically the software layer.
 
Forgive me if this is a stupid question, but... doesn't PS4 run DirectX as well? Would it not be able to take the same advantage of DX12 as any other computer?

Not as any other computer, since the PS4 doesn't support the API directly. But yes, I believe it will take advantage of it anyway.
The PS4 has a wrapper that can convert DX titles to its own API. This is working as we speak with DX11.
However, in the same way DX11 games won't get any benefit from installing DX12 on your machine, since current games do not make direct (low-level) hardware calls, the PS4 conversions do not benefit from its low-level API, but just use compatible commands.

However, with DX12 this wrapper can be updated to convert the new DX12 low-level accesses to the PS4's own low-level accesses. And as I see it, this will allow converted games low-level access to the hardware, thus benefiting the PS4.

Please correct me if wrong.
 
Not as any other computer, since the PS4 doesn't support the API directly. But yes, I believe it will take advantage of it anyway.
The PS4 has a wrapper that can convert DX titles to its own API. This is working as we speak with DX11.
However, in the same way DX11 games won't get any benefit from installing DX12 on your machine, since current games do not make direct (low-level) hardware calls, the PS4 conversions do not benefit from its low-level API, but just use compatible commands.

However, with DX12 this wrapper can be updated to convert the new DX12 low-level accesses to the PS4's own low-level accesses. And as I see it, this will allow converted games low-level access to the hardware, thus benefiting the PS4.

Please correct me if wrong.

Is it really that straightforward?
 
PS4 has a wrapper that can convert DX titles to its own API. This is working as we speak with DX 11.
I really don't think that's true. I think a lot of people got confused between DX11 class hardware and features, and actually running DX11 (happens all the time!). PS4 has a low level API and a wrapper around that which presents higher level access much more akin to DX or OGL, but it doesn't have a DX wrapper AFAIK. There was some DX wrapper stuff implemented by TB for Sacred 2 on PS3 who went on to work for SCEE, but I don't think that resulted in a DX wrapper and easy DX game porting. The references I've found to DX11 on PS4 are people confusing the statements about Sony's APIs being DX11+ featureset.
 
I really don't think that's true. I think a lot of people got confused between DX11 class hardware and features, and actually running DX11 (happens all the time!). PS4 has a low level API and a wrapper around that which presents higher level access much more akin to DX or OGL, but it doesn't have a DX wrapper AFAIK. There was some DX wrapper stuff implemented by TB for Sacred 2 on PS3 who went on to work for SCEE, but I don't think that resulted in a DX wrapper and easy DX game porting. The references I've found to DX11 on PS4 are people confusing the statements about Sony's APIs being DX11+ featureset.

Well... I admit my bad.
Apparently the conversion is done in the compiler itself.
I quote from The Crew development team, in an interview with Eurogamer:

"Sony has made a big deal about the accessibility of the PS4 hardware, and a key element of that would be the quality of the toolchain - the series of programs used to create compiled code. For the PS4 developers, the use of the established Visual Studio environment proves to be a key benefit, and the extent to which Sony has acknowledged and supported cross-platform game-makers is self-evident. There are even options within Sony's compiler specifically added in order to increase compatibility with the Microsoft counterpart used in compiling DirectX 11 games."
 
That doesn't even read as full conversion. Games written in VS for PC and DX will still need to be ported to PS4 and its own APIs, but there are some assists, as I understand it.
 
Well... I admit my bad.
Apparently the conversion is done in the compiler itself.
I quote from The Crew development team, in an interview with Eurogamer:

"Sony has made a big deal about the accessibility of the PS4 hardware, and a key element of that would be the quality of the toolchain - the series of programs used to create compiled code. For the PS4 developers, the use of the established Visual Studio environment proves to be a key benefit, and the extent to which Sony has acknowledged and supported cross-platform game-makers is self-evident. There are even options within Sony's compiler specifically added in order to increase compatibility with the Microsoft counterpart used in compiling DirectX 11 games."

From the Crew piece:
http://www.eurogamer.net/articles/digitalfoundry-how-the-crew-was-ported-to-playstation-4

A more crucial issue is that, while the PS4 toolchain is designed to be familiar to those working on PC, the new Sony hardware doesn't use the DirectX API, so Sony has supplied two of their own.

"The graphics APIs are brand new - they don't have any legacy baggage, so they're quite clean, well thought-out and match the hardware really well," says Reflections' expert programmer Simon O'Connor.

"At the lowest level there's an API called GNM. That gives you nearly full control of the GPU. It gives you a lot of potential power and flexibility on how you program things. Driving the GPU at that level means more work."

Sony has talked about its lower-level API at GDC, but wouldn't disclose its name, so at least now we know what it's called (the PS3 equivalent is GCM, for what it's worth) but what about the "wrapper" code supplied by Sony that is supposed to make development simpler?

"Most people start with the GNMX API which wraps around GNM and manages the more esoteric GPU details in a way that's a lot more familiar if you're used to platforms like D3D11. We started with the high-level one but eventually we moved to the low-level API because it suits our uses a little better,"
says O'Connor, explaining that while GNMX is a lot simpler to work with, it removes much of the custom access to the PS4 GPU, and also incurs a significant CPU hit.

A lot of work was put into the move to the lower-level GNM, and in the process the tech team found out just how much work DirectX does in the background in terms of memory allocation and resource management. Moving to GNM meant that the developers had to take on the burden there themselves, as O'Connor explains:

"The Crew uses a subset of the D3D11 feature-set, so that subset is for the most part easily portable to the PS4 API. But the PS4 is a console not a PC, so a lot of things that are done for you by D3D on PC - you have to do that yourself. It means there's more DIY to do but it gives you a hell of a lot more control over what you can do with the system."

Another key area of the game is its programmable pixel shaders. Reflections' experience suggests that the PlayStation Shader Language (PSSL) is very similar indeed to the HLSL standard in DirectX 11, with just subtle differences that were eliminated for the most part through pre-process macros and what O'Connor calls a "regex search and replace" for more complicated differences.

I initially only recalled the last bolded statement about regex search and replace, and thought that applied to the whole of the system. Of course only a portion of it is as easily managed. I also remember reading lots about "wrappers" for various aspects of D3D, so until I actually read something along the lines of this excellent DF article I thought very similarly.

The underlined part puts me in mind of the DX12/Mantle discussion.
 
You can make your own DX12 wrapper for the PS4 if you want.

...for what it's worth, in my experience game engines have a specific folder with the 'low-level engine' files of choice (i.e. say /dx11, /ios, /xbone, /ps4, whatever) and all the engine commands go there (plus a ton of #ifdefs here and there for the feature set).

I believe that properly designed engines (not ones "grown organically") do have such stuff among their requirements, no? Even if they never ported it, I mean.
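
A minimal sketch of that folder-per-backend pattern; all names here are illustrative, not from any real engine:

[code]
#pragma once

struct DrawCall { /* mesh, material, transforms ... */ };

// One thin interface the gameplay/renderer code compiles against.
class IGpuBackend {
public:
    virtual ~IGpuBackend() = default;
    virtual void Submit(const DrawCall& dc) = 0;
};

// The per-platform folder supplies the concrete implementation,
// selected at compile time via the usual #ifdef soup.
#if defined(PLATFORM_D3D11)
  #include "dx11/GpuBackendD3D11.h"
#elif defined(PLATFORM_XBONE)
  #include "xbone/GpuBackendD3D11X.h"
#elif defined(PLATFORM_PS4)
  #include "ps4/GpuBackendGnm.h"
#endif
[/code]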
 