DirectX 12: Its future in the console gaming space (specifically the XB1)


This is great news, so we basically get the best of both worlds - graphics hardware continues to progress at the feature level while even older GPUs get the full benefit of the more efficient API.

This does raise some interesting questions though...

1. Will these features merely be the DX11.2 features already included in GCN 2.0?
2. If they go beyond that then presumably the XB1 will only be 'partially' DX12 compliant.
3. Will Maxwell support the new features or will we have to wait for Volta?
 
This is great news, so we basically get the best of both worlds - graphics hardware continues to progress at the feature level while even older GPUs get the full benefit of the more efficient API.

This does raise some interesting questions though...

1. Will these features merely be the DX11.2 features already included in GCN 2.0?
2. If they go beyond that then presumably the XB1 will only be 'partially' DX12 compliant.
3. Will Maxwell support the new features or will we have to wait for Volta?

1. I didn't even know the GCN 2.0 spec had been revealed? Are they jumping from 1.0 -> 1.1 -> 2.0? Must be a large jump if this is true.
2. I don't know if this is more than anecdotal proof, as people have been known to say the wrong things ("Xbox One will receive full DX12") https://twitter.com/XboxP3/status/446873135594225664 - Phil Spencer
 
@iroboto
In addition to your post:

[attached image]
 
But that post doesn't "explicitly" talk of hardware features...

I mean... consoles have always had the advantage of developers being able to work much closer to the hardware. That could very well be what DX12 will be in the future, too. And by virtue of not having to write the same renderer twice for PC and consoles, the console will benefit, as all optimizations done for either platform will lead to better performance for both systems.
 
1. I didn't even know the GCN 2.0 spec had been revealed? Are they jumping from 1.0 -> 1.1 -> 2.0? Must be a large jump if this is true.

There's not really an official designation but I was using 2.0 to refer to the IP in the consoles and Bonaire/Hawaii.

2. I don't know if this is more than anecdotal proof, as people have been known to say the wrong things ("Xbox One will receive full DX12") https://twitter.com/XboxP3/status/446873135594225664 - Phil Spencer

I wouldn't read too much into that statement. The XB1 getting the "full DX12 API" could mean almost anything. Since it's going to almost certainly be a custom version of the API for the console in the first place, it doesn't really say anything about it being a feature match for the PC version. One thing we do know is that the XB1 has the same graphics IP as Hawaii, so if Hawaii doesn't support the full DX12 feature set (and we don't know whether it does or doesn't at the moment - just that Kepler doesn't) then neither will the XB1.
 
There's not really an official designation but I was using 2.0 to refer to the IP in the consoles and Bonaire/Hawaii.



I wouldn't read too much into that statement. The XB1 getting the "full DX12 API" could mean almost anything. Since it's going to almost certainly be a custom version of the API for the console in the first place, it doesn't really say anything about it being a feature match for the PC version. One thing we do know is that the XB1 has the same graphics IP as Hawaii, so if Hawaii doesn't support the full DX12 feature set (and we don't know whether it does or doesn't at the moment - just that Kepler doesn't) then neither will the XB1.

Absolutely agree, it's the Tier-1, Tier-2 Tiled Resources discussion all over again. But recall that there were features in the Xbox 360 that were in DX10, yet the Xbox 360 was never fully DX10 compatible either, because the console was custom.
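
To make the tier point concrete for anyone who didn't follow that discussion: on PC these things end up as optional, tiered capabilities that the app has to query per GPU rather than guaranteed features. A minimal sketch of that pattern using the existing D3D11.2 Tiled Resources query (assumes a Windows 8.1 SDK; this is only an illustration of the caps-query pattern, not DX12 code):

Code:
#include <d3d11_2.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    // Create a device on the default hardware adapter.
    ID3D11Device* device = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, nullptr)))
        return 1;

    // Ask the runtime which Tiled Resources tier this GPU/driver pair exposes.
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1,
                                              &opts, sizeof(opts))))
    {
        switch (opts.TiledResourcesTier)
        {
        case D3D11_TILED_RESOURCES_NOT_SUPPORTED: printf("Tiled resources: not supported\n"); break;
        case D3D11_TILED_RESOURCES_TIER_1:        printf("Tiled resources: Tier 1\n"); break;
        case D3D11_TILED_RESOURCES_TIER_2:        printf("Tiled resources: Tier 2\n"); break;
        default:                                  printf("Tiled resources: unknown/higher tier\n"); break;
        }
    }
    device->Release();
    return 0;
}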

And X1 is Hawaii?? Are you sure it isn't Sea Islands?
"Just like our friends we're based on the Sea Islands family. We've made quite a number of changes in different parts of the areas..."
http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects
 
And X1 is Hawaii?? Are you sure it isn't Sea Islands?

The IP naming schemes from AMD are very mixed up. I'm not sure there even is an official "Sea Islands" IP anymore. However I'm pretty sure the consoles and Hawaii (as well as Bonaire and Kaveri) share the same IP, which you could refer to as Sea Islands or Volcanic Islands, I guess.
 
The way I understand it is that Bonaire aka Sea Islands is GCN 1.1. Volcanic Islands is GCN 1.0, which includes the 7750, 7770, 7850, 7870, 7950 and 7970. Hawaii is Sea Islands based like Bonaire. So technically the X1 and PS4 are from the same arch as the 7790 and the R9 290.
 
The way I understand it is that Bonaire aka Sea Islands is GCN 1.1. Volcanic Islands is GCN 1.0, which includes the 7750, 7770, 7850, 7870, 7950 and 7970. Hawaii is Sea Islands based like Bonaire. So technically the X1 and PS4 are from the same arch as the 7790 and the R9 290.

That's how I understand it except that GCN1.0 is Southern Islands. Volcanic Islands was supposed to be the code name for Hawaii as far as I know but was more or less dropped as a designation altogether.
 
http://arstechnica.com/gaming/2014/...ovements-for-existing-hardware-in-directx-12/

There are other optimizations to reduce system loads in DirectX 12 as well, Gosalia said. Since "most apps and games have a lot of similarity in what they draw from one frame to another," DirectX 12 has algorithms to reduce the overhead from recalculating minor state changes in a scene. A new heap implementation will amortize the creation and destruction of resources, limiting system costs. DirectX 12 will introduce a new, faster resource binding model and allow for faster reuse of memory freed up by used resources that are no longer needed. Developers will also be able to decompress ASTC and JPEG images in hardware.
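
As a side note on the "similarity from one frame to another" point: the general idea is the classic redundant-state filter, just pushed down into the runtime/driver (DX12 additionally bakes most of this state into pipeline objects created up front). A toy, API-agnostic sketch of the concept, using a hypothetical renderer-side cache rather than actual DX12 code:

Code:
#include <cstdint>
#include <cstdio>

// Hypothetical bundle of pipeline state a renderer sets per draw.
struct RenderState {
    uint32_t shaderId = 0;
    uint32_t blendMode = 0;
    uint32_t rasterizer = 0;
    bool operator==(const RenderState& o) const {
        return shaderId == o.shaderId && blendMode == o.blendMode && rasterizer == o.rasterizer;
    }
};

// Only forwards a state change to the expensive validation path when
// something actually differs from what was set last time.
class StateCache {
public:
    void Set(const RenderState& s) {
        if (s == current_) { ++skipped_; return; }  // unchanged since last frame: skip
        current_ = s;
        ++applied_;
        // ...expensive validation / hardware programming would go here...
    }
    void Report() const { printf("applied %u, skipped %u\n", applied_, skipped_); }
private:
    RenderState current_{};
    uint32_t applied_ = 0, skipped_ = 0;
};

int main() {
    StateCache cache;
    RenderState frameState{42, 1, 0};
    for (int frame = 0; frame < 100; ++frame)  // "most games draw much the same thing every frame"
        cache.Set(frameState);
    cache.Report();  // -> applied 1, skipped 99
    return 0;
}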
Other than the Xbox One and its JPEG move engine, do any other current DX11 GPUs which now claim to be DX12 compatible actually have hidden hardware for ASTC and JPEG acceleration that just hasn't been exposed up to now? If not, will ASTC and JPEG acceleration be emulated either on the GPU with shaders or by the CPU for older GPUs, or will it simply be an optional part of the DX12 spec?
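
On the "optional part of the spec" option: per-format support is already exposed today as a per-GPU caps query, so my guess is any ASTC/JPEG capability would surface the same way where the hardware exists. A small D3D11 sketch of that pattern, using BC7 purely as a stand-in since desktop DXGI has no ASTC format enum to query at the moment:

Code:
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device* device = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, nullptr)))
        return 1;

    // Per-format caps query: BC7 here is only a placeholder for how a new
    // compressed format (e.g. ASTC) would most likely be reported.
    UINT support = 0;
    if (SUCCEEDED(device->CheckFormatSupport(DXGI_FORMAT_BC7_UNORM, &support)))
        printf("BC7 as a Texture2D: %s\n",
               (support & D3D11_FORMAT_SUPPORT_TEXTURE2D) ? "supported" : "not supported");

    device->Release();
    return 0;
}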

And along those lines, what is the most likely reason that nVidia can support DX12 on all their DX11 GPUs back to Fermi while AMD won't be supporting DX12 on their VLIW5 and VLIW4 DX11 GPUs and Intel won't support it on Ivy Bridge?

The possibilities seem to be:

1. AMD's VLIW GPUs and Intel's Ivy Bridge are hardware compatible with DX12 after all, but AMD and Intel for whatever reason don't want to write DX12 drivers for them.

2. Just as AMD helped drive tessellation into DX11 and tiled resources into DX11.2, nVidia may have successfully lobbied this round to define DX12 in a way that supports Fermi.

3. nVidia's GPU architects are actually able to anticipate the direction of graphics technology years in advance of others.

I'm guessing it's more likely some combination of 1 and 2.

And in case it hasn't been posted before, there's a link to pictures of the GDC DX12 slides below (albeit out of order):
http://imgur.com/a/qnPph#22
 
The possibilities seem to be:

1. AMD's VLIW GPUs and Intel's Ivy Bridge are hardware compatible with DX12 after all, but AMD and Intel for whatever reason don't want to write DX12 drivers for them.

2. Just as AMD helped drive tessellation into DX11 and tiled resources into DX11.2, nVidia may have successfully lobbied this round to define DX12 in a way that supports Fermi.

3. nVidia's GPU architects are actually able to anticipate the direction of graphics technology years in advance of others.

I'm guessing it's more likely some combination of 1 and 2.
It's 3, because nvidia has supported bindless resources (not textures) and draw indirect since Fermi - http://developer.download.nvidia.com/opengl/tutorials/bindless_graphics.pdf - but I suppose Fermi will have limited support (some feature level, like DX11's feature level 10 for compute shaders on DX10 cards).
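
For anyone curious what their own card reports, here is a small OpenGL sketch (assumes GLFW 3 is installed; the extension names are the public ones behind the bindless and indirect-draw features being discussed) that simply prints whether the driver exposes them:

Code:
#include <GLFW/glfw3.h>
#include <cstdio>

int main()
{
    if (!glfwInit())
        return 1;
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);  // we only need a GL context, not a visible window
    GLFWwindow* win = glfwCreateWindow(64, 64, "caps", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);

    const char* exts[] = {
        "GL_NV_shader_buffer_load",           // bindless buffer access (the 2009 bindless graphics paper)
        "GL_NV_vertex_buffer_unified_memory", // bindless vertex/index fetch
        "GL_ARB_draw_indirect",               // draw parameters sourced from GPU memory
        "GL_NV_bindless_texture",             // bindless textures (a later, separate extension)
    };
    for (const char* e : exts)
        printf("%-36s %s\n", e, glfwExtensionSupported(e) ? "yes" : "no");

    glfwTerminate();
    return 0;
}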
 
Since it's going to almost certainly be a custom version of the API for the console in the first place


Wait ... how can you conclude that?!

I like to see DirectX as consisting of 2 parts: the API and the HW feature level.

Now the Xbox One has an APP, GAME (TITLE), and System (dashboard etc.) OS, but we'll ignore that for the time being...


API -

The APP side is stock standard Windows, and I'm guessing it's also the stock standard Windows flavour of the DirectX API and WinRT API... so that will hopefully get the FULL DX API...

The Game side, I'm also guessing, will get the FULL DX API, BUT probably, as you say, a custom DX... And as we already know, a Title ships with what we believe is an entire OS image, including kernel/DX etc.

HW Feature Level -

Now the "HW FEATURE LEVEL" support within Xbox One is the big question.. Is Xbox One already HW feature level Dx12 ?! I'm guessing yes!

"Microsoft confirmed that DX12 was on the roadmap for the console, but "beyond that, we have nothing more to share." - http://www.eurogamer.net/articles/digitalfoundry-2014-directx-12-revealed-coming-to-xbox-one

Assuming DX12 was in planning 4 years ago, as they said in the DirectX blog and NVidia post, then chances are the design of the API influenced the design of the Xbox One architecture...
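
To make the API vs HW-feature-level split concrete, this is how it already works on the PC side today (a rough D3D11-era sketch, since the DX12 SDK isn't public): the app codes against one API and hands the runtime a list of feature levels it can live with, and the runtime reports back the highest level the installed hardware actually supports.

Code:
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    // One API, several possible hardware feature levels, best first.
    // (An older runtime that doesn't know 11_1 may reject the list; sketch only.)
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
    };
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_10_0;
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   requested, sizeof(requested) / sizeof(requested[0]),
                                   D3D11_SDK_VERSION, &device, &got, nullptr);
    if (SUCCEEDED(hr))
    {
        // e.g. 0xb000 = feature level 11_0, 0xb100 = 11_1
        printf("Same API, hardware feature level 0x%x\n", static_cast<unsigned>(got));
        device->Release();
    }
    return 0;
}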
 
Who knows if it is 100% compatible on the hardware side. It would be nice if it is. We still don't know all the details of the API's new features.
 
Assuming DX12 was in planning 4 years ago, as they said in the DirectX blog and NVidia post, then chances are the design of the API influenced the design of the Xbox One architecture...
The "4 years" was more like chatting in the coffee table "you know, we prolly should do another API after all this DX11 stuff, too"
I'll dig for the link, but it was mentioned at GDC that the DX12 has been actually "in active development" for about a year or so.
 
The "4 years" was more like chatting in the coffee table "you know, we prolly should do another API after all this DX11 stuff, too"
I'll dig for the link, but it was mentioned at GDC that the DX12 has been actually "in active development" for about a year or so.

lol I imagine it was a little more formal than napkin notes and drawings ;)
It was in active implementation for about 1 year according to nvidia, but was being discussed/designed (napkin notes) for about 4 years.

Which lends some credibility to the argument that the X1 was really meant to be a 2014 release. Sony beat them to the punch and they rushed a product out well ahead of schedule. The focus on games became a priority, and all developers were pushed to use DX11 until the DX12 work was complete.

What I'd really like to know is how they roadmap the features for DirectX. What is the long-term vision for where gaming graphics should be? Do they look at Maya/3DS and professional graphics tools and say, yeah, we need those features for our games, or do they look elsewhere for inspiration on what should be next? It is curious that PowerVR now has a low-power hybrid RT block built on a 28nm SoC; personally it caught me completely by surprise. I've been living in a bubble and I didn't even think this was possible.

So if PowerVR and Caustic aligned and said: yes, ray tracing is the future of graphics, we can't get there yet, but this is an interim solution, let's take this route and see what happens. I wonder what MS and Nvidia aligned on.
 
I find it curious how you leave AMD out of that completely, despite the fact that they're behind the XB1 too and have had just as heavy an involvement with DX development in the past as the rest of the big players.

Worth noting is that it's Qualcomm who was the 4th "big player" on the announcement, rather than PowerVR, for example.
 
I find it curious how you leave AMD out of that completely, despite the fact that they're behind the XB1 too and have had just as heavy an involvement with DX development in the past as the rest of the big players.

Worth noting is that it's Qualcomm who was the 4th "big player" on the announcement, rather than PowerVR, for example.

I didn't do it on purpose; the nvidia blog mentioned their involvement with MS on DX12 for 4 years and an implementation period of over a year as of today, so I just stayed on that thought. The reality is likely that this group joined together to discuss these items, either directly or indirectly. But I could imagine four 1-on-1 discussions being different from four groups actively working together.
 
AMD said at the event that "DX12 has had the least tension between the companies involved in the history of DX", which says quite directly that they've been heavily involved, too.
 