DirectX 12: Its future in the console gaming space (specifically the XB1)

SW/shader fallback is fine, but I prefer HW (fixed function or whatnot) delivering my features ... I'll wait for FULL graphics HW to provide me the FULL DX12 experience.

I don't think fallback is applicable here.
DX 10 cards can run DX 11 software with fallback, and that doesn't mean they can enjoy "all the power" of DirectX 11. In fact, they cannot enjoy DX 11 at all!

If DX 11.2 cards could only run DX 12 with fallbacks, then they would not have "the full power of DirectX 12"; they would just have the full power of an optimized, low-level DX 11.2.
 
LOL. You do know that FAQs aren't legally binding contracts. It's not like "full" can't be a mistake, especially when we have MS themselves stating that some DX12 features will require a new generation of GPUs.

We had an AMD official saying no DirectX 12 and we had MS sending emails out saying DirectX is "no longer evolving as a technology". Yet here we are.

Time will eventually enlighten us as to what "full" compatibility means.
 
DX 11.2 will work on DX 11.1 hardware too. The difference is that not all features are available! But can I say a card without those features supports DX 11.2?

Take Kepler, for instance: it has no Tier 2 support. Can I claim this card supports DX 11.2? I think not!
NVIDIA surely thinks so. Before DX11.2 they said "DirectX 11.1 API (Feature Level 11_0)", now they say "DirectX 11.2 API (Feature Level 11_0)", and with DX12 they'll be saying "DirectX 12.0 API (Feature Level 11_0)".
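
For anyone who wants to see this on their own machine, here's a minimal sketch (my own illustration, not code from any post in this thread) of how the D3D11 runtime separates the API version from the feature level: you create a device through whatever runtime the OS has installed (11.0, 11.1, 11.2), and the call reports back which feature level the GPU actually supports. On Kepler that would come back as D3D_FEATURE_LEVEL_11_0 no matter which API version is installed, which is exactly what NVIDIA's spec sheets are describing.

```cpp
// Minimal sketch: ask for the highest feature level the hardware can give us
// and print what we actually got. The installed runtime/API version
// (11.0/11.1/11.2) is a separate thing from the feature level reported here.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_1,   // a pre-11.1 runtime may reject this entry outright
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
    };

    ID3D11Device*        device   = nullptr;
    ID3D11DeviceContext* context  = nullptr;
    D3D_FEATURE_LEVEL    achieved = D3D_FEATURE_LEVEL_10_0;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                          // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION,
        &device, &achieved, &context);

    if (SUCCEEDED(hr))
    {
        // e.g. 0xb100 for feature level 11_1, 0xb000 for 11_0
        printf("Achieved feature level: 0x%04x\n", (unsigned)achieved);
        context->Release();
        device->Release();
    }
    return 0;
}
```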

Question is: To use DX 11 you need DX 11 hardware support, and to use DX 11.2 you need DX 11.2 hardware support. Yet Microsoft claims 80% of all currently sold cards will support DX 12.
What is "DX 11.2 hardware support"? It has 2 tiers of Tiled Resources, one of which works, to my understanding, on all DX11 cards, and 2nd tier which works only on GCN Radeons (which are D3D Feature Level 11_1 to begin with)
There's no new features in DX 11.2 to my knowledge, which would require "hardware support", which is why there is no Feature Level 11_2 either.

From the DirectX developers blog:

Q: Should I wait to buy a new PC or GPU?
A: No – if you buy a PC with supported graphics hardware (over 80% of gamer PCs currently being sold), you’ll be able to enjoy all the power of DirectX 12 games as soon as they are available.
http://blogs.msdn.com/b/directx/archive/2014/03/20/directx-12.aspx

But feel free to correct me.
Power, not features ;)
And it's marketing talk anyway.
 
What is "DX 11.2 hardware support"? It has 2 tiers of Tiled Resources, one of which works, to my understanding, on all DX11 cards, and 2nd tier which works only on GCN Radeons (which are D3D Feature Level 11_1 to begin with)
There's no new features in DX 11.2 to my knowledge, which would require "hardware support", which is why there is no Feature Level 11_2 either.


GCN 1.0 (280X, 270X, 250X, for example) is Tier 1; GCN 1.1 (260X, 290X, ...) is Tier 2.

Pre-GCN DX11 cards (the HD 5000/6000 series) have no support as far as I know.

Also, DX12 was only announced as GCN-compatible on the AMD side, so no 2009-2011 cards are supported.
 
What is "DX 11.2 hardware support"? It has 2 tiers of Tiled Resources, one of which works, to my understanding, on all DX11 cards, and 2nd tier which works only on GCN Radeons (which are D3D Feature Level 11_1 to begin with)
There's no new features in DX 11.2 to my knowledge, which would require "hardware support", which is why there is no Feature Level 11_2 either.

You are correct! Tier2 is feature level 11_1.

But even so, there is, to my knowledge, no feature in the DX 11.1 API that requires the use of Tier 2.

That means that although DX 11.2 has no hardware associated with it, it requires the full implementation of feature level 11_1. And funny enough, not all DX 11.1 cards have it.

That is why I talked about DX 11.2 hardware. It was a figure of speech just because DX 11.1 hardware may not be enough for it.
 
I thought AMD stated that the HD 7000 series arch in general was 11.2 capable. It was just a matter of AMD releasing the necessary drivers.
 
You are correct! Tier2 is feature level 11_1.
You're confusing two separate things. Tiled resources (and associated tiers) is an optional feature in all 11.x with a separate API query; it's not a required part of any of the feature levels. Fire up the DX caps viewer if you want more details but for instance... Kepler claims feature level 11_0 support w/ tier 1 tiled resources. Haswell is feature level 11_1 but no tiled resources, etc.

This is admittedly confusing to end users, but we're definitely currently in a world where API, "feature level" and optional features (i.e. caps bits) all exist and are somewhat orthogonal (outside of the fact that you usually have to use the latest API to get access to the latest features).
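
To make the point concrete, here's a small sketch (my own illustration, assuming the DX11.2-era SDK headers; it's not code from this thread) showing that the feature level and the tiled resources tier come from two separate queries, which is why they can disagree in either direction: Kepler reporting 11_0 with Tier 1, Haswell reporting 11_1 with no tiled resources at all. This is roughly what the DX caps viewer is doing for every optional feature at once.

```cpp
// Minimal sketch (Windows 8.1 SDK headers): feature level and optional caps
// are orthogonal. GetFeatureLevel() tells you the feature level, while the
// tiled resources tier is an optional cap queried via CheckFeatureSupport.
#include <d3d11_2.h>
#include <cstdio>

void ReportCaps(ID3D11Device* device)
{
    D3D_FEATURE_LEVEL fl = device->GetFeatureLevel();
    printf("Feature level: 0x%04x\n", (unsigned)fl);

    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts1 = {};
    HRESULT hr = device->CheckFeatureSupport(
        D3D11_FEATURE_D3D11_OPTIONS1, &opts1, sizeof(opts1));

    if (SUCCEEDED(hr))
        // 0 = not supported, 1 = tier 1, 2 = tier 2
        printf("Tiled resources tier: %d\n", (int)opts1.TiledResourcesTier);
    else
        printf("Tiled resources query not available on this runtime\n");
}
```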
 
yes, all GCN cards are considered to have "full DX11.2 support"

but GCN 1.0 only support tier 1, GCN 1.1 tier 2
 
yes, all GCN cards are considered to have "full DX11.2 support"

but GCN 1.0 only support tier 1, GCN 1.1 tier 2

Andrew is saying that tiled resources is an optional feature of DX11. You can be DX11.2 feature-level compliant and not support tiled resources, if I understand his post correctly.

What that means for DX12, I don't know.
 
You can be DX11.2 feature-level compliant and not support tiled resources, if I understand his post correctly.
Correct, although somewhat more confusingly there is no "feature level 11_2"... but yes you can be a feature level 11_1 device and not support tiled resources at all, as is the case for Haswell.

What that means for DX12, I don't know.
Until more information becomes available, it just means that speculation about what "FULL DX12 SUPPORT" (from IHVs) means in terms of hardware features is pretty much useless :)
 
Correct, although somewhat more confusingly there is no "feature level 11_2"... but yes you can be a feature level 11_1 device and not support tiled resources at all, as is the case for Haswell.


Until more information becomes available, it just means that speculation about what "FULL DX12 SUPPORT" (from IHVs) means in terms of hardware features is pretty much useless :)

No DX11.2? Weird.

So, "Full DX12 support" is kind of meaningless at this point. We do know the general concept of some of the features/improvements coming in DX12. I guess with regards to Xbox One, the best we could do is figure out if Mantle supports the same "features" for GCN, and then we could speculate as to whether that makes it likely they'll support the same in DX12.
 
According to Ubisoft (see here on page 57), both PS4 and Xbox One have advanced GPU architectures, with lots of custom extensions, and capabilities not available on PCs. So I would not be surprised if these cards were up to DX 12 full specs or even above.
 
... have advanced GPU architectures, with lots of custom extensions, and capabilities not available on PCs.
That's true of any GPU viewed in isolation. They never map perfectly to a given spec.

So I would not be surprised if these cards were up to DX 12 full specs or even above.
That logic does not follow. You're assuming that there is a clear priority path towards "future graphics hardware features" and that a GPU can thus be "further along that path". That's not the case. It's a wacky, branchy land of different features, and while I do expect some additional amount of API consideration for GCN due to Xbone, there are other pieces of hardware to consider for a portable spec too... I don't see any compelling reason why their features should be excluded simply because GCN may not be capable of doing them. That sort of single-architecture design is Mantle territory, not DX.
 
Yeah, it seems a little unfortunate to me that we went from having an API with one clearly-defined feature set (DX10) to having multiple feature sets (DX10.1, DX11) to having feature sets that are dependent on GPU + OS version combined with optional caps bits (DX11.1, DX11.2). It's caused a lot of confusion, even among developers.
 
Yeah, it seems a little unfortunate to me that we went from having an API with one clearly-defined feature set (DX10) to having multiple feature sets (DX10.1, DX11) to having feature sets that are dependent on GPU + OS version combined with optional caps bits (DX11.1, DX11.2). It's caused a lot of confusion, even among developers.

Maybe (hopefully) DX12 will get back to one clearly defined feature set. It sounds like talks started a long time ago and implementation is in its first year of about three.
 
[Image: ASTC_JPEG_575px.jpg, from the AnandTech article linked below]


JPEG on the other hand is a very curious thing to mention, as its lack of existence on any API roadmaps goes along with the fact that we’re not aware of anyone having announced plans to support JPEG in hardware. Furthermore JPEG is not a fixed ratio compressor – the number of bits a given input will generate can vary – which for GPUs would typically be a bad thing. It stands to reason then that Microsoft knows a bit more about what features are in the R&D pipelines for the GPU makers, and that someone will be implementing hardware JPEG support. So we’ll have to keep an eye on this and see what pops up.
http://www.anandtech.com/show/7889/...level-graphics-programming-comes-to-directx/2

I think the use of compressed resources such as JPEG is one of the fixed functions embedded inside the Xbox One hardware.
 
Could someone speak to some of the possible advantages of going with JPEG for resources? And whether the relative trade-off of compression vs. image quality is worth it?
 
Could someone speak to some of the possible advantages of going with JPEG for resources? And whether the relative trade-off of compression vs. image quality is worth it?

Well, less memory and bandwidth consumption. It would not only save main memory bandwidth, it would also save bandwidth for the HDD, which could be an advantage while streaming content.
JPG images can offer really good quality; it is just a question of how small you want them to be.

But I don't know where the decompression takes place. Maybe the JPG will be written uncompressed directly into main memory; this would still save HDD bandwidth.
 
According to Ubisoft (see here on page 57), both PS4 and Xbox One have advanced GPU architectures, with lots of custom extensions, and capabilities not available on PCs. So I would not be surprised if these cards were up to DX 12 full specs or even above.

More specifically, I expect this means "capabilities not exposed in current PC APIs". And further, I'd say there's a good chance Mantle doesn't come into that consideration.

Since the hardware itself is pretty much identical to that of GCN 1.1-based GPUs available on the PC, it's more likely the custom APIs of the consoles are exposing more of the GCN 1.1 functionality than DX11.x does.

I'd expect Mantle exposes similar functionality to the consoles and, with luck, DX12 will as well. So in terms of currently exposed functionality, unless DX12 does require completely new hardware beyond the GCN 1.1 spec, then yeah, I guess you could say that both consoles currently offer DX12 feature-level support. But then again, so do the 260X and 290X; that functionality just isn't exposed (yet) on the PC by DX11.

That's my take anyway, happy to hear from more experienced bods if I'm wrong.
 
But I don't know where the decompression takes place. Maybe the JPG will be written uncompressed directly into main memory; this would still save HDD bandwidth.

I strongly suspect that it would be something where you have to decompress to memory first, as opposed to doing it on-the-fly during texture sampling. Block-compressed formats are specifically designed to be fast and easy to decode, while JPEG is not.
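
As a rough illustration of that "decompress to memory first" path (purely a sketch of the general idea, using the stb_image library for the CPU-side decode; nothing here is from an actual engine or from this thread), the JPEG only exists on disk and in the streaming pipeline, and what the GPU ends up sampling is an ordinary uncompressed RGBA texture:

```cpp
// Sketch of "decompress to memory first": the JPEG saves disk space and
// streaming bandwidth, but the GPU samples a plain uncompressed texture.
// Uses stb_image for the CPU-side JPEG decode; error handling kept minimal.
#define STB_IMAGE_IMPLEMENTATION
#include "stb_image.h"
#include <d3d11.h>

ID3D11Texture2D* LoadJpegAsTexture(ID3D11Device* device, const char* path)
{
    int width = 0, height = 0, channels = 0;
    // Decode the JPEG into plain RGBA8 pixels in system memory.
    unsigned char* pixels = stbi_load(path, &width, &height, &channels, 4);
    if (!pixels)
        return nullptr;

    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = (UINT)width;
    desc.Height           = (UINT)height;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_IMMUTABLE;
    desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;

    D3D11_SUBRESOURCE_DATA init = {};
    init.pSysMem     = pixels;
    init.SysMemPitch = (UINT)width * 4;   // bytes per row of the decoded image

    ID3D11Texture2D* texture = nullptr;
    device->CreateTexture2D(&desc, &init, &texture);

    // The uncompressed copy the GPU will sample now lives in the texture.
    stbi_image_free(pixels);
    return texture;
}
```

By contrast, a block-compressed (BCn) texture could be handed to CreateTexture2D as-is and stay compressed in video memory, which is the part JPEG can't do.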
 