Xbox One: DirectX 11.1+ AMD GPU, PS4: DirectX 11.2+ AMD GPU. What's the difference?

I just found this Wordpress entry about the difference between Tiled Resources Tier-1 & Tier-2.


DX11.2 Tiled Resources

September 6, 2013 — MJP

Tiled resources seems to be the big-ticket item for the upcoming DX11.2 update. While the online documentation has some information about the new functions added to the API, there’s currently no information about the two tiers of tiled resource functionality being offered. Fortunately there is a sample app available that provides some clues. After poking around a bit last night, these were the differences that I noticed:

  • TIER2 supports MIN and MAX texture sampling modes that return the min or max of 4 neighboring texels. In the sample they use this when sampling a residency texture that tells the shader the highest-resolution mip level that can be used when sampling a particular tile. For TIER1 they emulate it with a Gather.
  • TIER1 doesn’t support sampling from unmapped tiles, so you have to either avoid it in your shader or map all unloaded tiles to dummy tile data (the sample does the latter)
  • TIER1 doesn’t support packed mips for texture arrays. From what I can gather, packed mips refers to packing multiple mips into a single tile.
  • TIER2 supports a new version of Texture2D.Sample that lets you clamp the mip level to a certain value. They use this to force the shader to sample from lower-resolution mip levels if the higher-resolution mip isn’t currently resident in memory. For TIER1 they emulate this by computing what mip level would normally be used, comparing it with the mip level available in memory, and then falling back to SampleLevel if the mip level needs to be clamped. There’s also another overload for Sample that returns a status variable that you can pass to a new “CheckAccessFullyMapped” intrinsic that tells you if the sample operation would access unmapped tiles. The docs don’t say that these functions are restricted to TIER2, but I would assume that to be the case.
  • Based on this information it appears that TIER1 offers all of the core functionality, while TIER2 has a few extras bringing it up to par with AMD’s sparse texture extension.
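The Gather-based TIER1 workaround in the first bullet is easy to model outside HLSL. A minimal C sketch of what "Gather the 4 neighboring residency texels, then take the min yourself" amounts to; the tiny residency map and all function names here are illustrative, not taken from the MS sample:

```c
/* TIER1 workaround sketch: where TIER2 hardware offers a MIN sampling mode,
 * the TIER1 path Gathers the 4 texels a bilinear fetch would touch and
 * takes the minimum in the shader. residency[y][x] holds the finest
 * resident mip level for that tile (illustrative encoding). */
static int min4(int a, int b, int c, int d)
{
    int m = a < b ? a : b;
    int n = c < d ? c : d;
    return m < n ? m : n;
}

static int sample_residency_min(const int residency[4][4], int x, int y)
{
    /* The 2x2 neighborhood a Gather at (x, y) would return. */
    return min4(residency[y][x],     residency[y][x + 1],
                residency[y + 1][x], residency[y + 1][x + 1]);
}
```

On TIER2 the same answer would come from a single sample with the MIN filter mode; the Gather path just spends a few extra shader instructions to get there.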
 
I just found this Wordpress entry about the difference between Tiled Resources Tier-1 & Tier-2.


DX11.2 Tiled Resources

Uhm, those exact words were already posted in this thread by MJP. So it's nothing new to those who read this thread. :LOL:

I know I'm a few days late to the party, but I spent a half hour or so trying to figure out what the difference is between the TIER1 and TIER2 feature levels exposed by DX11.2 for tiled resources. Unfortunately there's no documentation yet for the enumerations or the corresponding feature levels (or at least none that I could find), so all I have to go off is the sample code provided by MS. These are the major differences illustrated by the code:

  • TIER2 supports MIN and MAX texture sampling modes that return the min or max of 4 neighboring texels. In the sample they use this when sampling a residency texture that tells the shader the highest-resolution mip level that can be used when sampling a particular tile. For TIER1 they emulate it with a Gather.
  • TIER1 doesn't support sampling from unmapped tiles, so you have to either avoid it in your shader or map all unloaded tiles to dummy tile data (the sample does the latter)
  • TIER1 doesn't support packed mips for texture arrays
  • TIER2 supports a new version of Texture2D.Sample that lets you clamp the mip level to a certain value. They use this to force the shader to sample from lower-resolution mip levels if the higher-resolution mip isn't currently resident in memory. For TIER1 they emulate this by computing what mip level would normally be used, comparing it with the mip level available in memory, and then falling back to SampleLevel if the mip level needs to be clamped. There's also another overload for Sample that returns a status variable that you can pass to a new "CheckAccessFullyMapped" intrinsic that tells you if the sample operation would access unmapped tiles. The docs don't say that these functions are restricted to TIER2, but I would assume that to be the case.

Aside from those things, all of the core hardware functionality appears to be available with TIER1.
 
Uhm, those exact words were already posted in this thread by MJP. So it's nothing new to those who read this thread. :LOL:

LOL, I had a feeling it might have been someone from B3D, or someone who was lurking here, because we seem to be the only people talking about this.
 
OK, so is that MJP's WordPress entry? If not, who plagiarized whom, especially as both writings include a personal "I poked around and this is what I found..."?
 
OK, so is that MJP's WordPress entry? If not, who plagiarized whom, especially as both writings include a personal "I poked around and this is what I found..."?
It's MJP's blog.

Edit:
The name of the site is My Name Is MJP.
And the about page says so. :)
 
Based on this information it appears that TIER1 offers all of the core functionality, while TIER2 has a few extras bringing it up to par with AMD’s sparse texture extension.

Doesn't this suggest that any card that supports AMD's sparse texture extension supports TIER2?
 
It's MJP's blog.

Edit:
The name of the site is My Name Is MJP.
And the about page says so. :)

Lol, you'd think I would have caught that. Pulling it up on my phone when the URL flashed by, I thought it was something like mynameisjimp and I didn't notice the similarity! Good to know there isn't yet another leech out there though. :)
 
I just found this Wordpress entry about the difference between Tiled Resources Tier-1 & Tier-2.

DX11.2 Tiled Resources

That blog must be written by a true genius of graphics programming. :p

Doesn't this suggest that any card that supports AMD's sparse texture extension supports TIER2?

To be totally clear, TIER2 actually seems to support some functionality that's not mentioned in AMD's sparse texture extension: namely the MIN/MAX texture filter modes, and the LOD clamp when sampling the texture. I couldn't say for sure if that functionality is present in all AMD GPUs with PRT support, or only a subset of them. It's possible that MS lumped those newer features together with the other functionality provided by PRT in order to avoid having 3 tiers, which would leave a bunch of AMD GPUs stuck on TIER1 with no access to the enhanced shader functionality.
 
The term was just made up (unless it's real lol) to illustrate it. Classifying the GPU as DX11+ simply means that they know it can and/or will support features which will be present in the next DX revision/specification.
11+ doesn't necessarily mean future extensions. It's far more likely to mean, as is the case with all GPUs, that the GPU supports features not exposed by the relatively high-level DX API, which devs can exploit through low-level programming or different APIs. The plus doesn't mean 'next DirectX' but 'and a little other stuff besides'.
 
11+ doesn't necessarily mean future extensions. It's far more likely to mean, as is the case with all GPUs, that the GPU supports features not exposed by the relatively high-level DX API, which devs can exploit through low-level programming or different APIs. The plus doesn't mean 'next DirectX' but 'and a little other stuff besides'.

It sometimes also refers to a particular GPU having features that were originally planned for the base DX level, but were cut as the specifications were finalized due to one IHV or the other not being able to implement those features in time. Of course, that inevitably leads to the x.1 levels, but it rarely carries over into an x.2 level. Hence the tiered structure in 11.2, with only one current PC GPU supporting Tier 2.

Regards,
SB
 
To be totally clear, TIER2 actually seems to support some functionality that's not mentioned in AMD's sparse texture extension: namely the MIN/MAX texture filter modes, and the LOD clamp when sampling the texture. I couldn't say for sure if that functionality is present in all AMD GPUs with PRT support, or only a subset of them.
LOD clamp is available in all GCN GPUs, so that can't be the distinctive feature. And if I read the documentation right, there is a 2-bit field in the sampler description (basically a set of scalar registers) where one can set up a sampler not to use lerps but a min or max filter for sampling the residency map. That would mean it is also present in all GCN GPUs. The TFE (texture fail enable) bit in sampling instructions allows a sample instruction to execute on an unmapped memory location without crashing the GPU; in that case it returns an error code (which can be handled in the shader).
Or in other words, it would be hard for me to say what feature of Tier2 is not already supported by Tahiti (or the documentation is not completely correct regarding this).

Edit:
One thing someone else could help with would be the packed mip levels for texture arrays.
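The TFE behaviour described above maps fairly directly onto HLSL's status-returning Sample overload plus CheckAccessFullyMapped. A toy C model of that control flow; the struct, the flat tile arrays, and all names are invented for illustration:

```c
#include <stdbool.h>

/* Toy model of a sample that may touch an unmapped tile: instead of
 * faulting, it reports failure (like GCN's TFE bit, or checking the
 * status returned by the HLSL Sample overload with
 * CheckAccessFullyMapped) so the caller can fall back to a coarser mip
 * or a default value. */
typedef struct {
    float value;
    bool  fully_mapped;
} SampleResult;

static SampleResult sample_with_status(const bool *mapped,
                                       const float *tiles, int tile)
{
    SampleResult r = { 0.0f, false };
    if (mapped[tile]) {
        r.value = tiles[tile];
        r.fully_mapped = true;
    }
    return r;
}
```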
 
So... tier 1 = nV, pre-GCN/dx11 GPUs? :p
Not according to the hints of Dave. He appeared to imply that not all GCN cards are Tier2, only some. But I have no idea what the distinction actually is. And I doubt a bit that pre-GCN GPUs support Tier1, but would like to see evidence to the contrary.
 
I don't see why you would need a move engine for Tier 2. Do you mind elaborating on your thoughts?

Sorry, a bit of poor wording on my part: I meant the tile/untile bits of the "Move" engines, and how that could possibly be an advantage for Tier 2 hardware. If so, would that lend evidence that the Xbox is most likely Tier 2? Etc.
 
Not according to the hints of Dave. He appeared to imply that not all GCN cards are Tier2, only some. But I have no idea what the distinction actually is. And I doubt a bit that pre-GCN GPUs support Tier1, but would like to see evidence to the contrary.
I think Dave seemed to imply that one of the consoles' GPUs is Tier 2 while the other is Tier 1, to be precise, and he also hinted that this will show in the way games are going to be developed on both. Considering he didn't outright say what he meant, I may well be wrong in my understanding of his words, and I am also not familiar with the technology. That's what I got from his words, which are now made into a mystery, tbh.
 
I think Dave seemed to imply that one of the consoles' GPUs is Tier 2 while the other is Tier 1, to be precise, and he also hinted that this will show in the way games are going to be developed on both. Considering he didn't outright say what he meant, I may well be wrong in my understanding of his words, and I am also not familiar with the technology. That's what I got from his words, which are now made into a mystery, tbh.

Actually he was only referring to desktop GPUs when he was throwing out vague hints. Basically that all GCN cards support Tier 1, but only a subset support Tier 2. It was strongly hinted that Tahiti and all GCN 1.0 cards don't support Tier 2. Which only left Bonaire as supporting Tier 2.

Regards,
SB
 
Actually he was only referring to desktop GPUs when he was throwing out vague hints. Basically that all GCN cards support Tier 1, but only a subset support Tier 2. It was strongly hinted that Tahiti and all GCN 1.0 cards don't support Tier 2. Which only left Bonaire as supporting Tier 2.

Regards,
SB
Ah, okay... I find it especially curious that he mentioned it here in the Console forum, and they say Bonaire reminds people of the XB1's GPU, not to mention that both pieces of hardware were shown at almost the same time. It's quite suspicious. Sure, there are more capable GPUs than Bonaire, and they aren't antique hardware or museum pieces either, but the HD 7790's Bonaire seems to be very modern, appropriate and forward-looking hardware.
 
Well, I think it was mainly something he wanted to somewhat clarify about AMD's desktop GPUs, since it wasn't coming up anywhere else.

It may or may not have any relevance to the consoles. I don't know if we'll ever find out for sure whether one or both consoles have Tier 2 hardware features.

Regards,
SB
 