Xbox One: DirectX 11.1+ AMD GPU, PS4: DirectX 11.2+ AMD GPU. What's the difference?

Anand has a beautiful article about what AMD formerly called "Sea Islands" aka Bonaire aka GCN1.1: http://www.anandtech.com/show/6837/...feat-sapphire-the-first-desktop-sea-islands/2
Yeah, I read that, and it made me wonder whether, because GCN 1.0 and 1.1 aren't making use of any features added beyond DX11.1 and are kept in the current range, people have become _neglectful_ of the differences.
Not sure how correct this is, but some of the differences:
http://forums.anandtech.com/showpost.php?p=35400496&postcount=30
PRT is the same on all GCN versions. The differences are in the other features.
GCN1.0: S.I. implementation, basic features
GCN1.1: C.I. implementation, multi queue compute, system unified addressing, device unified addressing, support for unaligned memory accesses, new debug features
GCN2.0: V.I. implementation, ... what I know: basic features for system integration, new compute instructions, dynamic parallelism

May I ask what makes you think the XBONE is Bonaire whilst the PS4 is Pitcairn, and exactly what the differences between the two are?
Some of the differences are outlined in the AnandTech link eVolvE provided.

AFAIR it was assumed/proven that:
- XB1 is Cape Verde, with some suggestions earlier this year that it was Bonaire.
- PS4 is Pitcairn.

Also http://www.eurogamer.net/articles/digitalfoundry-can-xbox-one-multi-platform-games-compete-with-ps4
On the face of it, Microsoft's GPU sounds remarkably similar to AMD's Bonaire design, as found in the Radeon HD 7790, while Sony's choice is uncannily like the Pitcairn product, the closest equivalent to the PS4's graphics chip being the desktop Radeon HD 7870, or closer still in terms of clock-speed, the laptop Radeon HD 7970M.

Obviously this is all just fun and games trying to guess what Dave is hinting at... :LOL:
 
Yeah, the PS4's additional compute queues would actually imply the reverse. In fact almost everything we know about the PS4 specific enhancements suggest it's further along the GCN development path than Xbox One. I think the only reason anyone is assuming the reverse in this case is that a lot of people pinned their hopes on tiled resources as some kind of savior for the Xbox One's ESRAM limitations.
 
These are my thoughts too. Everything we have seen so far seems to suggest they started with the same base, but Microsoft left more of it stock standard and added shared hardware page tables, whilst Sony changed the number of queues and improved the caching system.
 
Looking at the video bbot posted, I would guess that PRT (tier-2) hardware support is a big deal on the Xbox One. Microsoft used a 16 MB (PRT texture) demo and a 16 MB (PRT shadow) demo, as far as I can tell, with really big 16k textures in the textures demo.
So in effect, with a robust PRT API, the eSRAM could be used as a very fast PRT/tile memory, and a large part of the DDR3 memory could be used as a virtual memory space for textures, all supported by the GPU hardware with a low performance hit from anisotropic filtering.
Again, a layman's guess, and I may be completely off the mark, but certainly a more interesting use of the 32 MB eSRAM cache than 'just' a general scratchpad. So, in essence, tier-1 or tier-2 support may not be a trivial distinction. Any thoughts on this?
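As a back-of-envelope check on that idea, here's a minimal sketch of the sizing. It assumes the 64 KB tile size AMD's PRT material describes; the 32 MB eSRAM figure and the 16k texture are taken from the posts above:

```cpp
#include <cstdio>

int main() {
    const long long esram    = 32LL * 1024 * 1024;   // 32 MB eSRAM as tile pool
    const long long tile     = 64LL * 1024;          // 64 KB per PRT tile
    const long long texRGBA8 = 16384LL * 16384 * 4;  // 16k x 16k, 4 bytes/texel

    printf("Tiles resident in eSRAM : %lld\n", esram / tile);      // 512
    printf("Full 16k RGBA8 texture  : %lld MB\n", texRGBA8 >> 20); // 1024 MB
    printf("Resident fraction       : %.2f%%\n",
           100.0 * esram / texRGBA8);                              // ~3.1%
    return 0;
}
```

So the eSRAM-as-tile-pool idea only has to keep a few percent of the full virtual texture resident at any one time, which is the whole point of PRT.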
 
That was all NVidia hardware which does not have PRT. NVidia solely has "unlimited" surface count support (D3D 11.0 limits this to 128 surfaces bound to a shader), which means the driver can simulate tiles by naming each tile as a distinct surface, hiding that fact from the developer.
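For contrast, here's a purely conceptual sketch of that surface-count trick (not NVidia's actual driver code; all names are hypothetical): the runtime stands up one small physical surface per resident tile and remaps tile coordinates to it in software.

```cpp
#include <cstdint>
#include <vector>

struct Surface { /* one small physical texture standing in for one tile */ };

struct EmulatedTiledTexture {
    uint32_t tilesPerRow;               // virtual width, measured in tiles
    std::vector<Surface*> tileSurface;  // one surface per tile slot,
                                        // nullptr where nothing is resident
    // Translate a virtual tile coordinate to the surface that backs it.
    Surface* resolve(uint32_t tileX, uint32_t tileY) const {
        return tileSurface[tileY * tilesPerRow + tileX];
    }
};
```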
 
Having seen the video again, that may be true, but it didn't seem like the point of the presentation. They did talk about hardware-based PRT at a high level, and the immediate benefits. I would guess PRT tier-2 support will certainly revolve around those benefits, even if I can't seem to find papers that define the differences.

edit: Had to rephrase post, sorry.
 
So, basically real world results with PRT capable hardware would be even better, then? I think the demo was just there to show the principles of PRT, which seems viable on the Xbox One. Or am I missing something?
PRT is a superset. The questions at large are (see the API sketch after this list):

  • define the superset of PRT with respect to tiled resources
  • identify whether there is more than one version of PRT in AMD hardware
  • identify the subset of PRT at tier 1
  • identify the subset of PRT at tier 2
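For concreteness, this is roughly what "tiled resources" look like from the application side in the D3D11.2 API in the Win8.1 SDK. A minimal sketch, with device creation and error handling omitted:

```cpp
#include <d3d11_2.h>

void MapOneTile(ID3D11Device2* dev, ID3D11DeviceContext2* ctx)
{
    // 1. A tile pool: a buffer of 64 KB tiles that physical pages come from.
    D3D11_BUFFER_DESC poolDesc = {};
    poolDesc.ByteWidth = 16 * 1024 * 1024;          // e.g. a 16 MB pool
    poolDesc.Usage     = D3D11_USAGE_DEFAULT;
    poolDesc.MiscFlags = D3D11_RESOURCE_MISC_TILE_POOL;
    ID3D11Buffer* pool = nullptr;
    dev->CreateBuffer(&poolDesc, nullptr, &pool);

    // 2. A tiled texture: huge virtual size, no physical memory behind it yet.
    D3D11_TEXTURE2D_DESC texDesc = {};
    texDesc.Width = texDesc.Height = 16384;         // 16k x 16k virtual
    texDesc.MipLevels = 1;
    texDesc.ArraySize = 1;
    texDesc.Format    = DXGI_FORMAT_BC1_UNORM;
    texDesc.SampleDesc.Count = 1;
    texDesc.Usage     = D3D11_USAGE_DEFAULT;
    texDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    texDesc.MiscFlags = D3D11_RESOURCE_MISC_TILED;
    ID3D11Texture2D* tex = nullptr;
    dev->CreateTexture2D(&texDesc, nullptr, &tex);

    // 3. Map one virtual tile of the texture to one tile of the pool.
    D3D11_TILED_RESOURCE_COORDINATE coord = {};     // tile (0,0), mip 0
    D3D11_TILE_REGION_SIZE region = {};
    region.NumTiles = 1;
    UINT rangeFlags = 0, poolOffset = 0, rangeTiles = 1;
    ctx->UpdateTileMappings(tex, 1, &coord, &region,
                            pool, 1, &rangeFlags, &poolOffset, &rangeTiles, 0);
}
```

The mapping API itself looks the same at either tier; presumably the tier distinction the list above asks about lives in what the hardware guarantees once a shader actually touches those tiles.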
 
Yeah, the PS4's additional compute queues would actually imply the reverse. In fact almost everything we know about the PS4 specific enhancements suggest it's further along the GCN development path than Xbox One. I think the only reason anyone is assuming the reverse in this case is that a lot of people pinned their hopes on tiled resources as some kind of savior for the Xbox One's ESRAM limitations.

After looking at some 7790 reviews, it seems likely that Bonaire=XB1. (Bonaire seems focused on improved efficiency, power consumption and memory architecture - all XB1 keywords).

The HD8xxx series seems to contain the compute-focused changes from the PS4, along with the XB1 improvements.

Although quite how that ties back to the OP is beyond me.
 
How are those "XB1 keywords" any more than they are "PS4 keywords"? Bonaire's memory architecture is completely unlike the Xbox One's. We have heard no technical details to suggest its efficiency or power consumption is enhanced in any particular way compared to vanilla GCN or the PS4. All we really know for sure is that it definitely lacks the added compute queues which the PS4 definitely has.
 
Looking at the VGleaks GPU reveal for Durango, they say the following:

All GPU memory accesses on Durango use virtual addresses, and therefore pass through a translation table before being resolved to physical addresses. This layer of indirection solves the problem of resource memory fragmentation in hardware—a single resource can now occupy several noncontiguous pages of physical memory without penalty.

Virtual addresses can target pages in main RAM or ESRAM, or can be unmapped. Shader reads and writes to unmapped pages return well-defined results, including optional error codes, rather than crashing the GPU. This facility is important for support of tiled resources, which are only partially resident in physical memory.

http://www.vgleaks.com/durango-gpu-2/2/

If you compare that to the AMD slide set on Partially Resident Textures on the HD7970, it seems to be describing the same functionality.

The GPU virtual memory page table translates tiles into a resident texture tile pool

How does the application know which texture tiles to upload?
Answer: PRT-specific texture fetch instructions in a shader
– Return a "Failed" texel fetch condition when sampling a PRT pixel whose tile is currently not in the pool

https://www.google.ca/url?sa=t&rct=...=lIFJjKAAiltOsjBzWHCKdA&bvm=bv.51495398,d.b2I

So if anyone can figure out what the difference is between tier1 and tier2 support, then you'll at least have an idea of where Xbox One fits.
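Putting the two quotes together, here is a toy model in C++ of the mechanism both describe (all types are hypothetical, just to make it concrete): a per-resource page table translates virtual pages to physical pages in main RAM or ESRAM, and a fetch from an unmapped page returns a well-defined failure instead of faulting.

```cpp
#include <cstdint>
#include <optional>
#include <unordered_map>

enum class Pool : uint8_t { MainRAM, ESRAM };

struct PhysicalPage { Pool pool; uint32_t pageIndex; };

struct PageTable {
    std::unordered_map<uint32_t, PhysicalPage> map;  // virtual page -> physical

    // Mirrors a PRT texture fetch: nullopt plays the role of the "Failed"
    // texel-fetch condition in the AMD slides; on it, the app streams
    // the missing tile in rather than the GPU crashing.
    std::optional<PhysicalPage> translate(uint32_t virtualPage) const {
        auto it = map.find(virtualPage);
        if (it == map.end()) return std::nullopt;    // unmapped: well-defined
        return it->second;
    }
};
```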
 
I guess currently there isn't enough information publicly available to us in order to fully grasp the differences between the two console solutions.

Although DX11.2 is currently available in the Win8.1 preview, I guess the missing piece at this point is the corresponding AMD drivers that enable PRT support. These are supposed to come sometime around the official Win8.1 release. Until then, no one can actually test which AMD cards support which PRT tier.
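Once those drivers land, the test itself is straightforward; a minimal sketch against the Win8.1 preview SDK (device creation omitted):

```cpp
#include <d3d11_2.h>
#include <cstdio>

void PrintTiledResourcesTier(ID3D11Device* dev)
{
    // Query the D3D11.2 options block; TiledResourcesTier reports what
    // the driver actually exposes on this GPU.
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts = {};
    if (SUCCEEDED(dev->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1,
                                           &opts, sizeof(opts)))) {
        switch (opts.TiledResourcesTier) {
        case D3D11_TILED_RESOURCES_NOT_SUPPORTED: puts("no tiled resources"); break;
        case D3D11_TILED_RESOURCES_TIER_1:        puts("tier 1");             break;
        case D3D11_TILED_RESOURCES_TIER_2:        puts("tier 2");             break;
        default:                                  puts("unknown tier");       break;
        }
    }
}
```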

I have just looked up a GDC presentation by AMD regarding PRT that talks about the 7970, the first GCN-based card, and how it enables the usage of PRT. To reiterate what we already know, all GCN-based cards support some form of PRT. GCN1.0 cards probably support only tier1, while most likely the Bonaire/GCN1.1-based (and upwards) solutions support tier2, but what the consoles actually support is anyone's guess at the moment.

I guess Dave can't tell us more than this at the moment. From what he told us, it seems that there is a difference between the GPUs in the consoles regarding their GCN iteration, where one is probably GCN1.0-based and the other GCN1.1- or perhaps even GCN2.0-based. Taking other things into consideration, like the higher number of compute queues in the PS4, it's plausible that the PS4 is the one to support tier2.
 
I'm just curious to know where the "tier" description is coming from, because I can't find reference to it anywhere. Surely if people know there are two tiers of support, someone knows what those two tiers actually mean. I haven't been able to find it.

Maybe a good place to start is to find out if there were any changes to AMD's implementation of PRTs on GCN before or after the HD7970.
 
May I ask what makes you think the XBONE is Bonaire whilst the PS4 is Pitcairn, and exactly what the differences between the two are?

Since at least February of this year there have been claims of the XB1 GPU being a Bonaire-class chip supporting GCN1.1. You can go to virtually any tech site and find articles supporting this with the relevant specs and comparisons.

That includes AnandTech and even the Wikipedia Radeon 7000 series pages. Looking at the known base architectures, the PS4's chip appears to be a modified Pitcairn somewhere between a Pro and an XT, and the XB1's GPU appears to be a modified Bonaire XT.

This is also in line with Dave's statements that Tahiti was simply the first chip within its class to support 11.1 hardware features. As MS owns the DX11 specification, and we know that DX11.2 will be exposed this fall after a driver update, it makes sense that the XB1 chipset is also a tier-2 implementation of PRT and the other extensions of 11.2.
 
So what you're saying is that it has no factual basis and is just a wild guess. How does it make sense for the PS4 to support fewer features when it supports more advanced features that the XBONE doesn't?

The PS4 slide says it supports DX11.2+. That should be a hint that it's pretty much the same basis; to suggest otherwise doesn't really seem to have any factual basis, just a whole lot of 'because I said so'. The only reason most of the websites went with Pitcairn and Bonaire for the PS4/XBONE seems to be purely the number of CUs in each design, but that should be relatively easy to scale up and down and wouldn't really be much of a reason to base an entire GPU identification on.
 
In the context of the PS4's GPU, what does DX11 mean? They use OpenGL. Does it mean the DX11.2 specifications? What does that mean? DX11.2 is only exposed through drivers. Apparently there is a distinction, however considerable or minimal, between tiers. Who said anything about supporting fewer features? Both GPUs are fairly bespoke, with the Durango GPU being quite a bit more extensively modified. The XB1 GPU fits the profile of a discrete Bonaire, not a Pitcairn.

MS owns the DX11 specification and roadmap. Do you not think it's possible for them to incorporate much of that in their own hardware? You seem really caught up in the versus thing, almost to the point of being offended that the XB1 could possibly be based on a slightly newer architecture than the PS4. At least that's the basic tone of your post.
 
DX11 means the DirectX 11 feature set, which they obviously support. Yes, Durango is clearly more modified, but it seems to me that the modifications for Durango are based more around the eSRAM and the 'Move Engines' rather than the GPU core, which is what Sony seemed to concentrate on boosting.

I am offended when people like to try and trot out evidence when they have none, on either side.

I see no reason to believe that Durango is Bonaire and the PS4 is Pitcairn any more than there are reasons to believe the reverse, or that both are either.

No one has yet brought credible evidence of either being Bonaire or Pitcairn, and until they do, this back and forth is a bit useless.
 