Bingo. Exactly my thoughts.
Yeah, I read that and it made me wonder if, because GCN 1.0 and 1.1 make no use of the features added in DX11.1 and stay within the current range, it has made people _neglectful_ of the differences. Anand has a beautiful article about what AMD formerly called "Sea Islands", aka Bonaire, aka GCN 1.1: http://www.anandtech.com/show/6837/...feat-sapphire-the-first-desktop-sea-islands/2
PRT is the same on all GCN versions. The differences are in the other features.
GCN1.0: S.I. implementation, basic features
GCN1.1: C.I. implementation, multi queue compute, system unified addressing, device unified addressing, support for unaligned memory accesses, new debug features
GCN2.0: V.I. implementation, ... what I know: basic features for system integration, new compute instructions, dynamic parallelism
Some of the differences are outlined in the AnandTech link eVolvE provided.

May I ask what makes you think the XBONE is Bonaire whilst the PS4 is Pitcairn, and exactly what the difference between the two is?
AFAIR it was assumed/proven that:
- XB1 is Cape Verde, with some suggestions earlier this year that it was Bonaire.
- PS4 is pitcairn.
On the face of it, Microsoft's GPU sounds remarkably similar to AMD's Bonaire design, as found in the Radeon HD 7790, while Sony's choice is uncannily like the Pitcairn product, the closest equivalent to the PS4's graphics chip being the desktop Radeon HD 7870, or closer still in terms of clock-speed, the laptop Radeon HD 7970M.
Yeah, the PS4's additional compute queues would actually imply the reverse. In fact almost everything we know about the PS4 specific enhancements suggest it's further along the GCN development path than Xbox One. I think the only reason anyone is assuming the reverse in this case is that a lot of people pinned their hopes on tiled resources as some kind of savior for the Xbox One's ESRAM limitations.
That was all NVidia hardware, which does not have PRT. NVidia solely has "unlimited" surface count support (D3D 11.0 limits this to 128 surfaces bound to a shader), which means the driver can simulate tiles by naming each tile as a distinct surface, hiding that fact from the developer.

Looking at the video bbot posted, I would guess that PRT (tier-2) hardware support is a big deal on the Xbox One. Microsoft used a 16 MB (PRT texture) demo and a 16 MB (PRT shadow) demo, as far as I can tell, with really big 16k textures in the textures demo.
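To make the driver trick concrete, here is a minimal sketch of how tiled resources could be faked without PRT hardware: each resident tile is exposed as a separate bound surface, and an indirection table maps virtual tile coordinates to surface slots on every fetch. All names here (`bind_tile`, `sample`, the tile size) are illustrative, not any real driver's API; the point is that the emulation burns through the D3D 11.0 surface-binding limit unless the hardware offers "unlimited" (bindless) surfaces.

```python
TILE = 128          # texels per tile edge (illustrative)
MAX_SURFACES = 128  # D3D 11.0 limit on surfaces bound to a shader

bound_surfaces = []  # each resident tile pretends to be its own surface
indirection = {}     # (tile_x, tile_y) -> surface slot

def bind_tile(tile_coord, data):
    # On hardware without bindless ("unlimited") surfaces, the driver
    # runs out of slots quickly; this is the limit PRT avoids.
    if len(bound_surfaces) >= MAX_SURFACES:
        raise RuntimeError("out of surface slots on D3D 11.0 hardware")
    indirection[tile_coord] = len(bound_surfaces)
    bound_surfaces.append(data)

def sample(x, y):
    # Every fetch pays an extra indirection to find the tile's surface.
    slot = indirection[(x // TILE, y // TILE)]
    tile = bound_surfaces[slot]
    return tile[(y % TILE) * TILE + (x % TILE)]

bind_tile((0, 0), [3] * (TILE * TILE))
assert sample(10, 10) == 3
```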
Why Tahiti? It's the very first GCN chip to hit production, so it would be the first of an IP set.
PRT is a superset. The question at large is:

So, basically, real-world results with PRT-capable hardware would be even better, then? I think the demo was just there to show the principles of PRT, which seems viable on the Xbox One. Or am I missing something?
After looking at some 7790 reviews, it seems likely that Bonaire=XB1. (Bonaire seems focused on improved efficiency, power consumption and memory architecture - all XB1 keywords).
All GPU memory accesses on Durango use virtual addresses, and therefore pass through a translation table before being resolved to physical addresses. This layer of indirection solves the problem of resource memory fragmentation in hardware—a single resource can now occupy several noncontiguous pages of physical memory without penalty.
Virtual addresses can target pages in main RAM or ESRAM, or can be unmapped. Shader reads and writes to unmapped pages return well-defined results, including optional error codes, rather than crashing the GPU. This facility is important for support of tiled resources, which are only partially resident in physical memory.
The GPU virtual memory page table translates tiles into a resident texture tile pool
How does the application know which texture tiles to upload?
Answer: PRT-specific texture fetch instructions in a shader
– Return a “Failed” texel fetch condition when sampling a PRT pixel whose tile is currently not in the pool
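The slide's mechanism can be sketched in a few lines: a page table translates a virtual texel address to a tile in the resident pool, and an unmapped tile returns a well-defined default plus a "failed" code instead of crashing, which tells the app what to stream in. This is an illustrative model only; `TilePool` and its methods are made-up names, and real GCN pages are 64 KB, not a 64-texel grid.

```python
TILE = 64  # texels per tile edge (illustrative stand-in for a GCN page)

class TilePool:
    def __init__(self):
        self.page_table = {}  # (tile_x, tile_y) -> resident tile data

    def upload(self, tile_coord, data):
        # The app streams a tile into the resident pool.
        self.page_table[tile_coord] = data

    def sample(self, x, y):
        """Translate a virtual texel address through the page table.
        Returns (texel, True) if the tile is resident, or a default
        value and a 'failed' condition if it is not."""
        tile = (x // TILE, y // TILE)
        data = self.page_table.get(tile)
        if data is None:
            return 0, False  # "failed" fetch: app should upload this tile
        return data[(y % TILE) * TILE + (x % TILE)], True

pool = TilePool()
texel, ok = pool.sample(100, 100)         # tile (1, 1) not resident yet
assert not ok
pool.upload((1, 1), [7] * (TILE * TILE))  # app reacts to the failed fetch
texel, ok = pool.sample(100, 100)
assert ok and texel == 7
```

The failed-fetch path is the whole point: it is the feedback channel that tells the application which tiles to upload next.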
May I ask what makes you think the XBONE is Bonaire whilst the PS4 is Pitcairn, and exactly what the difference between the two is?
Since at least February of this year there have been claims of the XB1 GPU being a Bonaire-class chip supporting GCN 1.1. You can go to virtually any tech site and find articles supporting this with the relevant specs and comparisons.
That includes AnandTech and even the Wikipedia Radeon 7000 series pages. Looking at the known base architectures, the PS4's chip appears to be a modified Pitcairn somewhere between a Pro and an XT, and the XB1's GPU appears to be a modified Bonaire XT.
This is also in line with Dave's statements that Tahiti was simply the first chip within its class to support 11.1 hardware features. As MS owns the DX11 specification, and we know that DX11.2 will be exposed this fall after a driver update, it makes sense that the XB1 chipset is also a tier-2 implementation of PRT and the other extensions in 11.2.
http://support.amd.com/us/kbarticles/Pages/AMDCatalystWIN8-1PreviewDriver.aspx

Although DX11.2 is currently available in the Win8.1 preview, I guess the missing piece at this point is the corresponding AMD drivers that enable PRT support. These are supposed to come sometime around the official Win8.1 release. Until then, no one can actually test which AMD cards support which PRT tier.
So what you're saying is that it has no factual basis and is just a wild guess. How does it make sense for the PS4 to support fewer features when it supports more advanced features that the XBONE doesn't?
The PS4 slide says it supports DX11.2+. This should be a hint that it's pretty much the same basis; to suggest otherwise doesn't really seem to have any factual basis, just a whole lot of "because I said so". The only reason most of the websites went with Pitcairn and Bonaire for the PS4/XBONE seems to be purely the number of CUs in the design, but that should be relatively easy to scale up and down, and wouldn't really be much of a reason to base an entire GPU comparison on.
In the context of the PS4's GPU, what does DX11 mean? They use OpenGL. Does it mean the DX11.2 specifications? What does that mean? DX11.2 is only exposed through drivers. Apparently there is a distinction, however considerable or minimal, between tiers. Who said anything about supporting fewer features? Both GPUs are fairly bespoke, with the Durango GPU being quite a bit more extensively modified. The XB1 GPU fits the profile of a discrete Bonaire, not a Pitcairn.
MS owns the DX11 specification and roadmap. Do you not think it's possible for them to incorporate much of that in their own hardware? You seem really caught up in the versus thing, almost to the point of being offended at the possibility that the XB1 could be based on a slightly newer architecture than the PS4. At least that's the basic tone of your post.