Xbox One: DirectX 11.1+ AMD GPU, PS4: DirectX 11.2+ AMD GPU. What's the difference?

So if hardware PRT is NOT the reason Durango devkits used HD 7970s, then why? Penello definitely ruled out the presence of a dGPU.

Could it be that it was to simulate the computing power of the cloud? At the Xbox One unveiling, MS engineers said that the power of XB1 is 10X XB360 without the cloud and 40X with the cloud. 4 × 1.3 TF = 5.2 TF.
 
So if hardware PRT is NOT the reason Durango devkits used HD 7970s, then why? Penello definitely ruled out the presence of a dGPU.

Could it be that it was to simulate the computing power of the cloud? At the Xbox One unveiling, MS engineers said that the power of XB1 is 10X XB360 without the cloud and 40X with the cloud. 4 × 1.3 TF = 5.2 TF.

The cloud power only includes the CPU / HDD / RAM, not the GPU. At least that was the last I heard.
 
So if hardware PRT is NOT the reason Durango devkits used HD 7970s, then why?
I don't know about the reason, but the two tiers are both hardware implementations of PRT; tier 2 just has additional hardware support.
Also, I believe that only the Radeon HD 7790 has tier 2 level support.
Yes, that may sound strange, as it is a mid-level card, but that's the way it is.

Maybe they needed the additional power to emulate the XB1 regardless of the tier level, and DX11.2 wasn't exposed in the API at the time either.
 
So if hardware PRT is NOT the reason Durango devkits used HD 7970s, then why? Penello definitely ruled out the presence of a dGPU.

Could it be that it was to simulate the computing power of the cloud? At the Xbox One unveiling, MS engineers said that the power of XB1 is 10X XB360 without the cloud and 40X with the cloud. 4 × 1.3 TF = 5.2 TF.

Maybe because they needed to emulate the audio chip, the Kinect SoC and all the other specialized hardware that's in the Xbox One.
 
So if hardware PRT is NOT the reason Durango devkits used HD 7970s, then why? Penello definitely ruled out the presence of a dGPU.
It was pointed out before that the earliest development occurred before Durango physically existed.
The earliest GCN silicon would have been the first Tahiti chips.

What good would it do to code for VLIW GPUs and then toss that work once GCN became available?
 
Could it be that it was to simulate the computing power of the cloud? At the Xbox One unveiling, MS engineers said that the power of XB1 is 10X XB360 without the cloud and 40X with the cloud. 4 × 1.3 TF = 5.2 TF.

No. The "cloud multiple" only applies to the console's CPU, RAM, and to some degree, disk storage. Nothing was claimed about additional GPU resources lurking in the cloud. Microsoft's cloud infrastructure contains no such resources.

If an XB1 has 100 GF of CPU, 8 GB of RAM, and 500 GB of disk storage, there may be a cloud allocation of as much as 300 GF of CPU resources, 24 GB of RAM, and lots of disk space.

(In reality, these are all virtual, shared resources, so there aren't that many resources dedicated to each console. CPU resources will be used in very short bursts, or allocated to slow but long running batch processes, then released back into the pool. RAM and disk storage can be accessed by many consoles simultaneously.)
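Purely to illustrate the arithmetic with the made-up figures above (none of this is an official spec, just the hypothetical 3:1 cloud allocation from this post), the totals work out like this:

```cpp
#include <cstdio>

// Illustrative figures only: local console resources plus a hypothetical
// 3x cloud allocation, as described in the post above.
int main() {
    const double local_cpu_gf = 100.0;              // local CPU, GFLOPS
    const double cloud_cpu_gf = 3.0 * local_cpu_gf; // claimed cloud allocation
    const double local_ram_gb = 8.0;
    const double cloud_ram_gb = 3.0 * local_ram_gb;

    // Total CPU budget comes to 4x the local figure, which lines up with the
    // "10X without the cloud, 40X with the cloud" marketing multiple.
    std::printf("CPU: %.0f GF local + %.0f GF cloud = %.0f GF total (%.0fx)\n",
                local_cpu_gf, cloud_cpu_gf, local_cpu_gf + cloud_cpu_gf,
                (local_cpu_gf + cloud_cpu_gf) / local_cpu_gf);
    std::printf("RAM: %.0f GB local + %.0f GB cloud = %.0f GB total\n",
                local_ram_gb, cloud_ram_gb, local_ram_gb + cloud_ram_gb);
    return 0;
}
```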
 
I've been out of the loop for a while on these consoles, but I'm confused as to why people keep referring to the PS4 as using the DirectX 11 API. Sony's PS4 hardware uses two new APIs, one of which is called GNM; I don't recall the name of the second, but I'm quite sure both are derived from OpenGL/CL/ES.

Ubisoft said:
A more crucial issue is that, while the PS4 toolchain is designed to be familiar to those working on PC, the new Sony hardware doesn't use the DirectX API, so Sony has supplied two of their own.

http://www.eurogamer.net/articles/digitalfoundry-how-the-crew-was-ported-to-playstation-4


That being the case, it clearly pokes a bunch of holes in the claim by Charlie that the PS4 uses DX11.2 and is a "newer" tech chip than what Microsoft is using.
 
I've been out of the loop for a while on these consoles, but I'm confused as to why people keep referring to the PS4 as using the DirectX 11 API. Sony's PS4 hardware uses two new APIs, one of which is called GNM; I don't recall the name of the second, but I'm quite sure both are derived from OpenGL/CL/ES.

http://www.eurogamer.net/articles/digitalfoundry-how-the-crew-was-ported-to-playstation-4

That being the case, it clearly pokes a bunch of holes in the claim by Charlie that the PS4 uses DX11.2 and is a "newer" tech chip than what Microsoft is using.

Not the DirectX API, just DirectX functionality at the hardware level.
 
I've been out of the loop for a while on these consoles, but I'm confused as to why people keep referring to the PS4 as using the DirectX 11 API?
It doesn't use the API, but the hardware is designed to implement the features of DX11.2. That means it has various technologies, like partially resident textures and hull shaders, that aren't present in older GPUs, which carry hardware designations for earlier DirectX levels. Putting it another way, put the Liverpool SoC in a laptop and it'll be capable of running DX11.2 games with full hardware acceleration. Same for XB1 - the DX designation only describes the hardware features in relation to a very popular API. Put the XB1 SoC in a Linux laptop running OpenGL and it'll still be a DX11.2 part, even if it's never running DirectX code.
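You can see the same distinction on the PC side: a game asks the Direct3D runtime what feature level the installed hardware supports, and that answer describes the silicon's capabilities, not which API the chip was "designed for". A minimal sketch using the standard D3D11 device-creation call (error handling trimmed; on an older runtime without 11_1 support you'd simply retry with a shorter list):

```cpp
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    // Ask for the highest feature levels first; the runtime hands back the
    // best level the GPU/driver combination actually supports.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1
    };
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;

    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION, &device, &got, nullptr);

    if (SUCCEEDED(hr)) {
        // D3D_FEATURE_LEVEL_11_1 prints as 0xb100, 11_0 as 0xb000, etc.
        std::printf("Hardware feature level: 0x%x\n", (unsigned)got);
        device->Release();
    }
    return 0;
}
```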
 
To put it another way, the basis of both consoles' GPUs is GCN. What level of GCN doesn't matter, as all levels of GCN are DX11.2 compliant.

So, whether or not they use the DirectX API, the hardware itself is capable of DX11.2.

Basically, you couldn't build a PS4/Xbox One without it being DX11.2 once you chose GCN as your basic building block for the GPU.

Once you did that, you just have to make sure you have a graphics API that can expose all the features developers would require.

Regards,
SB
 
I believe that Sony created a DX translator to help with the porting process.
So in that sense, maybe you could say it supports DirectX..... :LOL:
Although you still have to go in and tune things, and replace some parts that aren't covered.

But yeah, you can say that a GPU is at the feature level of a DX version without it ever having DX drivers or using DX at all. Just look at smartphone GPUs, for example.
 
What level of GCN doesn't matter, as all levels of GCN are DX11.2 compliant.

That isn't what I took away from this thread. All GCN architectures will support 11.2 but only a subset implement all features natively in hardware. May be a trivial performance difference, may not, but it seems there IS a difference.
 
That isn't what I took away from this thread. All GCN architectures will support 11.2 but only a subset implement all features natively in hardware. May be a trivial performance difference, may not, but it seems there IS a difference.
There is a difference, but that doesn't mean they aren't all DX11.2, and they can be categorised as such, just at different tiers.
The best way to look at it is DX11.2 Tier 1 and DX11.2 Tier 2; both are still _DX11.2_ regardless of the differences.
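For reference, this is roughly how the split shows up on the PC side: the D3D11.2 runtime reports a tiled-resources tier (tiled resources being the Direct3D name for PRT) as an optional cap you query per device, rather than as a separate feature level. A rough sketch against the Windows 8.1 SDK (error handling mostly omitted):

```cpp
#include <d3d11_2.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        nullptr, 0, D3D11_SDK_VERSION, &device, nullptr, nullptr);
    if (FAILED(hr)) return 1;

    // Tiled resources are an optional cap with tiers, not a feature level -
    // exactly the Tier 1 / Tier 2 split discussed above.
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts = {};
    hr = device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1,
                                     &opts, sizeof(opts));
    if (SUCCEEDED(hr)) {
        switch (opts.TiledResourcesTier) {
        case D3D11_TILED_RESOURCES_NOT_SUPPORTED:
            std::printf("Tiled resources: not supported\n"); break;
        case D3D11_TILED_RESOURCES_TIER_1:
            std::printf("Tiled resources: Tier 1\n"); break;
        case D3D11_TILED_RESOURCES_TIER_2:
            std::printf("Tiled resources: Tier 2\n"); break;
        default:
            std::printf("Tiled resources: tier %d\n",
                        (int)opts.TiledResourcesTier);
        }
    }
    device->Release();
    return 0;
}
```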
 
Semantics. That still implies hardware-level differences: some parts implement the full set or full extent of all 11.2 features, while others have to use a lesser implementation for some features, whether that means a specific hardware acceleration of one bit is missing - perhaps a specific hardware memory shortcut - or even goes as far as an emulation mode of some function, whether with more generalized shader compute or even in software. Call it Tier 1 vs Tier 2, or 11.2 compliant/supported vs full hardware 11.2; there is a hardware difference. I think the information teased out earlier in this thread makes that clear.

Thus I didn't think it was very clear to just use a blanket statement that all GCN is 11.2 at the hardware level, when that appears to be a broader and more ambiguous term than one would think.
 
Saying PS4 supports DX11 is the same as saying the Xbox 1 supports shader model 4.
Just because the hardware is compliant doesn't mean that the games are programmed as such.

Sony just meant that they can go beyond the spec / the currently known DX features, as I am pretty sure that AMD had access to the DX roadmap and planned features. Example:
AMD knows that DX 11.3 will support 'global stencil inversion', which requires certain hardware functions which are present in PS4 hardware, so they stated it as DX11+ compatible.
 
AMD knows that DX 11.3 will support 'global stencil inversion', which requires certain hardware functions which are present in PS4 hardware...
How do you know that PS4 supports hardware features necessary to implement 'global stencil inversion'?
 
How do you know that PS4 supports hardware features necessary to implement 'global stencil inversion'?

The term was just made up (unless it's real, lol) to illustrate the point. That they classified the GPU as DX11+ simply means that they know it can and/or will support features which will be present in the next DX revision/specification.
 
The term was just made up (unless it's real, lol) to illustrate the point. That they classified the GPU as DX11+ simply means that they know it can and/or will support features which will be present in the next DX revision/specification.
I don't know how you can say that unless you know something about the next DX spec.

It can and probably does just mean that it conforms to the DX11 spec, and does stuff that isn't in the spec, and has nothing to do with the next DX revision directly.

Over the years, GPUs have always done stuff that wasn't in the spec.
 