They are both DX 11.2+ capable.
No difference in GPU abilities aside from the PS4 being 50% more powerful.
AMD stated that GCN is capable of DX11.2 and that it would be enabled via a Windows driver update.
I'm pretty sure Sony can update its Unix-based drivers and tools to feature parity if needed.
Um, you must have missed an upclock, as it is now 39% more, or the One being 29% less, depending on how you want to look at it.
Or, more importantly, "powerful" encompasses a lot more than compute unit count. There may be other areas where the PS4 has an advantage, and still other areas where the XB1 has an advantage. Even if all other things were equal, 50% more CUs wouldn't mean 50% (now 40%) more powerful in many or most typical real-world uses. But other things aren't even equal, so making such a sweeping simplification simply isn't warranted.
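For what it's worth, the "more vs. less" percentages being thrown around can be sanity-checked from the publicly reported shader specs (PS4: 18 GCN CUs at 800 MHz; XB1: 12 CUs upclocked to 853 MHz). A quick sketch; the exact percentage shifts a point or two depending on which clock figures you plug in:

```python
# Rough FLOPS comparison from publicly reported GPU specs.
# GCN: 64 shader ALUs per compute unit, 2 ops/cycle (fused multiply-add).
def gflops(cus, mhz):
    return cus * 64 * 2 * mhz / 1000.0

ps4 = gflops(18, 800)   # ~1843 GFLOPS
xb1 = gflops(12, 853)   # ~1310 GFLOPS (post-upclock)

more = (ps4 / xb1 - 1) * 100   # PS4 advantage as "X% more"
less = (1 - xb1 / ps4) * 100   # same gap expressed as "X% less"
print(f"PS4 is {more:.0f}% more, XB1 is {less:.0f}% less")
```

Note the asymmetry: the same gap reads as roughly 41% more one way and only 29% less the other, which is why both framings show up in the same argument.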
Everyone abhors car analogies, so here's an extreme one: you somehow shove a V12 engine into a 1970s estate wagon that weighs 3 tons and race it against a V8 Ferrari. What could you say about more power? Are all other things equal? And say, for the engines at least, they were: 800 bhp vs 600 bhp. What could you say about real-world performance?
And no, I'm not implying the XB1 is a Ferrari or the PS4 is a station wagon, just making what should already be an obvious point about comparing on a single spec. The PS4 could turn out to have a GPU greater than 40% more powerful. Or less.
They are both DX 11.2+ capable.
Yeah I seen where it was reported that the AMD HD7000 cards was getting a driver update to make them DX11.2 capable but I was wondering if it was a difference in the DX11.1 & DX11.2 hardware.

X1 GPU specs have nothing to do with the HD7000. X1 was initially reported as 11.1+ because 11.2 hadn't been announced back then. It's 11.2+, just like the PS4's GPU. As for the feature set exposed in 11.2 that wasn't there in 11.1: mostly fast presents (cutting present latency) and tiled resources (resources only partially resident in GPU memory, useful for streaming in general or megatextures specifically).
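To make the tiled-resources point concrete, here's a language-neutral toy model (plain Python, deliberately not the actual D3D11.2 API) of what "partially resident" means: the texture reserves a huge virtual tile grid up front, but physical memory is committed only for tiles actually touched, which is the whole trick behind streaming and megatextures.

```python
# Toy model of a partially resident ("tiled") texture: a virtual tile grid
# where physical pages are committed on demand. Names are illustrative only.
class TiledTexture:
    def __init__(self, tiles_w, tiles_h):
        self.tiles_w, self.tiles_h = tiles_w, tiles_h
        self.resident = {}  # (tx, ty) -> committed physical page

    def touch(self, tx, ty):
        # Commit physical memory only when a tile is first accessed,
        # loosely analogous to binding a tile-pool page to a region of
        # the resource via UpdateTileMappings in D3D11.2.
        if (tx, ty) not in self.resident:
            self.resident[(tx, ty)] = bytearray(64 * 1024)  # 64 KiB tile
        return self.resident[(tx, ty)]

tex = TiledTexture(256, 256)        # 256x256-tile virtual address range
tex.touch(0, 0); tex.touch(10, 7)   # sample only two tiles
committed_kib = len(tex.resident) * 64
print(committed_kib)                # 128 KiB committed for a 4 GiB virtual texture
```

The 64 KiB figure matches the D3D tiled-resource tile size; everything else here is a simplification for illustration.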
Aside from a 10 gigaflop audio-chip, there is not a single known specification in which the XBone exceeds the PS4. Is this correct? And no, "transistor count" or "# of kinects included in the BOM" don't count as specifications.

You're obviously new here, so here's a hint: don't try to be snarky if your arguments are about numbers and not deeper knowledge of the hardware, especially if your generalizations are incorrect from the get-go: there is one "X1 specification" that exceeds the PS4's - embedded RAM. Perhaps others, perhaps not. But all this is irrelevant; hardware is not a set of ticks in a box and values that you compare 1:1. There are significant architectural wins in both the PS4 and X1 designs that will be beneficial in certain scenarios. It's always like that.
By that logic a GeForce 2MX is DX9 capable; I mean, I did install compatible DX9 drivers when they came out.

Except that 11.2 features compared to 11.1 have little to do with HW and a lot to do with how you access it/what you expose.
By that logic a GeForce 2MX is DX9 capable; I mean, I did install compatible DX9 drivers when they came out.
...there is one "X1 specification" that exceeds PS4 - embedded RAM....
Look, don't be a jerk. It won't get you anywhere on these forums except onto the list of users who have no posting rights.
As explained, the hardware is 11.2+. There is no driver-level gimmick, as you seem to hope to imply.
Aside from a 10 gigaflop audio-chip, there is not a single known specification in which the XBone exceeds the PS4. Is this correct? And no, "transistor count" or "# of kinects included in the BOM" don't count as specifications.
The esram is merely a patch, nothing more. And most certainly not an advantage, sorry.

It is an obvious advantage in certain scenarios; anyone who's programmed graphics understands that. Most renderers these days are deferred in some way, and they will benefit from ESRAM. And since it's behind the MMU, there isn't much one has to do to use it: you map it where you need it in the GPU's virtual address space and reap the benefits of faster g-buffer access.
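As a rough illustration of why a deferred renderer cares (a back-of-the-envelope sketch only; actual render-target layouts vary per engine), a typical 1080p g-buffer of four 32-bit targets plus a 32-bit depth buffer already overshoots the XB1's 32 MB of ESRAM, which is exactly why placement of the g-buffer matters:

```python
# Back-of-the-envelope g-buffer footprint at 1080p.
# Layout (4 colour targets + depth) is a common example, not a fixed rule.
W, H = 1920, 1080
BPP = 4  # bytes per pixel for one 32-bit render target (e.g. RGBA8)

gbuffer_mib = W * H * BPP * 4 / 2**20   # four colour targets
depth_mib   = W * H * 4 / 2**20         # 32-bit depth/stencil
total = gbuffer_mib + depth_mib
print(f"{total:.1f} MiB vs 32 MiB of ESRAM")
```

With these assumptions it comes out to roughly 39.6 MiB against 32 MiB of ESRAM, so developers have to juggle which targets live in ESRAM and which spill to main memory.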
How is DirectX 11.2 a PR specification exactly?

Ok, sorry for my mistake; to be honest I didn't check the 11.2+ specification, I merely anticipated it to be in line with other MS PR specifications.
...Anyone who's programmed graphics understands that. ...
How is DirectX 11.2 a PR specification exactly?
Anyone who's programmed graphics understands that
In this case, if you are to be believed: 11.2+ is not a PR specification, while the rest of the specifications, like "40 times the power in the cloud!", "let's add up all the individual bandwidths and present them as actual bandwidth!" and "whoops, turns out we invented memory that is capable of simultaneously reading and writing at the same time!", are PR specifications; they won't explain (read very carefully) come November 2013 why Watch Dogs runs at a much lower resolution and without bokeh. (hint: bookmark this post ;-) )