Technical Comparison: Sony PS4 and Microsoft Xbox One

So, leaving aside any clock argument, it will be difficult to speculate on framerate, maybe impossible, as there are too many factors we don't know about.
We don't know the main parts, let alone the bottlenecks in the systems and other important factors.

This reminds me of the X360/PS3 question: the PS3 was said to be 2x the X360, remember?
Reality said quite the contrary; most multiplatform games looked better on the X360.

This is because a lot of factors are involved. Are those facts or my conclusions, Brad?

Any comparison to last generation is completely invalid due to the major differences not only in the architectures but in the focuses of those architectures.

We know the clock rate; it's pretty obvious it's the same as the PS4's. The APUs are so large that they would be pushing it just at the 800 MHz / 1.6 GHz rates.

Also, don't forget that the cloud specs cover only CPU and storage (so add roughly 300 GFLOPS on top of the Xbox One), and the range of things it can be leveraged for is much narrower than what local resources can be used for.
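For reference, the raw shader-throughput figures thrown around in comparisons like this are usually derived as CUs x 64 lanes x 2 ops per FMA x clock. A minimal sketch of that arithmetic, assuming the widely reported 12 vs 18 CU counts and an 800 MHz GPU clock for both machines (the clock being exactly the disputed part):

# Rough peak-FP32 estimate: CUs * 64 lanes * 2 ops per FMA * clock (GHz).
# CU counts and clocks are the commonly reported figures, not confirmed specs.
def peak_gflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz

xbox_one = peak_gflops(12, 0.8)       # ~1229 GFLOPS at the rumoured 800 MHz
ps4      = peak_gflops(18, 0.8)       # ~1843 GFLOPS at the same clock
print(xbox_one, ps4, ps4 / xbox_one)  # ratio of 1.5, i.e. the oft-quoted 50% CU advantage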
 

I'm interested to see evidence of the actual clock for the X1 GPU. (And why do you call it XBONE instead of XOne, XB1 or X1? It sounds really bad to me; it's the kind of term used by certain people on NeoGAF.)
Can you provide a link, please?
 

We know it by intuition: it's clocked high for an APU (GPU-wise) and it's a giant APU. AMD doesn't sell an APU with anywhere near that many CUs, nor that GPU clock rate. There also appears to be a rather large spike in TDP and power usage at higher clocks. These are the reasons why a clock bump of any kind is unlikely.

A clock bump to 1 GHz+ is impossible; the TDP increase alone would probably cause major headaches.
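To put a rough number on that: dynamic power scales roughly with frequency times voltage squared, and pushing frequency higher usually means raising voltage too. A hedged back-of-the-envelope sketch (the 100 W baseline and the voltage scaling are illustrative assumptions, not Durango figures):

# Illustrative only: dynamic power ~ f * V^2, and V typically has to rise with f.
BASE_CLOCK_GHZ = 0.8
BASE_POWER_W   = 100.0   # assumed GPU-side power budget, not an official figure
VOLT_PER_GHZ   = 0.25    # assumed extra volts needed per extra GHz (illustrative)

def est_power(clock_ghz, base_v=1.0):
    v = base_v + VOLT_PER_GHZ * (clock_ghz - BASE_CLOCK_GHZ)
    return BASE_POWER_W * (clock_ghz / BASE_CLOCK_GHZ) * (v / base_v) ** 2

print(est_power(0.8))   # 100 W baseline
print(est_power(1.0))   # ~138 W: a 25% clock bump costs well over 25% more power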

If you want hard evidence for most of this, you won't get it. Microsoft will likely never release the information; it's in their best interest not to make themselves directly comparable to the PS4. Even with all these fixed-function units and the eSRAM, at first glance it still looks really bad.

But if you want to continue to believe that Microsoft threw out all of this design at the last minute and went with a new one, go for it.
 
So we don't know, and you're speculating about it. OK.

People said the same thing about dual-APU ray-tracing chips with CU arrays. Look how that turned out; most of the outlandish rumours turn out not to be true for a reason.

The problems I listed are serious issues that Microsoft would face with an up-clock, and you have yet to provide a solution for any of them, apart from this random hand-waving dismissal.

Also, don't compare HD 6xxx cards. Compare HD 7xxx cards.

Specifically, the HD 7770 and the HD 7790 seem to fit pretty closely with the performance difference between the two consoles.

Although you are not likely to see as large a difference in FPS on the PC versions, because the workloads have not been tailored to those cards.
 
So, using a GPU that is 100%+ faster, the gain is 4 FPS at 1024x768 (5%) and 24 FPS at 1280x1024 with AA on (20%).

So in the worst-case scenario, do you really think you'll see 30 FPS vs 45 FPS, or 30 FPS vs 32-36 FPS?

This sort of comparison is completely meaningless. It is a fair assumption that in both examples the game is primarily CPU-bound, so GPU performance will show rapidly diminishing returns (less so as the GPU workload increases, as your example clearly shows!). Why on earth compare GPU performance at 1024x768? A comparison at a higher resolution, etc., would almost certainly have shown a difference much closer to the theoretical performance gap (assuming the GPU becomes the bottleneck).
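The diminishing-returns point is easy to see with a toy frame-time model: when CPU and GPU work overlap, frame time is roughly the larger of the two costs, so doubling GPU throughput barely moves the FPS while the CPU dominates. A minimal sketch with made-up millisecond figures (not measurements of any real game):

# Toy model: frame time = max(cpu_ms, gpu_ms) when CPU and GPU work overlap.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# Low resolution, CPU-bound: a 2x faster GPU changes almost nothing.
print(fps(cpu_ms=25.0, gpu_ms=20.0), fps(cpu_ms=25.0, gpu_ms=10.0))   # 40.0 vs 40.0

# High resolution / heavy AA, GPU-bound: the same 2x GPU nearly doubles the FPS.
print(fps(cpu_ms=25.0, gpu_ms=50.0), fps(cpu_ms=25.0, gpu_ms=25.0))   # 20.0 vs 40.0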

Using this as a comparison with the next-gen consoles is flawed. Third parties will be presented with a system that has a 50% CU advantage, a 100% ROP advantage and a non-trivial bandwidth advantage compared to its competitor. In my opinion an argument can be made that this will be the difference between 720p and 1080p at the same frame rate; by looking at the distribution of workload in both cases you can see how the advantages fit surprisingly well.
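The 720p-vs-1080p suggestion is mostly pixel-count arithmetic: 1080p pushes 2.25x as many pixels as 720p, which sits between the ~1.5x ALU and ~2x ROP/bandwidth advantages quoted above, depending on which one is the actual bottleneck. A quick check of the numbers:

# Pixel-count ratio between the two common output resolutions.
pixels_1080p = 1920 * 1080           # 2,073,600
pixels_720p  = 1280 * 720            #   921,600
print(pixels_1080p / pixels_720p)    # 2.25

cu_advantage  = 1.5                  # the 50% CU advantage quoted above
rop_advantage = 2.0                  # the 100% ROP advantage quoted above
# Whether 2.25x more pixels is reachable depends on which of these limits the frame.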

But as most have said, chances are the majority of next-gen games will go for a dynamic 1080p output, in which case how the advantages manifest themselves will likely differ depending on the studio's priorities and whether they scale down or scale up (as already mentioned).
 
That's quite an inaccurate statement; I don't get how you arrive at it.
Intel just put AMD's APUs to shame; there is no way AMD can catch up without relying on expensive (and more power-hungry) GDDR5 solutions. Actually, at iso-power Intel leaves everybody behind... by a mile; when it comes to compute performance, the level they provide is stellar.

How is that relevant to Durango? I'm not sure it is relevant at all; those are significantly different architectures.


I think it's not completely fair.

Intel is showing how integrating RAM puts their lackluster GPU on par with mobile parts.

I do think that Durango is the way to go for AMD for integrated APUs; even Intel thinks the sweet spot is 32 MB.

I know that Intel is using eDRAM, which is much denser than 6T SRAM, but it's not even on the same die.
 
For people wanting a comparison of SRAM vs eDRAM performance, I think the conclusion is that, for the same area, the larger amount of eDRAM offsets the low latency of SRAM in terms of performance. So for the same amount of silicon, it should be better to choose eDRAM.
www.hpl.hp.com/techreports/2000/HPL-2000-53.pdf

I think eDRAM is less flexible, can only be fabbed in certain places, and can't be integrated onto the same die as the logic.

All of these were presumably why MS went with eSRAM.
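A rough way to see the area trade-off being described: a 6T SRAM cell is several times larger than a 1T1C eDRAM cell, so the same silicon area buys several times the eDRAM capacity. A sketch under an assumed density ratio (the 3.5x figure is a commonly quoted ballpark, not a number for Durango's process):

# Illustrative density comparison; the 3.5x ratio is an assumed ballpark.
ESRAM_MB          = 32
EDRAM_DENSITY_ADV = 3.5   # assumed: eDRAM bits per unit area vs 6T SRAM

edram_mb_same_area = ESRAM_MB * EDRAM_DENSITY_ADV
print(edram_mb_same_area)   # ~112 MB of eDRAM in roughly the same area
# The trade-offs are higher access latency, extra process steps, fewer fabs,
# and (as noted above) the difficulty of putting it on the same die as the logic.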
 
And that's the point. It's a feature that might actually make some people buy a PSVita to go along with their PS4. Can't hurt. Synergy is no one-way street.

See ShadowRunner's earlier post.

If I got a PS4, I would pick up the Vita if they released a second revision or update to make it a bit more powerful (a Vita 2.0). As it stands now, I have no real interest in the original Vita.
 
I think it's not completely fair.

Intel is showing how integrating RAM puts their lackluster GPU on par with mobile parts.

I do think that Durango is the way to go for AMD for integrated APUs; even Intel thinks the sweet spot is 32 MB.

I know that Intel is using eDRAM, which is much denser than 6T SRAM, but it's not even on the same die.
Intel's integrated GPU does not qualify as lackluster, IMHO; even previous architectures were pretty stellar at compute, and they do not do badly at all if you consider things ceteris paribus: how much power they burn and what silicon footprint they take.

As for the 32 MB: while it happens to match the amount of scratchpad Durango embarks, the two are not linked in any way. Intel's engineers considered the hit rate for the L4 (it is a cache), whereas MSFT's engineers could not really afford more eSRAM. The eSRAM in Durango is not a cache that manages itself; you would benefit from more room, since the software (be it the system or the devs) decides what to put in it.

The two things are different in many ways; bandwidth comes to mind too. Haswell has 50 GB/s read and 50 GB/s write to the eDRAM, whereas it seems that Durango has 100+ GB/s read or write.

Intel's implementation is the more interesting in that it is not a scratchpad but a cache accessible by both the CPU and GPU; it may be useful for parts that don't even include a GPU (for some server workloads, I guess).
 
All MS and Sony can demand is equality/parity within the possibilities of the hardware.
The PS4's strengths can't be overlooked simply because "MS wills it", and vice versa.

True.

This topic was already discussed better in the technical versus thread, but we know there's already at least one dev who thinks we could see differences in frame rate.

Discussing the differences between PS4 and Xbox One with Eurogamer, Stewart Gilray, who is working on a secret PS4 game at Just Add Water, said:

"We might see slightly smoother framerates on PS4. We're working with Sony right now, and they're trying to actively push 60 frames per second, 1080p.

"You might get situations where the graphics will be a little, but not much, lower quality on the Xbox One," he continued. "Or, you might get some fixed at 30 frames per second situations in 1080p. It depends on the scale of the game.

Given the state of MS's dev tools at this point and the hurried nature of working on launch titles, I think it's safe to say most will lead on the Xbox One for now to make development easier on themselves. If there are any PS4 optimizations, I think it'll be the option of leaving the framerate unlocked.

I'm guessing that in a third scenario, where the PS4 game is 1080p/30 FPS, the developer isn't going to have much of a choice when porting to the Xbox One: they will have to lower the effects/resolution to reach 30 FPS.

How do you think they should handle the ports in these situations?

This would be the best situation, but I'm not sure it will happen until the PS4 steamrolls the Xbox One. For now it'll be Xbox One-led with some PS4 optimizations, but I still think that a little down the line we might see more devs willing to explore the extra power, be it a full 60 FPS version or a version with more texture variety, extra compute effects, etc.
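For what it's worth, one common way a dynamic-resolution target works is a per-frame feedback loop: if the GPU missed its frame-time budget, drop the render scale a notch; if there's comfortable headroom, claw it back. A hypothetical sketch of that idea (the budget, bounds and gain are made-up parameters, not any engine's actual values):

# Hypothetical dynamic-resolution controller: keep GPU time under the frame budget
# (33.3 ms for 30 FPS, 16.6 ms for 60 FPS) by nudging the render scale.
FRAME_BUDGET_MS = 33.3
MIN_SCALE, MAX_SCALE = 0.6, 1.0      # 0.6 of 1080p height is roughly 648p

def next_render_scale(scale, last_gpu_ms, gain=0.05):
    if last_gpu_ms > FRAME_BUDGET_MS:
        scale -= gain                # over budget: drop resolution a notch
    elif last_gpu_ms < 0.9 * FRAME_BUDGET_MS:
        scale += gain                # comfortable headroom: raise it again
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for gpu_ms in (36.0, 35.0, 31.0, 28.0, 29.0):   # fake per-frame GPU timings
    scale = next_render_scale(scale, gpu_ms)
    print(round(scale, 2))                       # 0.95, 0.9, 0.9, 0.95, 1.0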
 
If I got a PS4, I would pick up the Vita if they released a second revision or update to make it a bit more powerful (a Vita 2.0). As it stands now, I have no real interest in the original Vita.

Why would a more powerful Vita interest you when the current one doesn't?
 

Better-quality titles. Maybe it would bring more versatile applications too. Perhaps a 7" edition?

Right now I don't do mobile gaming as I don't ride public transportation, but I do use my Nexus 7 tablet a lot. So I'm likely looking to replace one with the other.

Oh, and the Vita would be of interest if I got a PS4. By itself, the Vita as it stands doesn't interest me. But with a PS4 system and an upgraded Vita, the possibilities seem interesting.
 
And that's the point. It's a feature that might actually make some people buy a PSVita to go along with their PS4. Can't hurt. Synergy is no one-way street.

See ShadowRunner's earlier post.

There's also little stopping Sony from just streaming a PS4 game to a Vita directly from a GaiKai server. You wouldn't even need to personally own a PS4.
 
Yes, what they do nowadays is throw more ALUs at it, increase the clocks, and swizzle textures so their memory format is better for streaming. And then they just take any latency hits they have to, since that's the cost of doing business. It'll only get worse as GPGPU workloads become more common.

Thanks for the reply :D

"swizzle texture" made a great google search and coughed up:
http://fgiesen.wordpress.com/2011/01/17/texture-tiling-and-swizzling/

Answers a few questions even though I really only get half of what is being said.
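The core trick in that article is storing texels in a tiled or Morton ("Z-order") layout instead of plain row-major order, so texels that are close in 2D end up close in memory. A minimal sketch of the Morton addressing idea (a simplified illustration, not any GPU's exact tiling format):

# Interleave the bits of x and y to get a Morton (Z-order) index: neighbouring
# texels in 2D land near each other in memory, which is friendlier to caches
# and DRAM bursts than plain row-major addressing.
def part1by1(n):
    n &= 0x0000FFFF
    n = (n | (n << 8)) & 0x00FF00FF
    n = (n | (n << 4)) & 0x0F0F0F0F
    n = (n | (n << 2)) & 0x33333333
    n = (n | (n << 1)) & 0x55555555
    return n

def morton_index(x, y):
    return part1by1(x) | (part1by1(y) << 1)

# Row-major vs swizzled addresses for a few neighbouring texels,
# assuming a hypothetical 256-texel-wide image for the row-major case.
for x, y in [(0, 0), (1, 0), (0, 1), (1, 1), (2, 3)]:
    print((x, y), y * 256 + x, morton_index(x, y))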

As for GPGPU workloads, there is supposedly hardware on the PS4 to address such things:
http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?page=2

Still, more work for PS4 developers, and maybe not as much for XB1 devs, on the latency side of GPU compute usage. New solutions for new puzzles... God help me, I love a new console cycle.

I wonder if, years down the road, one or both of these devices couldn't be turned into an APU graphics-card solution? MS has already virtualized the thing, and while not virtualized, the PS4 seems like a reasonable fit. I wonder if AMD has been in talks about such things?

In any case the XB1, while maybe not the most tricked out for gaming, sure does seem to have been engineered nicely as a hybrid machine for "apping" and gaming. I wonder when they'll start talking about plans for cloud computing to buttress the App OS side of things. How can a cloud-enabled Office experience or the like be that far off?

Pardon my OT musings.
 
Better-quality titles. Maybe it would bring more versatile applications too. Perhaps a 7" edition?

Right now I don't do mobile gaming as I don't ride public transportation, but I do use my Nexus 7 tablet a lot. So I'm likely looking to replace one with the other.

Oh, and the Vita would be of interest if I got a PS4. By itself, the Vita as it stands doesn't interest me. But with a PS4 system and an upgraded Vita, the possibilities seem interesting.

That's impossible. It's a handheld gaming device; creating a new set of hardware would basically render all existing Vitas useless, and developers would need to target two hardware specifications. The Vita isn't even two years out of the gate yet.

You know this.
 
If I got a PS4, I would pick up the Vita if they released a second revision or update to make it a bit more powerful (a Vita 2.0). As it stands now, I have no real interest in the original Vita.
As far as I'm concerned, I'd rather see twice the battery life than twice the performance in the next major Vita revision... ;)

No need for better performance when you'll be able to stream PS4 games anyway.
 

Agreed, but a bigger screen would be nice. All in all, my son is quite happy with the Vita. It's used a lot more than the Wii U.
 
That's impossible. It's a handheld gaming device; creating a new set of hardware would basically render all existing Vitas useless, and developers would need to target two hardware specifications. The Vita isn't even two years out of the gate yet.

You know this.

Yes I do. And I talk with some Vita developers. And still I want more performance for a larger screen with better resolution. Oh, and I also want more battery life. I'm a demanding bastard.
 