Technical Comparison Sony PS4 and Microsoft Xbox

And the 7770 used GDDR5, ran at 1GHz instead of 800MHz, and let's also not forget the ~10% of GPU performance the QoS reservation grants the OS, which probably makes the two extra CUs a non-factor. So the table above will very likely look closer than some here might wish.

So the PS4's OS requirements will come out of thin air?
 
I think it doesn't make sense to go in low and leave no room to expand if matching the new Xbox's features becomes necessary. They can go smaller later if they don't need it.

Yeah, but since when was a 2GB OS reservation considered low for a games console?

Look how much the iPhone 5 can do with only 1GB, surely double that would be enough for whatever Sony wanted to do.

Having 3GB reserved seems quite unnecessary - if it is indeed true.
 
Interesting interview on next-gen consoles with a 4A Games developer!

Digital Foundry: Let's talk about next-gen console. What's your take on the general design in terms of CPU and graphics processing power?

Oles Shishkovstov: We are talking PS4, right? I am very excited about both CPU and GPU. Jaguar is a pretty well-balanced out-of-order core and there are eight of them inside. I always wanted a lot of relatively low-power cores instead of a single super-high-performance one, because it's easier to simply parallelise something than to change core algorithms or chase every cycle inside critical code segments (not that we don't do that, but very often we can avoid it).

Many beefier cores would be even better, but then we'd be left without a GPU! As regards the graphics core, it's great, simply great. It's a modern-age high-performance compute device with unified memory and multiple compute contexts. The possibilities of CPU-GPU-CPU communication are endless; we can easily expect games doing, for example, AI pathfinding/route planning executing on the GPU to become a common thing.

Digital Foundry: To what extent is the 8GB of GDDR5 in the PlayStation 4 a game-changer? What implications does that have for PC, where even the standard GTX 680 ships with just 2GB of GDDR5?

Oles Shishkovstov: RAM is really, really important for games, but all of it actually being useful depends on available CPU-side bandwidth and latency to the external storage device. I think that they put slightly more RAM than necessary for truly next-generation games this time, but considering the past history of Sony stealing significant percentage of RAM from developers for OS needs - that may be exactly the right amount!

Digital Foundry: The last few years have seen a ton of poorly optimised PC ports of console games. Is the move to x86 architecture across all formats a good or bad thing for PC gaming?

Oles Shishkovstov: In general - yes, especially for indie developers. You have to understand that x86 is much more friendly for beginners at least because of its simplified memory model. Just try to describe to somebody what the memory barrier is and where and when to put it in - usually you'll be left with the guy getting stuck in an infinite loop! Joking aside - the less time we spend on platform-specific optimisations, the more is left to innovate.

Digital Foundry: Do you think that the relatively low-power CPUs in the next-gen consoles (compared to PC, at least) will see a more concerted push to getting more out of GPU Compute?

Oles Shishkovstov: No, you just cannot compare consoles to PC directly. Consoles could do at least 2x what a comparable PC can due to the fixed platform and low-level access to hardware.

Back to the question - yes, yes and yes. There are some things which are just more efficient to do on massively parallel machines like GPUs. I think that at least initially, with launch titles, GPU compute will be underutilised, but during the console's lifetime we'll see more and more unbelievable and innovative things purely thanks to GPUs.

Digital Foundry: Early PS4 work we've seen appears to have utilised either 2x MSAA or post-process AA. Do you think your SSAA/AAA combo could be viable for next-gen console?

Oles Shishkovstov: SSAA is all about decoupling rendering resolution (and grid) from output resolution (and grid). So, yes, in some form or another it will be useful. As for any form of post-processing AA - definitely yes, it was used in the past and will be used in the future. As for MSAA - I'd rather like the GPU vendors to use that (rather significant) amount of transistors in other parts of GPUs. Anti-aliasing is the job of graphics programmers and not some magical hardware feature.

Digital Foundry: We've seen Unreal Engine 4 step back from true real-time global illumination. Is it simply too expensive, even for next-gen consoles? Can you talk us through 4A's GI solution?

Oles Shishkovstov: Actually that's not true global illumination, but more of a really advanced algorithm producing convincing results. Yes, all that voxelisation and cone tracing is very expensive, too expensive even for Titan-like hardware.

I did a lot of research on GI during our last project, but we did not ship with it. The fundamental problem is: when an artist tweaks lighting on PC (with GI) it usually looks like crap on current-gen consoles (without GI). Next-gen console will solve it, enabling us to use some kind of real-time GI, so both the PC and consoles will get it. Personally I still lean towards coarse scene voxelisation and tweaking from here, quite possibly live with some amount of light leakage.

Digital Foundry: Once it's financially viable to let go of Xbox 360 and PS3, what rendering advancements do you hope to see in next-gen gaming?

Oles Shishkovstov: It seems that we personally will jump to next-gen sooner rather than later. We are currently focused on another important aspect of our games - characters. I mean we are not only working on believable appearance/visualisation, but are also in deep research on believable motion and animation. In a nutshell that's full-time physical simulation of the entire body with generated animations based on a "style" model learned from motion capture. That's a really compute-intensive process, but one greatly suited to a GPU compute model.

That's just one example. The whole industry was held back by current-gen consoles, because they are a very important source of revenue. Now the lowest common denominator will be 10x higher, and that's incredible. We can expect some form of GI to become common, it will be rare to see a shadow without umbra/penumbra, every model will be properly tessellated and displaced, OIT will be commonplace (for games that need it badly), we will forget forever about smoke not casting shadows onto itself, etc, etc - great times really.

I am not saying that we'll solve all the problems at once and the result will be available in every game on every console, but a 10x more powerful baseline will spawn all types of research, and the resulting advancements will translate into many games, [and] not only console ones - PC graphics will get a huge improvement as a result as well.
 
Once you get into the higher levels of AA, the output bandwidth of the ROPs starts to get cut in half/quarter. They probably have a good reason for going with 32.

TBH I doubt many games will use MSAA; the general direction is post-processing like FXAA or SMAA. I think SMAA especially is pretty good and needs far less performance.

And the 7770 used GDDR5, ran at 1GHz instead of 800MHz, and let's also not forget the ~10% of GPU performance the QoS reservation grants the OS, which probably makes the two extra CUs a non-factor. So the table above will very likely look closer than some here might wish.

So the PS4's OS requirements will come out of thin air?

I agree; I would be shocked if the PS4 doesn't reserve some GPU time for the OS.

Nobody is arguing that the GPU in the PS4 isn't better, but I very much doubt you will see multiplatform games that run at 30 FPS on X1 and 60 FPS on PS4 (sebbi explained why); I think the main difference will be in resolution (pixel and texture).
Exclusive games are a different matter; I think Naughty Dog will simply "crush" everyone next gen :smile:
 
TBH I doubt many games will use MSAA; the general direction is post-processing like FXAA or SMAA. I think SMAA especially is pretty good and needs far less performance.

This is true. Do we know anything about the performance of the given ROPs on given formats? That might be the reason for picking a high number: if some formats run at 1/2 or even 1/4 speed and are common, then it would be worthwhile, no?
 
Radeon 7770 GE vs 7850 benchmarks. OK, let's compare the benchmark results with the card specifications.

ALU (GFLOP/s): 1280 vs 1761.28 (+38%)
Bandwidth (GB/s): 72 vs 153.6 (+113%)
Fillrate (GP/s): 16 vs 27.52 (+72%)
Texture sampling (GT/s): 40 vs 55.04 (+38%)

Actual performance in Anandtech benchmark (all games):
- Average: 1689/1070 = 1.58 (+58%)
- Minimum gain (Battlefield): 61.2/43.3 = 1.41 (+41%)
- Maximum gain (Batman): 62/36 = 1.72 (+72%)

Performance gain is limited by each game's/engine's bottlenecks. Battlefield seems to be mostly ALU or TEX bound (or both). Thus it only gains slightly more than the 38%. It represents a modern game engine. Older games/engines tend to be more fill rate bound (simple pixel shaders with less ALU/TEX instructions). Thus Batman's performance gain (+72%) is identical to the raw fillrate improvement from 7770 -> 7850. To support that extra fillrate and extra TEX you of course need extra bandwidth. That +113% BW improvement is enough to feed the +38% TEX and +72% fill capabilities of 7850.

Unfortunately we do not have any official ROP or TMU information about the consoles, nor any ALU (FLOP/s) information for Xbox One. The only official information we have is the bandwidth figures: 200+ GB/s for Xbox One (Xbox One reveal tech panel) and 176 GB/s for PS4 (Sony's PS4 press release). It's too early to draw any conclusions from that information, except that the memory bandwidths of those devices are quite similar. And that's good, since memory bandwidth is one of the most important things for GPU compute (future engine graphics pipelines are also mostly compute based).
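The spec-vs-benchmark comparison above reduces to a few ratios; a quick sketch, using exactly the numbers quoted in the post:

```python
# Spec deltas for the Radeon 7770 GHz Edition vs 7850, as quoted above.
specs_7770 = {"alu_gflops": 1280, "bw_gbps": 72, "fill_gpps": 16, "tex_gtps": 40}
specs_7850 = {"alu_gflops": 1761.28, "bw_gbps": 153.6, "fill_gpps": 27.52, "tex_gtps": 55.04}

for key in specs_7770:
    gain = specs_7850[key] / specs_7770[key] - 1
    print(f"{key}: +{gain:.0%}")  # alu +38%, bw +113%, fill +72%, tex +38%

# Measured gains from the quoted Anandtech figures:
avg_gain = 1689 / 1070 - 1      # +58% average over all games
bf_gain = 61.2 / 43.3 - 1       # +41% (Battlefield, ALU/TEX bound)
batman_gain = 62 / 36 - 1       # +72% (Batman, fill rate bound)
```

Batman landing exactly on the +72% raw fill rate delta is what marks it as fill-rate bound, while Battlefield sitting just above the +38% ALU/TEX delta marks it as ALU/TEX bound.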
 
Radeon 7770 GE vs 7850 benchmarks. OK, let's compare the benchmark results with the card specifications.

ALU (GFLOP/s): 1280 vs 1761.28 (+38%)
Bandwidth (GB/s): 72 vs 153.6 (+113%)
Fillrate (GP/s): 16 vs 27.52 (+72%)
Texture sampling (GT/s): 40 vs 55.04 (+38%)

Actual performance in Anandtech benchmark (all games):
- Average: 1689/1070 = 1.58 (+58%)
- Minimum gain (Battlefield): 61.2/43.3 = 1.41 (+41%)
- Maximum gain (Batman): 62/36 = 1.72 (+72%)

Performance gain is limited by each game's/engine's bottlenecks. Battlefield seems to be mostly ALU or TEX bound (or both). Thus it only gains slightly more than the 38%. It represents a modern game engine. Older games/engines tend to be more fill rate bound (simple pixel shaders with less ALU/TEX instructions). Thus Batman's performance gain (+72%) is identical to the raw fillrate improvement from 7770 -> 7850. To support that extra fillrate and extra TEX you of course need extra bandwidth. That +113% BW improvement is enough to feed the +38% TEX and +72% fill capabilities of 7850.

Unfortunately we do not have any official ROP or TMU information about the consoles, nor any ALU (FLOP/s) information for Xbox One. The only official information we have is the bandwidth figures: 200+ GB/s for Xbox One (Xbox One reveal tech panel) and 176 GB/s for PS4 (Sony's PS4 press release). It's too early to draw any conclusions from that information, except that the memory bandwidths of those devices are quite similar. And that's good, since memory bandwidth is one of the most important things for GPU compute (future engine graphics pipelines are also mostly compute based).

I'll be cheeky and do this then, making the assumption that both GPUs are clocked at 800MHz: Xbox One vs PS4, based on the best available information.

ALU (GFLOP/s): 1200 vs 1840 (+53%)
Bandwidth (GB/s): 170 vs 176 (+3%)
Fillrate (GP/s): 12.8 vs 25.6 (+100%)
Texture sampling (GT/s): 32 vs 51.2 (+60%)

That will give me a worst-case return of around 55% and a best of 100%, depending on what the game engine is limited by?
 
That will give me a worst-case return of around 55% and a best of 100%, depending on what the game engine is limited by?
Improved fill rate and texture sampling rate need extra bandwidth in addition to ROPs and TMUs. Otherwise it's going to become bandwidth limited. If you compare the Radeon 7770 -> 7850, there's +113% extra bandwidth to feed that +72% fill rate and +38% texture sampling rate.

In your chart the worst-case return is 3% (bandwidth). This occurs if the engine is already bandwidth bound.
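The point above amounts to a min-of-ratios bottleneck model: the gain an engine actually sees is the spec ratio of whichever resource it is bound on. A sketch using the (assumed, unofficial) Xbox One vs PS4 figures from the chart above:

```python
# Toy roofline-style model: a workload bound entirely by one resource
# speeds up by that resource's spec ratio, and no more.
# Figures are the unofficial Xbox One vs PS4 chart values from above.
ratios = {
    "alu": 1840 / 1200,        # +53%
    "bandwidth": 176 / 170,    # only a few percent
    "fillrate": 25.6 / 12.8,   # +100%
    "texture": 51.2 / 32,      # +60%
}

def expected_speedup(bottleneck: str) -> float:
    """Speedup if the engine is entirely bound by the named resource."""
    return ratios[bottleneck]

# A bandwidth-bound engine gains only a few percent,
# no matter the 2x fill rate advantage elsewhere:
worst = min(ratios, key=ratios.get)   # -> "bandwidth"
```

Real engines mix bottlenecks frame to frame, so the observed gain lands somewhere between the smallest and largest ratio, as the 7770 -> 7850 benchmarks showed.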
 
Improved fill rate and texture sampling rate need extra bandwidth in addition to ROPs and TMUs. Otherwise it's going to become bandwidth limited. If you compare the Radeon 7770 -> 7850, there's +113% extra bandwidth to feed that +72% fill rate and +38% texture sampling rate.

In your chart the worst-case return is 3% (bandwidth). This occurs if the engine is already bandwidth bound.

Thanks for the intelligent breakdown sebbi.
Won't the PS4 have much more general bandwidth, though, to feed those ROPs? Durango only has 32MB of ESRAM to play with; is it really a fair comparison to say the PS4 has only a 3GB/s bandwidth advantage? In most cases wouldn't the Liverpool SoC have much more wiggle room?
 
Thanks for the intelligent breakdown sebbi.
Won't the PS4 have much more general bandwidth, though, to feed those ROPs? Durango only has 32MB of ESRAM to play with; is it really a fair comparison to say the PS4 has only a 3GB/s bandwidth advantage? In most cases wouldn't the Liverpool SoC have much more wiggle room?

Go to VGLeaks; copies between ESRAM and DDR3 are not free. ;) The Xbone has more than a 6GB/s deficit in memory bandwidth...
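To make the arithmetic explicit, a rough sketch using the figures circulating in this thread and the leaked Durango docs (all numbers unconfirmed, and the copy-traffic figure is purely hypothetical):

```python
# Back-of-envelope: Durango's two memory pools vs PS4's unified GDDR5.
# All figures are leaked/unofficial.
ddr3_gbps = 68.0     # leaked DDR3 main-memory peak
esram_gbps = 102.0   # leaked ESRAM peak
ps4_gbps = 176.0     # Sony's announced GDDR5 figure

combined = ddr3_gbps + esram_gbps   # 170 GB/s, the best-case sum
deficit = ps4_gbps - combined       # 6 GB/s, before counting copies

# Moving a buffer between DDR3 and ESRAM is not free: the transfer is
# charged against BOTH pools, so real usable bandwidth drops below 170.
copy_gbps = 10.0                    # hypothetical copy traffic
usable = combined - 2 * copy_gbps   # copy read + copy write
```

That is where "more than 6 GB/s" comes from: 6 GB/s is the gap assuming the two pools can be summed perfectly, and every ESRAM<->DDR3 copy widens it.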
 
Oles Shishkovstov: RAM is really, really important for games, but all of it actually being useful depends on available CPU-side bandwidth and latency to the external storage device. I think that they put slightly more RAM than necessary for truly next-generation games this time, but considering the past history of Sony stealing significant percentage of RAM from developers for OS needs - that may be exactly the right amount!

This probably further solidifies that a 3GB RAM reservation on PS4 is likely. He obviously just can't say it because of NDA, so he talks around it like that.
 
It's not the engineers' fault - you can blame MS management for changing their strategy to focus on courting the casuals and $$$.

Still, given their transistor budget, going with ESRAM to support DDR3 kind of blew away most of their options. Cost-wise I don't know how that compares to the PS4 though. Given what we know, would the overall SoC cost more than the PS4's? Is the total BoM even that much lower than the PS4's, minus Kinect?
 
Still, given their transistor budget, going with ESRAM to support DDR3 kind of blew away most of their options. Cost-wise I don't know how that compares to the PS4 though. Given what we know, would the overall SoC cost more than the PS4's? Is the total BoM even that much lower than the PS4's, minus Kinect?
The transistor count is a bit misleading because it's not necessarily proportional to the die area. An SRAM array is extremely dense compared to everything else. I wouldn't be surprised if both SoCs end up just a bit above 300mm². So about the same cost.

Savings from using DDR3 instead of GDDR5 could be more than enough to counterbalance the additional cost of Kinect 2 versus the PS Eye. Both consoles could have a very close BOM, depending on what Sony decides to bundle.
 
This probably further solidifies that a 3GB RAM reservation on PS4 is likely. He obviously just can't say it because of NDA, so he talks around it like that.

No it doesn't; in fact it says nothing other than that he expects them to use a non-infinitesimal portion of it for the OS. 2GB, which I think is probably the likely spot for Sony to end up at, is still 25% of the overall RAM, which is not a small amount.
 
I still don't understand why any console would need such a huge amount of memory for the OS...

They need to remember what these things were built for... GAMES...

Ditch the bloated OS and give the extra RAM and processing power to the developers to improve the GAMES...
 
I still don't understand why any console would need such a huge amount of memory for the OS...

They need to remember what these things were built for... GAMES...

Ditch the bloated OS and give the extra RAM and processing power to the developers to improve the GAMES...

Uhm, how much RAM do you think games will use when they will still mostly support PC gaming, which runs primarily on 32-bit OSes where there is a 2GB limit for application space?
 
I still don't understand why any console would need such a huge amount of memory for the OS...

They need to remember what these things were built for... GAMES...

Ditch the bloated OS and give the extra RAM and processing power to the developers to improve the GAMES...
There is enough RAM even with the bloated OS. If these were the simple game boxes of old, they likely would not feel it necessary to stick 8GB in there.
 
Also, given that each Xbox One will come with an HDD as standard, wouldn't it be possible to just stream in assets if the RAM has been used up? 5GB of RAM is a lot, but I would imagine that a mixture of RAM access and streaming in assets should do.
 
Uhm, how much RAM do you think games will use when they will still mostly support PC gaming, which runs primarily on 32-bit OSes where there is a 2GB limit for application space?
I have been thinking the same :)

PC games must work perfectly with just 2 GB of system memory + 1 GB of graphics memory. Windows XP is still widely used, and there are 32-bit versions of Windows Vista, 7 and 8. You can't just drop support for all of these operating systems. Also, most players are using a 1080p display, but 70% of the players (according to the Steam Survey) have 1 GB or less of graphics memory. PC games must support 1080p flawlessly on graphics cards that have only 1 GB of memory.

Bottom line: A PC game must work perfectly with just 3 GB of memory (total). All current PC games do so. A 2+ GB graphics card is only needed for super high resolution gaming (2560x1440+), and less than one percent of gamers have monitors like that.
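The bottom line above can be written out as a trivial budget sum (the 2 GB figure is the default 32-bit Windows user address-space limit; the 3 GB PS4 OS reservation is the rumour discussed earlier in the thread, not a confirmed number):

```python
# Memory baseline a multiplatform PC game had to fit in (per the post above).
system_ram_budget_gb = 2.0   # default 32-bit Windows user address-space limit
vram_budget_gb = 1.0         # what ~70% of Steam users had, at most
total_pc_budget_gb = system_ram_budget_gb + vram_budget_gb   # 3 GB total

# If the rumoured 3 GB PS4 OS reservation holds, games still get
# well above that PC baseline:
ps4_game_budget_gb = 8.0 - 3.0   # 5 GB for the game
```

Even under the most pessimistic OS-reservation rumour, the console game budget comfortably exceeds what multiplatform titles must fit into on PC.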
 