Technical Comparison Sony PS4 and Microsoft Xbox

To your point, sebbi said in another thread that Haswell benchmarks will be a good indication of the gains to be had by going with a fast, low-latency, on-die cache. So if Intel feels this is the architecture to use for a high-performance multimedia APU, I'm not clear on why some are treating the XB1 design as a consolation prize?

I don't understand the validity of this comparison. Haswell's gains from the on-die cache will be largely due to the more than doubled bandwidth achieved through it versus a normal DDR3 interface.

The Xbone, on the other hand, gains no bandwidth from using eSRAM + DDR3 versus GDDR5 alone.

Of course, if we're talking about comparing Haswell GT3e vs the rumoured Kaveri using GDDR5M, both with similar overall bandwidth, then it might make an interesting analogue for PS4 vs Xbone performance (albeit a likely highly inaccurate one!).
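For a rough sense of the numbers being argued over, here's a back-of-envelope peak-bandwidth comparison using the widely reported pre-launch figures (256-bit DDR3-2133 plus a ~102 GB/s eSRAM for the XB1, 256-bit GDDR5 at 5.5 Gbps for the PS4). Every figure here is an assumption from leaks rather than a confirmed spec:

```python
# Back-of-envelope peak bandwidth from reported pre-launch figures.
# Treat every number as an assumption, not a confirmed spec.

def bus_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s for a memory bus."""
    return bus_width_bits / 8 * data_rate_gbps

ps4_gddr5 = bus_bandwidth_gbs(256, 5.5)    # ~176 GB/s
xb1_ddr3  = bus_bandwidth_gbs(256, 2.133)  # ~68 GB/s
xb1_esram = 102.0                          # reported eSRAM figure, GB/s

print(f"PS4 GDDR5:        {ps4_gddr5:6.1f} GB/s")
print(f"XB1 DDR3:         {xb1_ddr3:6.1f} GB/s")
print(f"XB1 DDR3 + eSRAM: {xb1_ddr3 + xb1_esram:6.1f} GB/s "
      f"(and only for data resident in the 32 MB eSRAM)")
```

The combined XB1 figure only roughly matches the PS4's single pool, which is the point being made above: the eSRAM restores parity at best rather than adding headroom beyond GDDR5.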
 
Is GDDR5 the only memory type in the PS4? Is this shared between the ARM chip and the system CPU?

There's a feeling that the ARM chip probably has its own tiny on-board RAM, but it's believed that 8GB of GDDR5 is the only significant amount of RAM in the system.

How much memory is needed for multitasking on the PS4 (including the ability to pause and resume a game)?

Short answer: it depends what you're multitasking ;). If you're multitasking a full Android environment alongside your game, then 1GB of RAM seems about right.

Personally, I'd look at 1GB for the OS and 0.5GB for buffers (I'd want to keep the video-record buffer in RAM, assuming it's 720p@30 [constant writing to the HDD seems a bad idea?] - that works out to around 400MB? - then 100MB for buffering game downloads?). So I'd look at reserving around 1.5GB, maybe...
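A quick sanity check of that guess, with every figure an assumption (a 15-minute ring buffer of 720p30 video at roughly 3.5 Mbps, a 100MB download buffer, and 1GB for the core OS):

```python
# Rough sanity check of the reservation guess above.
# All figures are assumptions, not confirmed specs.

RECORD_MINUTES = 15     # assumed length of the always-on gameplay buffer
VIDEO_MBPS     = 3.5    # assumed 720p30 encode bitrate, megabits per second
DOWNLOAD_MB    = 100    # assumed staging buffer for background downloads
OS_MB          = 1024   # assumed core OS reservation

video_mb = RECORD_MINUTES * 60 * VIDEO_MBPS / 8   # megabits -> megabytes
total_mb = OS_MB + video_mb + DOWNLOAD_MB

print(f"video ring buffer : {video_mb:.0f} MB")   # ~394 MB, i.e. "around 400MB"
print(f"total reservation : {total_mb:.0f} MB")   # ~1.5 GB
```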

Pausing/resuming a game doesn't really use memory. In the simplest version the game's memory is streamed to disk along with register state etc., and resume reverses the process. It's a bit more complicated in practice [or it could be a case of the game developer having to restore the state :( ], but I wouldn't expect any significant memory consumption.
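A toy sketch of the simple version, with made-up structures standing in for the game's memory image and register state (purely illustrative; this is not how the PS4 actually implements suspend/resume):

```python
# Toy suspend-to-disk: stream the "game memory" and saved register state out
# to storage, then read them back on resume. Purely illustrative; the point
# is only that resume needs no memory beyond what the game already owned.

import pickle

def suspend(game_memory: bytearray, registers: dict, path: str) -> None:
    with open(path, "wb") as f:
        pickle.dump({"registers": registers, "memory": bytes(game_memory)}, f)

def resume(path: str) -> tuple[bytearray, dict]:
    with open(path, "rb") as f:
        snapshot = pickle.load(f)
    return bytearray(snapshot["memory"]), snapshot["registers"]

# Round-trip 4 MB of fake game state through the "HDD".
mem  = bytearray(b"\xab" * (4 * 1024 * 1024))
regs = {"pc": 0x1000, "sp": 0x7FFF0000}
suspend(mem, regs, "suspend.img")
restored_mem, restored_regs = resume("suspend.img")
assert restored_mem == mem and restored_regs == regs
```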
 
Yeah, because Haswell would need games written specifically to take advantage of the low latency of the eDRAM, wouldn't it? Which will never happen.
 
I don't see the PS4 using 3GB, but 1GB seems a bit too low if they want to add video streaming and PS Eye into the mix. I suspect the reality is somewhere in between...

I'm not commenting on the amount of RAM reserved, just pointing out that video sharing and the PS4 Eye were accounted for back when 8GB was only a wet dream for Cerny (and a nightmare for Kaz :p), based on reports from various articles discussing the controller, the Share button and the social features during the design phase. I've noticed many people seem to assume the share features shown at the reveal were enabled by the additional RAM. However, by all accounts that does not appear to be the case.
 
What if the 8GB upgrade happened because it was needed? Maybe 4GB was not enough.
 
Simply put, that doesn't appear to be the case. By all accounts Share functionality was accounted for from near the beginning of the console's design, before the denser 4Gb chips could meet their BOM target. Besides, I'm not sure why people assume streaming/saving video uses much memory. A 25 Mbps encode would write roughly 3.125 MB/s to the HDD. That's a sequential write, and I suspect they'll use a variable bitrate, which should help further. They could also use RAM and the HDD together: a 50 MB chunk of memory from the OS reserve, for example, would give a nice 16-second buffer for the encode before hitting the HDD, ensuring the game has priority access to the drive. A 100 MB buffer would give a full 32 seconds of content to be written out in between the game's own HDD requests.
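Spelling that arithmetic out (the 25 Mbps bitrate is simply the figure implied by the 3.125 MB/s write rate; a different encode rate scales the numbers proportionally):

```python
# Encode-rate vs. buffer-size arithmetic from the post above.
# The 25 Mbps bitrate is an assumption inferred from the 3.125 MB/s figure.

BITRATE_MBPS = 25
write_rate_mb_per_s = BITRATE_MBPS / 8   # 3.125 MB/s to the HDD

for buffer_mb in (50, 100):
    seconds = buffer_mb / write_rate_mb_per_s
    print(f"{buffer_mb} MB buffer holds {seconds:.0f} s of encoded video")
# 50 MB  -> 16 s
# 100 MB -> 32 s
```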
 
Ah, okay, that might explain it. I was leaning towards dumbo11's line of thought. Not that 1GB vs 1.5GB is going to make much of a difference when you have 8GB of RAM to work with and the Xbox One reserves 3GB for the OS.

I believe the extra RAM is going to favour the PS4 in certain ways, like faster loading times. Textures are going to be the same in both consoles, imho.

Additionally, I was reading this article.... http://www.pcmag.com/article2/0,2817,2419795,00.asp

There is a picture at the end of the article comparing both consoles, and I noticed that they describe the PlayStation 4 CPU as a "Single Chip, AMD Jaguar Processor, 8 Cores", but when they talk about the Xbox One CPU they define it as an "8 Core Microsoft Custom CPU".

I was sure that both are Jaguar processors, and the level of customization in the Xbox One CPU should be minimal, or basically non-existent, I think. :eek:

 
Not sure if this is the right place but I'm curious about what game sizes are going to be on average next generation. I imagine they will be more on par with each other than they were in some cases this gen.
 
Bluetooth in both? :?:
 
Anyone know if they are using a custom OS, or is it based on something well known? (IIRC I saw BSD mentioned on some forum.) :?:

Ubuntu can do a lot of stuff, with an impressive DE and several apps (browser/YouTube), and fit in 1GB of RAM or less. BSD could probably do even better.

Personally, I find it hard to believe that today an OS for a closed system that doesn't do much besides gaming would take more than 1GB.

The Xbone is running three OSes and reserves 3GB, but then it's running Windows and doing multitasking.
 
The size of a big-budget game will probably be related to how much content they can make per dollar spent.

The amount of content per disc may be higher with the advent of full-install gameplay. Why duplicate assets on disc to hide optical seek times if the assumption is that it's ripping to HDD?
 
PC-BSD's minimum requirement is 512MB and the recommended amount is 1GB. I think web browsers are memory-hungry applications; Firefox eats up 400MB~500MB.
 
PC-BSD is hardly concerned with keeping its resource usage low; the PS4's OS should be much more so, I'd guess.

Depending on the number of tabs.

Anyway, I just did a test: while recording my desktop at 1080p60 (using Kdenlive) and browsing ~30 tabs in Firefox, including a playing 1080p YouTube video, it hits 1.5GB of RAM, plus a number of other smaller things. Ubuntu is hardly a distro optimized for low resource usage, even with pretty desktops like Unity (although it's quite nice on that front too).

Anyway, it is really hard to believe that in a closed, controlled environment they couldn't use much less than that.
 
BTW, can anyone tell me how many DSPs are in the PS4?

I know there is an audio DSP; anything more?

Any specs on the audio DSP?
 
... So yes, as long as what they are processing in a frame is purely streaming, congruent data, the PS4 will easily surpass the XB1. Make the data a bunch of different textures in different memory locations, or GPGPU physics calculations, or complex shaders that aren't just streaming data, and the result may surprise you. In those cases the PS4 may take up to 10x longer to retrieve a piece of data than the XB1, stalling a non-trivial amount of the GPU.


Shirley... I mean, surely this is a known issue with GPU cards as well, to some extent. Wouldn't some of the solutions that have been used on GPU cards with high bandwidth / high latency be applicable to the PS4? I mean, to my very untrained eye :oops: the PS4 is like a big GPU card with an APU on it.

I do understand that when dealing with a starved CPU there would be better performance with the Xbox way of doing things, but with the extra GBs of RAM available couldn't one map data in a consistent way to maximise congruency and the like?
 
Not sure how they got the Bluetooth-in-both bit confused. Everything else seems relatively accurate given what's "officially" announced, although there are some distinctions on both sides that they completely gloss over or miss entirely.

In terms of dedicated-function hardware (DSP or otherwise) there's an audio processor (ACP), a video encoder (VCE), a video decoder (UVD) and zlib decompression hardware. The only info on the ACP is that it can handle approximately 200 concurrent MP3 streams for game audio and that it will also be used for in-game audio chat. How advanced it is beyond that, and whether it's fully programmable or not, has not been leaked or revealed as far as I'm aware.
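As a purely hypothetical back-of-envelope, here is what ~200 decoded streams would mean in raw PCM terms for whatever does the mixing (sample rate, channel count and bit depth are all assumptions, not Sony figures):

```python
# Hypothetical back-of-envelope: decoded-PCM throughput for ~200 streams.
# Sample rate, channel count and sample depth are assumptions.

STREAMS          = 200
SAMPLE_RATE      = 48_000  # Hz, assumed
CHANNELS         = 2       # stereo, assumed
BYTES_PER_SAMPLE = 2       # 16-bit PCM, assumed

bytes_per_second = STREAMS * SAMPLE_RATE * CHANNELS * BYTES_PER_SAMPLE
print(f"~{bytes_per_second / 1e6:.1f} MB/s of decoded PCM to mix")  # ~38.4 MB/s
```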
 
Yes. What they do nowadays is throw more ALUs at it, increase the clocks, and swizzle textures so their memory layout is friendlier to streaming (a sketch of one common swizzle is below). After that they just take whatever latency hits remain, since that's the cost of doing business. It'll only get worse as GPGPU workloads become more common.
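As a concrete example of the kind of swizzling being described: a Morton (Z-order) layout interleaves the X and Y bits of a texel address so that accesses that are close in 2D land close together in memory, which is one common way GPUs make texture fetches friendlier to wide, high-latency memory. A minimal sketch, not the actual tiling format of either console:

```python
# Morton (Z-order) swizzle: interleave the bits of x and y so texels that are
# neighbours in 2D end up near each other in memory. Generic illustration
# only; not the tiling format used by either console.

def morton_encode(x: int, y: int, bits: int = 16) -> int:
    """Interleave the low `bits` bits of x and y into a single index."""
    index = 0
    for i in range(bits):
        index |= ((x >> i) & 1) << (2 * i)      # x bits -> even positions
        index |= ((y >> i) & 1) << (2 * i + 1)  # y bits -> odd positions
    return index

# A 4x4 block of texels occupies 16 consecutive swizzled addresses, whereas in
# a plain row-major layout its rows would sit a full surface pitch apart.
addresses = sorted(morton_encode(x, y) for y in range(4) for x in range(4))
print(addresses)  # [0, 1, 2, ..., 15]
```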
 