Technical Comparison: Sony PS4 and Microsoft Xbox One

Status
Not open for further replies.
Can I just remind people this is the technical comparison, hardware vs. hardware. Business discussion belongs elsewhere, and specific points, like whether a feature (streamed computing) is worthwhile, belong elsewhere too.

I'll clean it up later, but please don't add to my work by contributing to OT conversations here - wait until they move.
 
What's a good link to their architecture talk? I need to understand what they did and why they went the VM route.

http://majornelson.com/2013/05/21/xbox-one-architecture-panel/

I don't know the performance characteristics of running two VMs on a machine that supports virtualization in hardware vs. running one big OS with multitasking supported in the traditional threaded way. It seems having exactly bounded hardware for each VM was important, and maybe the traditional OS model was less predictable for developers. They do say the OS running in the System VM is more of a traditional OS that supports third-party add-ons. I'm guessing they just mean small apps that do various things and all need to be scheduled. I don't know enough about VMs and hypervisors to know how VMs are scheduled against each other and how the partitioning of hardware works.
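The fixed-partition idea can be sketched in a few lines. This is purely illustrative, not Microsoft's actual design; the 6-core / 5 GB game reservation below is just the split speculated about in this thread (two cores and 3 GB for the apps side), treated here as an assumption.

```python
# Illustrative sketch of static partitioning: each VM sees an exact,
# bounded machine. The reservation figures are thread speculation,
# not confirmed numbers.

TOTAL = {"cores": 8, "ram_gb": 8.0}
GAME_RESERVED = {"cores": 6, "ram_gb": 5.0}

def system_partition(total, game):
    """Whatever the game VM owns is walled off; the system VM gets the rest.

    Predictable for developers (the game always sees the same machine),
    but idle resources in one partition can't help the other - unlike a
    single OS time-sharing everything.
    """
    return {k: total[k] - game[k] for k in total}

print(system_partition(TOTAL, GAME_RESERVED))  # {'cores': 2, 'ram_gb': 3.0}
```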
 

Thanks. Will watch tonight.

Digital Foundry has a comparison with a convincing argument - Sony made a gamble that paid off. MS committed to 8 GB early on, which seemed cost-prohibitive, so they had to go with a local store again. Sony chose GDDR5 and was looking at 4 GB, but was able to stretch to 8 GB for a long-term advantage. Had MS picked GDDR5 and expected 8 GB to be cost-effective, it'd be a different story.

We really need to know what the advantages of ESRAM are. It seems very wasteful at the moment to go with that over eDRAM. A local store to make up the BW deficit of DDR3 is one thing, but committing ~2 billion transistors to the task instead of roughly 300 million seems a major decision. Of course there's the foundry issue to factor in. We can't be sure which is the greater influence - better performance or better economies of production.
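For reference, the DDR3 bandwidth deficit can be back-of-enveloped from the commonly quoted figures (DDR3-2133 and 5.5 Gbps GDDR5, both on 256-bit buses); treat the inputs as assumptions about the final specs.

```python
def peak_bandwidth_gbs(data_rate_gtps, bus_bits):
    """Peak bandwidth in GB/s: per-pin data rate (GT/s) times bus width in bytes."""
    return data_rate_gtps * (bus_bits / 8)

ddr3  = peak_bandwidth_gbs(2.133, 256)  # DDR3-2133 on a 256-bit bus
gddr5 = peak_bandwidth_gbs(5.5, 256)    # 5.5 Gbps GDDR5 on a 256-bit bus
print(round(ddr3), round(gddr5))        # 68 176
```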

It's not so much a gamble, since Sony is a h/w company. It sounds more like a calculated risk weighed against their commitment to make an easy-to-develop platform. Cerny mentioned that they considered an alternative split-pool, terabyte-bandwidth type design. On our end, we also heard of Sony flip-flopping between different Cell and unCell designs.
 
Anyone care to estimate transistor counts for the SoCs? At least the CPU+GPU (+ESRAM) part? Anyone with GCN GPU knowledge probably knows a suitable reference design, and Jaguar details must be out there somewhere. I make it about 1.5 billion transistors for the ESRAM (32 MB x 8 bits x 6 T per bit).
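That 1.5 billion figure checks out (using decimal megabytes; with binary megabytes it's closer to 1.6 billion). A quick sketch of the cell-count arithmetic, assuming a standard 6T SRAM cell versus a 1T1C eDRAM cell:

```python
MB = 1024 * 1024  # binary megabytes

def cell_transistors(size_mb, transistors_per_bit):
    """Transistors spent on the memory array itself
    (ignoring tags, sense amps, and other overhead)."""
    return size_mb * MB * 8 * transistors_per_bit

esram = cell_transistors(32, 6)  # 6T SRAM cell
edram = cell_transistors(32, 1)  # 1T1C eDRAM: one transistor plus a capacitor
print(f"ESRAM ~{esram/1e9:.2f}B transistors, eDRAM ~{edram/1e6:.0f}M")
```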
 
http://majornelson.com/2013/05/21/xbox-one-architecture-panel/


I skipped to Boyd's segment. It's as I understood it.

The two VMs running side by side will contend for CPU (two cores for the apps partition?), memory (3 GB vs. 5 GB), and bandwidth. I'm not sure how they are going to guarantee QoS when both sides are in play. They can prioritize the requests, but it's still overhead. At this early stage, the OSes may have bugs too.

The partitions seem to be there more for security (and potential portability) than for QoS. Would be nice if someone leaked the GPU access API. :p

If things are running inside VMs via APIs, and relying on the Display Engine to overlay stuff, it can get complicated.

I don't know how Sony intends to solve it, but hopefully the simpler the better. And don't waste the silicon.
 
I think Richard Leadbetter's speculation on Digital Foundry makes a lot of sense: Microsoft did a hell of a lot of engineering to make sure they'd have 8 gigabytes of RAM with usable bandwidth no matter what happened with GDDR5 densities, whereas Sony spent their silicon budget on a greater amount of GPU power and wound up getting lucky that they could reach 8 GB of GDDR5 by the time it came to launch the system.

That's a simple, compelling story, and it sounds like things happened to break towards Sony in a really big way, leaving Microsoft at a disadvantage this generation. No wonder Microsoft's PR team put out stories in Wired and elsewhere showing off their engineering lab, talking up how hard it was to do custom silicon, their impressive chip emulation and simulation hardware, the impressive number of transistors in their system, and so on. When you've got lemons, make lemonade.

At this point, I'm far more interested in the software layers the two companies will be deploying. The PS3 ran GameOS on top of a security hypervisor last generation, it'll be interesting to know whether they keep that kind of layered approach this gen, or whether they're confident that they can have a single BSD OS with good enough scheduling, GPU virtualization, and memory management that they can do all the resource sharing with a single OS image.

Presumably, the 64 compute queues (across eight ACEs) they commissioned for their GPU will simplify sharing the GPU between a running game and whatever kind of simultaneous app processing they're going to want to do.

From a UI perspective, it'll be interesting to see how much Sony can do with their camera sensor bar, and whether they are able to do the voice operations that Microsoft is going to be selling their system with.
 
From Microsoft's architecture chat yesterday, it does sound like MS's VM strategy will help ensure that each game runs with the same (virtualized) firmware / operating system that the game passed cert on. That seems a real advantage versus having a constantly changing operating system kernel that a game runs on top of, as you'd have with a single-system-image BSD kernel, though Sony could keep a lot of the performance-critical pieces in user land or as reloadable ring-0 stuff.

Hopefully both Sony and Microsoft will someday publish a post-mortem on their operating system choices, because there's fascinating engineering at this level.
 
Yes, their priorities - how they layer the OS and lay out the memory - are more interesting than the pure h/w numbers.

For apps, besides the VM approach, the PS4 can also run them on Mono (PS Mobile) or HTML5 (as on PS3).

Will be interesting to see how they solve fast switching and background tasking.
 
I think Richard Leadbetter's speculation on Digital Foundry makes a lot of sense, that Microsoft did a hell of a lot of engineering to make sure they'd have 8 gigabytes of RAM with usable bandwidth no matter what happened with GDDR5 densities [...]
It's a personal hunch, but for a project of this importance I wouldn't assume Sony passively sat around and just got lucky with GDDR5 density increasing. There could have been uncertainty, but I think they would have been discussing this with a number of stakeholders like AMD and the memory manufacturers.
Assurances of certain volumes might encourage manufacturers to stick to a more favorable timeline.

There was a possible turning point during the design phase of AMD's Bonaire GPU, whose designers may have opted to stay with a 128-bit bus in anticipation of a density increase.
 
It's a personal hunch, but for a project of this importance I wouldn't assume Sony passively sat around and just got lucky with GDDR5 density increasing. [...]

Oh, certainly. I didn't mean to suggest that Sony didn't work their ass off to get that 8 GB once they became confident that Microsoft would go for 8 GB. But the fact that it became technically possible to get that RAM made by the time the two systems launched couldn't have been certain to them when they started the project - or else it would have been certain to Microsoft as well, and they wouldn't have gone through all the contortions and spent all those transistors trying to make up for DDR3 bandwidth.

I mean, maybe it wasn't luck. Maybe both Sony and Microsoft knew that 8 GB of GDDR5 would be technically available at launch and Microsoft just engineered for cheaper unit costs, but...
 
It was luck, plus MS and developer pressure, that pushed them the rest of the way to 8 GB. If that pressure hadn't been there, I doubt they would've gone to 8. And I'm fairly certain an end-of-2013 launch was non-negotiable. They couldn't let MS beat them to the punch, so if 4 Gbit modules weren't available in time, it would've been c'est la vie.
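The module math behind that density worry, as a quick sketch: 8 GB takes 16 chips at 4 Gbit but 32 at 2 Gbit. Chip counts only; board layout and clamshell-mode details are deliberately ignored here.

```python
def chips_needed(total_gb, module_gbit):
    """Number of DRAM chips needed to reach a capacity (ignores bus layout)."""
    return total_gb * 8 // module_gbit

print(chips_needed(8, 2))  # 32 chips with 2 Gbit parts - impractical to place
print(chips_needed(8, 4))  # 16 chips with 4 Gbit parts - feasible
```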
 
On a side note, I wonder if there is an alternative (though not exactly identical) solution for seamlessly switching between TV, apps, and games without having to reserve too many console resources to do it.
We have seen some very minor handshaking through HDMI between the TV and the PS3 Slim.
We haven't seen much evolution in that area since.
But with the power smart TVs boast these days and the power the PS4 will boast, perhaps they can handshake in more advanced ways and provide seamless switching between one another, and PiP, instead of relying on the console to do all the work like the Xbox One does.
The TV outputs the channels anyway and has an HDMI input for the console. If they find creative ways to communicate with each other, it could be a good alternative for offering a similar experience when jumping between channels, gaming, and console apps.
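HDMI already carries a control channel for exactly this kind of handshake: CEC. The opcode below is from the HDMI-CEC specification (Active Source is opcode 0x82, broadcast to destination 0xF); building the raw frame by hand is a hypothetical illustration, not anything Sony has announced.

```python
def cec_active_source(initiator, phys_addr):
    """Build a CEC <Active Source> broadcast frame (opcode 0x82).

    initiator: CEC logical address (4 = Playback Device 1, e.g. a console)
    phys_addr: 16-bit HDMI physical address, e.g. 0x1000 for port 1 (1.0.0.0)
    """
    header = (initiator << 4) | 0xF  # destination 0xF = broadcast
    return bytes([header, 0x82, phys_addr >> 8, phys_addr & 0xFF])

# A console on HDMI input 1 announcing itself as the active source:
print(cec_active_source(4, 0x1000).hex())  # 4f821000
```

Sending this frame makes a CEC-aware TV switch its input to the console; the reverse direction (TV telling the console to shrink to PiP) would need vendor-specific commands, which is where the feasibility question comes in.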
 
On a side note, I wonder if there is an alternative solution (but not exactly identical) to seamlessly switch from TV/apps/games without having to reserve too much console resources to do that. [...]

The One's solution is a kludge in itself. It's a voice-commanded Slingbox without the sling. It relies on IR blasting and individualized support for existing cable boxes. It's not a DVR and relies on a cable box. It's only a good proposition if you're comfortable with the Xbox doing everything, and it could mess up your control scheme if you use a universal remote.

On top of that, it has doubtful value outside the US. Japan may not be much of a market for it, but Europe matters, and the XBone hasn't shown anything that promises to make it better than the 360 there.

All the PS4 needs to do is support every streaming app/solution out there, with quick app switching and state suspending. The rest of the ecosystem will evolve organically - probably towards more content available digitally and less dependence on a cable box. I don't think there's anything technically limiting the PS4 from that right now; it's just a software question.
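A minimal sketch of what "quick app switching and state suspending" means in software: one foreground app, with the others serialized rather than kept resident. Entirely hypothetical; the Switcher class and its API are made up for illustration.

```python
import json

class Switcher:
    """Toy suspend-to-state app switcher (illustrative only)."""

    def __init__(self):
        self.suspended = {}   # app name -> serialized state
        self.foreground = None

    def switch_to(self, name, state_if_new=None):
        if self.foreground is not None:
            prev, live = self.foreground
            self.suspended[prev] = json.dumps(live)       # suspend current app
        if name in self.suspended:
            state = json.loads(self.suspended.pop(name))  # resume
        else:
            state = state_if_new or {}                    # cold start
        self.foreground = (name, state)
        return state

sw = Switcher()
sw.switch_to("game", {"level": 3})
sw.switch_to("video_app", {"pos": 120})
print(sw.switch_to("game"))  # {'level': 3} - resumed where it left off
```

The design trade-off is the one discussed above: suspended apps cost storage instead of reserved RAM, at the price of a serialize/deserialize delay on each switch.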
 
The One's solution is a kludge in itself. It's a voice commanded slingbox without the sling. [...]

Well, the Xbox One's dependence on US cable sure makes it a more inconvenient solution.
Whereas everyone owns a TV, and smart TVs are becoming more common. So I think if the HDMI handshake can really get that advanced, it can be a feature that more people will have meaningful access to, in more territories. ;)
I'm not sure about the feasibility, though, or how many resources the PS4 would have to allocate for that. But I think it would be less than what's needed to do all the fancy-looking and unnecessary stuff on the XBone.
 
Well the dependence of XBOX One on US cable sure makes it a more inconvenient solution. [...]

Look at how much the dash/springboards evolved over the generation. I think they can accommodate.
 
Digital Foundry has a comparison with a convincing argument - Sony made a gamble that paid off. MS committed to 8GBs early on which seemed cost prohibitive, so had to go with local storage again. [...]

Actually, Durango was 4 GB originally; they increased it to 8 GB in November 2011, just before the first third-party reveals.
 
If major multiplatform titles are only designed to the lowest common denominator (Xbox One), will it make a significant difference that the PS4 is faster? Or will we simply get a situation where the PS4 gets computationally expensive but low-impact features thrown at it?
 
I would imagine the main difference would be improvements: higher-res shadows or other shader effects that may be spotty on the lesser system. But I'm not sure which specific visual features would most benefit from the extra muscle, to be honest.

Could we see things like higher-res textures? I know that relies heavily on the amount of memory, but I would imagine faster memory could let them move larger assets around. Maybe better AA on the faster GPU?

There's talk of the framerate difference, but I imagine they'd use it to improve visuals overall rather than just speed them up.
 