PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Status
Not open for further replies.
Sony could have gone with a potent OoO PPC/Power though. And Cerny was investigating x86, not HSA, and concluded it was okay (in 2007). So the devs' complaints must have been about x86 as they understood it back then. I expect many of Sony's first parties hadn't had enough experience with low-level x86 coding to appreciate the architecture's potential, but I'm not really seeing the obvious issues with x86, nor what PPC could do better aside from tools and experience (and considering most folk are coding in higher-level languages, I'm not sure what value x86 experience really brings anyway!).

By late 2007, other non-technical trends were already kicking in that favored x86 over PPC.
Apple had dumped PowerPC by then, with indications that IBM did not find a benefit in pushing the design envelope in that range of product. There were and are very few vendors with the capability to manage large-scale OoO high-performance cores and the infrastructure needed to feed them, and IBM was phoning it in.
This was also following other things that may have been worrisome to Cerny, such as IBM's wrong turn with the in-order POWER6.
IBM showed a willingness to invest in very large POWER server chips, but it kept the crown jewels far from the mortal chips Sony would have wanted.

The stagnation would have been worrisome. ISA differences, barring exceptionally bad choices, are a second-order effect: eminently survivable in the power-cost space of a console, and with enough money and engineering they can be overcome. Just knowing that an architecture is going to iterate a lot means it can grow itself to a better position than an ostensibly superior evolutionary dead end.

AMD and Intel were stepping things up, and by that year AMD had announced SSE5, which brought vector extensions that had better permute capability and FMA. This showed that x86 was going somewhere useful.
By the time IBM got back to the kind of OoO server cores it historically would never let out of the vault, PPC was receding in mindshare and lacking in GPU tech.

It looks to me that Sony and Microsoft probably called this one right.
On the other hand, it would have been a scary number of years, since the call was made just as AMD proceeded to flounder architecturally.
The two-horse race Cerny may have hoped would continue to drive design evolution fell apart, but Intel did continue to innovate, and AMD's Fusion initiative at least showed some kind of significant design initiative and progress on the GPU front, where IBM had no presence.

It's not clear if the core philosophy that went into Bobcat and then to Jaguar had really come into its own at that time.
If AMD hadn't screwed the pooch so badly, one wonders whether the rumors of Orbis with a Steamroller APU could have been where Cerny wanted to go. Jaguar isn't a bad single-threaded performer, but I feel its selection is also something of a repudiation of the direction AMD tripped itself into with its big cores.

I wonder if Cerny lost any sleep in the years Bulldozer was losing to its predecessors, and Llano was delaying and disappointing the Fusion effort. Then there was the prospect of losing AMD as a design alternative, and then having to face Intel's much stronger bargaining position and very unimpressive graphics in that time frame.
 
Is it fair to say that AMD's CPUs are about as undesirable as Intel's GPUs?

I wouldn't say that at all, especially given what we've seen with Jaguar. Bulldozer is a bad misstep, but even then the core itself isn't that bad; its performance is inconsistent. What's bad is how 72 mm² of cores needs 240 mm² of L2/L3 + uncore to operate.
 
I wonder if Cerny lost any sleep in the years Bulldozer was losing to its predecessors, and Llano was delaying and disappointing the Fusion effort. Then there was the prospect of losing AMD as a design alternative, and then having to face Intel's much stronger bargaining position and very unimpressive graphics in that time frame.

I don't know if it's true that they really wanted Steamroller as their first option. But the result would have been similar or worse if they really wanted a <40 W CPU. Cerny could very well have seen early Jaguar IPC emulation results and changed his mind completely, without a doubt.
 
It looks to me that Sony and Microsoft probably called this one right.
I agree, and maybe IBM weren't in a position to provide a decent processor (although with enough money and some actual console-focus criteria, I'm sure they could have produced something). Why do you think devs were reportedly against x86 though? What were the other options and why would Sony's first parties prefer that? They would be in a fairly unique position of going lower level with their code, I guess, so perhaps the low level quality of the hardware was considered important, but between x86, PPC, and maybe MIPS again, why would the devs be against x86?

Seems odd to me. My response would have been, "whatever, just so long as it's good and fast." ;)

Had extracting good performance from x86 been difficult prior to then? My perception of the architecture is that it's relatively fast on poor code but relatively slow on good code, designed around PC legacy software. Is that a misconception of mine (and maybe Sony's 1st parties)?
 
I don't know if it's true that they really wanted Steamroller as their first option. But the result would have been similar or worse if they really wanted a <40 W CPU. Cerny could very well have seen early Jaguar IPC emulation results and changed his mind completely, without a doubt.

I think it's likely that Steamroller was plan A. That's what the pastebin from December 2011 that accurately predicted basically everything else about the PS4 indicated. I also think that rumor that came out of Japan suggesting they switched to Jaguar because they were worried about being tied to Global Foundries was probably also accurate. It probably didn't hurt if they knew at that point MS was also going with 8 Jaguar cores.
 
I think it's likely that Steamroller was plan A. That's what the pastebin from December 2011 that accurately predicted basically everything else about the PS4 indicated. I also think that rumor that came out of Japan suggesting they switched to Jaguar because they were worried about being tied to Global Foundries was probably also accurate. It probably didn't hurt if they knew at that point MS was also going with 8 Jaguar cores.

That's the reason I maintain that Sony's CPU could very well be clocked at 2 GHz. Their plan A had a beefier CPU. I can imagine them saying: OK, let's switch to Jaguar, but AMD, give us the most powerful variant so we get as near as possible to the initial performance goal.

Their initial target was a 1.84 TFLOPS GPU and, most likely, four downclocked Steamroller cores at 50-60 W. Why not then go with the same GPU plus an eight-core Jaguar CPU at 40-50 W?
 
Well, it was supposedly a 2-module/4-core Steamroller at 3.2 GHz. That would have produced the same 102.4 GFLOPS as the 8 Jaguar cores at 1.6 GHz. It would have been faster in single-threaded situations, and probably preferable overall, but maybe not worth the heat, the wait, and being tied to GloFo.
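The parity the post describes can be checked with simple arithmetic. The per-cycle FLOP figures below are assumptions consistent with the quoted number, not vendor-confirmed specs: 8 single-precision FLOPs per cycle per Jaguar core (4-wide mul plus 4-wide add on 128-bit SIMD), and 16 per Steamroller module (two 128-bit FMAC units, with FMA counted as 2 FLOPs per lane):

```python
# Peak single-precision GFLOPS = units * GHz * FLOPs-per-cycle-per-unit.
# The per-cycle figures are assumptions chosen to match the 102.4 GFLOPS
# number discussed in the thread, not datasheet values.
def peak_gflops(units, ghz, flops_per_cycle):
    """Peak throughput in GFLOPS for `units` identical cores/modules."""
    return units * ghz * flops_per_cycle

# 8 Jaguar cores at 1.6 GHz, 8 SP FLOPs/cycle each (4-wide mul + 4-wide add)
jaguar = peak_gflops(units=8, ghz=1.6, flops_per_cycle=8)

# 2 Steamroller modules at 3.2 GHz, 16 SP FLOPs/cycle each (2x 128-bit FMAC)
steamroller = peak_gflops(units=2, ghz=3.2, flops_per_cycle=16)

print(jaguar, steamroller)  # both come out to 102.4
```

Same peak number, but the Steamroller configuration reaches it at twice the clock with half the cores, which is why it would have looked better for single-threaded work.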
 
Is it fair to say that AMD's CPUs are about as undesirable as Intel's GPUs?
Well, actually, Intel's GPUs turned out pretty great, though nobody has a crystal ball, and their track record was not exactly pristine... :LOL:

I wouldn't say that at all, especially given what we've seen with Jaguar. Bulldozer is a bad misstep, but even then the core itself isn't that bad; its performance is inconsistent. What's bad is how 72 mm² of cores needs 240 mm² of L2/L3 + uncore to operate.
The release of Steamroller is eagerly awaited by a lot of people; I try to hope that it will turn out well.
Still, my belief is that they are beating a dead horse with their CMT approach.
I have faint hopes at best.

On the contrary, I think they have a really good basis with Jaguar. As I was posting, I went to (quickly) re-read the TechReport review of the Kabini A4-5000, and my memory was right: Kabini is competitive with at least a ULV Core i3. It loses in single-threaded performance, but overall it does really, really well.
Once you take into account the price, the size of the chip, and Intel's process advantage (Intel is one node ahead), plus the fact that for some reason Turbo is not enabled on Kabini (I read it shouldn't be for tablet-oriented CPUs), it is clear to me that AMD definitely has something worthy here.
The picture is not ideal because power consumption is still too high, but clearly, looking at the market (overall PC, tablet), I think AMD should put its focus on that architecture.
Sadly, Intel has more than IB or Haswell to compete with (and those are costly solutions), so AMD needs to focus on something, and I question their ability to deliver while spreading themselves across different architectures. In some presentations they spoke about a possible convergence of the CPU lines after Excavator; I wish it would happen. Actually, I'm close to thinking it should happen earlier: push Steamroller out the door and put all of AMD's workforce on Jaguar's successor, iterating rapidly on the architecture and on the GPU (one arch). Edit: they keep being late; Jaguar's goodness is going to be tarnished by the fact that Silvermont is about to launch (luckily late too, but betting on Intel being late again and again is more than risky). /Edit

Anyway, the point is that Jaguar is competitive; BD-based parts are another matter, if you consider whether letting high-end parts go on sale for a pittance is good and is going to put AMD on a sustainable path. IMO it is fine for customers for now, but how much longer AMD will be able to play that gig is quite the question.

So I think it is fair to say that AMD's desktop CPUs were indeed undesirable, as Intel's GPUs were, at the time the decisions about next gen were made.
 
I agree, and maybe IBM weren't in a position to provide a decent processor (although with enough money and some actual console-focus criteria, I'm sure they could have produced something). Why do you think devs were reportedly against x86 though? What were the other options and why would Sony's first parties prefer that? They would be in a fairly unique position of going lower level with their code, I guess, so perhaps the low level quality of the hardware was considered important, but between x86, PPC, and maybe MIPS again, why would the devs be against x86?
It might depend on how they were asked.
There would definitely be a history effect.
For those that leveraged Altivec and the like well, x86 didn't stop looking very bad until it had an x87 replacement in SSE2 (various complaints wouldn't go away until the SSE5 and AVX timeframe), which is post-Xbox. Even then, how many Sony first parties would have seriously evaluated the Xbox?
First parties would at that time have invested a lot in leveraging the stuff they had pretty well, so the comparison wouldn't have been as stark as it was for those that didn't have the direct pipeline.

Also, by that time it may have been clear that the only real alternative to a backwards-compatible PPC (or at least a PPC solution that leveraged existing tools and experience) was x86, and so the alien architecture was doubly penalized.

Had extracting good performance from x86 been difficult prior to then? My perception of the architecture is that it's relatively fast on poor code but relatively slow on good code, designed around PC legacy software. Is that a misconception of mine (and maybe Sony's 1st parties)?

For integer code, it was competitive or leading by the Pentium Pro or Pentium II timeframe. It was pretty dominant by the Pentium III that made it into the first Xbox.
FP was worse for years on a per-clock basis, and vector functionality was haphazard and constrained for years. The extension free-for-all between AMD and Intel probably didn't help. It did get a 7-8 year chance to arrive at something serviceable, though.
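To put rough numbers on that progression, here is an illustrative comparison of peak single-precision FLOPs per core per cycle across x86 FP generations. The figures are generic, textbook-style peaks for typical implementations of each extension era, not measurements of any specific chip:

```python
# Illustrative peak SP FLOPs per core per cycle for generic x86 designs.
# These are rule-of-thumb figures for each extension era, not any
# particular CPU's datasheet numbers.
generations = {
    "x87 scalar": 1,                     # one scalar FLOP per cycle
    "SSE/SSE2, one 128-bit pipe": 4,     # 4 SP lanes per op
    "128-bit mul + add pipes": 8,        # parallel 4-wide mul and 4-wide add
    "AVX + FMA, two 256-bit pipes": 32,  # 8 lanes * 2 FLOPs (FMA) * 2 pipes
}

baseline = generations["x87 scalar"]
for name, flops in generations.items():
    print(f"{name}: {flops} SP FLOPs/cycle ({flops // baseline}x scalar)")
```

The roughly 32x spread between scalar x87 and a dual-pipe AVX+FMA design is the gap the post alludes to: most of it closed only in the SSE2-and-later window, well after the original Xbox-era comparisons were formed.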
 
On themagicbox, it says Geomerics said the PS4 has 7 GB of GDDR5 memory allocated to games. Not sure where they got the source from, though.
 
On themagicbox, it says Geomerics said the PS4 has 7 GB of GDDR5 memory allocated to games. Not sure where they got the source from, though.

They got it from here http://gamingbolt.com/xbox-one-ddr3...am-both-are-sufficient-for-realistic-lighting
But I wouldn't read too much into it; my guess is that the interviewer is the one who stated 5 and 7 respectively, and you can read into the response what you want, but I wouldn't assume that the interviewee was familiar with both systems.
 