Predict: The Next Generation Console Tech

<snip>

I am very curious as to whether the CPU will remain IBM or not. I mean, I love novel architectures, and Cell especially, so I would love to see its life extended in a v2.0, if only to see what it would look like.

I think a Cell2 is due in PS4. The 2PPE+32SPE Cell2 roadmap that was published in 2006, with a 2010 launch, is still there; it was shown at SC08 by the head of the STI Cell Center at Georgia Tech. He must have some input on future development, otherwise he wouldn't have said it publicly.

And if Sony had told IBM that it won't be using Cell2 in PS4, IMO the R&D costs would be too high for IBM to go it alone this time, considering the relatively tiny market for Cell outside consoles.

Further, Intel has had working Larrabee silicon for quite a few months now. If it has been decided, only interfacing it with Cell2 would need to be done (plus the easier software part), not a very big deal.
 
I think a Cell2 is due in PS4. The 2PPE+32SPE Cell2 roadmap that was published in 2006, with a 2010 launch, is still there; it was shown at SC08 by the head of the STI Cell Center at Georgia Tech. He must have some input on future development, otherwise he wouldn't have said it publicly.

What was published wasn't a "Cell 2" though, just a scaled Cell architecture. If Sony sticks with Cell... and I've been a proponent of that theory, by the way - simply that if Intel's in the mix, an alternate scenario can't be ignored... but if Sony sticks with Cell, I would expect/hope for a more fundamental improvement to the architecture. A 'Cell 2' should include a more fleshed-out PPE core, if nothing else.
 
During a Cell programming course, Bill Lundgren of Gedae said that he had recently been informed that IBM had abandoned its Cell roadmap. If true, what could Sony use in its place, and how would backward compatibility be maintained?
 
So if we take both into account, wouldn't some sort of multi-chip module make a lot of sense in terms of manufacturing efficiency, especially as it would help keep latency to a minimum and facilitate the use of the GPU's massive parallel processing advantage? It's about efficiency and cost, and wouldn't making the best use of the advantages, and mitigating the disadvantages, of their available resources make the most sense in developing a cheap and powerful system for the next generation?

That is what they are doing with Fusion, and it does indeed have some advantages even from a pure performance POV, but it will hit the wall (in terms of maximum total performance, and also heat) very soon.

To answer your question another way: if they want a relatively small and cheap console, then it is probably better to make a single chip.

Anyway, there are many things we don't know (like what DX12 will bring).
 
The Cell saga keeps getting more confusing. I guess that is to be expected for a product intended for a smaller (and more closed) market. If what you say is true, then maybe Intel snagged a single/dual Larrabee contract, i.e. merging CPU and GPU just like the original plan for a dual-Cell PS3. :) :)
 
During a Cell programming course, Bill Lundgren of Gedae said that he had recently been informed that IBM had abandoned its Cell roadmap. If true, what could Sony use in its place, and how would backward compatibility be maintained?

I think they can attach the SPUs to any core, i.e. replace the PPE. Plus, shouldn't Sony be able to make extensions/modifications/improvements on their own?
 
During a Cell programming course, Bill Lundgren of Gedae said that he had recently been informed that IBM had abandoned its Cell roadmap. If true, what could Sony use in its place, and how would backward compatibility be maintained?

Hmmm... well, Gedae's a great source, no doubt. It also doesn't help that the financial crisis has more or less put a freeze on Cell's target markets. But the question you ask is definitely *the* question, and word was that Sony was sending a survey around to devs a little while ago asking what they felt the future direction of the PS architecture should be.

I would believe that the Cell roadmap is dead, while still also believing that a Cell successor was in the works. But the farther you get away from the LS model and the ISA, I mean... it's not going to be easy to emulate. There was some discussion on methods/manners a couple of months ago when Panajev brought forth an IBM patent that seemed in the vein of Cell, but cache-based. It would be a very difficult architecture to emulate though, and maybe what we'd see is that B/C really isn't maintained at all (if they do go away from IBM/STI).
 
I would believe that the Cell roadmap is dead, while still also believing that a Cell successor was in the works. But the farther you get away from the LS model and the ISA, I mean... it's not going to be easy to emulate. There was some discussion on methods/manners a couple of months ago when Panajev brought forth an IBM patent that seemed in the vein of Cell, but cache-based. It would be a very difficult architecture to emulate though, and maybe what we'd see is that B/C really isn't maintained at all.

Why not just keep the 8 SPEs while stocking up on regular cores? They are tiny - or certainly will be at 32nm. They could prove useful in a future PS4 CPU as the auxiliary DSPs they really are, while providing full BC at the same time.

Cheers
 
It seems that a lot of the extra workload for next-generation CPUs is extremely floating-point intensive, if you consider physics, animation and perhaps next-generation interfaces which involve a lot of data points. Couldn't these be offloaded onto a hypothetical Direct3D 12 GPU, which would leave the CPU free to perform the tasks it does strictly better than the GPUs of the day?
That is the premise of GPGPU, though for a console or any serious close-to-the-metal programming the extra API layer would be stripped away. There are still notable problems, because the typical slaved GPU setup relies heavily on the CPU, and there are often issues with additional software complexity and possibly limited bandwidth between the GPU and the host processor.
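To make that concrete, here's a minimal C++ sketch (illustrative only, no real console or GPU API assumed) of the kind of floating-point-heavy, per-element work - a toy particle integration step - that people have in mind when they talk about offloading. On a GPU the kernel body would run once per element across thousands of threads; the cost being debated here is shipping the data to a slaved GPU and reading the results back over the host bus.

Code:
// Hypothetical illustration: embarrassingly parallel, FP-heavy per-element
// work (a toy particle integration step) of the sort GPGPU aims to move off
// the CPU. On a GPU this body would run once per element across thousands of
// threads; here it is just a plain serial loop for clarity.
#include <cstddef>
#include <vector>

struct Particle { float px, py, pz, vx, vy, vz; };

// Per-element "kernel": no cross-element dependencies, so it maps cleanly
// onto wide SIMD/SIMT hardware.
inline void integrate(Particle& p, float dt, float gravity_y) {
    p.vy += gravity_y * dt;
    p.px += p.vx * dt;
    p.py += p.vy * dt;
    p.pz += p.vz * dt;
}

// CPU version: the same kernel applied serially. Offloading this to a slaved
// GPU means paying for the round trip of 'particles' over the host bus.
void step(std::vector<Particle>& particles, float dt) {
    for (std::size_t i = 0; i < particles.size(); ++i)
        integrate(particles[i], dt, -9.81f);
}

int main() {
    std::vector<Particle> particles(10000);
    step(particles, 1.0f / 60.0f);
    return 0;
}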

Furthermore, if we consider Intel's assertion that off-chip bandwidth is expensive as a form of gospel, would it not make sense for the console itself to utilise a 'local store' of memory to hold frequently used information without having to go over an expensive outside bus?
In most cases, that would be a cache.
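The practical difference between the two models looks roughly like the sketch below (sizes and names are made up; on Cell the staging copies would be asynchronous DMAs into the SPE's 256 KB local store). With a local store, software explicitly decides what is resident on-chip; with a cache, the hardware keeps the hot working set close without any staging code.

Code:
// A minimal sketch of a software-managed local store versus a hardware cache.
// Sizes and names are illustrative, not any real chip's.
#include <cstddef>
#include <cstring>
#include <vector>

constexpr std::size_t kTileFloats = 4096;          // 16 KB tile, fits on-chip
alignas(64) static float local_store[kTileFloats]; // stand-in for an SPE-style LS

// Explicit staging: the program decides what lives in fast memory. On Cell,
// these memcpys would be asynchronous DMA transfers (e.g. mfc_get/mfc_put).
void process_tile(const float* src, float* dst, std::size_t n) {
    std::memcpy(local_store, src, n * sizeof(float));   // "DMA in"
    for (std::size_t i = 0; i < n; ++i)
        local_store[i] *= 2.0f;                         // work entirely on-chip
    std::memcpy(dst, local_store, n * sizeof(float));   // "DMA out"
}

// Cache-based alternative: just touch main memory and let the hardware cache
// keep the hot data close; there is no explicit staging in the code at all.
void process_cached(float* data, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        data[i] *= 2.0f;
}

int main() {
    std::vector<float> in(kTileFloats, 1.0f), out(kTileFloats);
    process_tile(in.data(), out.data(), kTileFloats);
    process_cached(out.data(), kTileFloats);
    return 0;
}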

So if we take both into account, wouldn't some sort of multi-chip module make a lot of sense in terms of manufacturing efficiency, especially as it would help keep latency to a minimum and facilitate the use of the GPU's massive parallel processing advantage?
MCMs inject certain inefficiencies into the manufacturing process, though they potentially save on other headaches.
When packaging multiple chips, the probability that one of the chips is bad, that the packaging process for one of the chips fails, or that the package itself is bad increases.
The design and manufacture of the actual MCM is more complex and expensive, so it's not normally done unless other factors really make it necessary.
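A back-of-the-envelope example (with made-up yield numbers) shows how the probabilities compound against an MCM: if each die is 90% likely to be good and each attach step 98% likely to succeed, a two-die MCM on a 99%-yielding substrate comes out only around 77% good.

Code:
// Back-of-the-envelope illustration with assumed, made-up yields: packaging
// several dies multiplies the ways a finished MCM can turn out bad.
#include <cstdio>

int main() {
    const double die_yield = 0.90;  // assumed probability a single die is good
    const double attach    = 0.98;  // assumed per-die attach/packaging success
    const double substrate = 0.99;  // assumed MCM substrate/package yield
    const int    dies      = 2;

    double mcm_yield = substrate;
    for (int i = 0; i < dies; ++i)
        mcm_yield *= die_yield * attach;

    // Prints roughly: single die 90.0%, two-die MCM 77.0%
    std::printf("single die: %.1f%%  two-die MCM: %.1f%%\n",
                die_yield * 100.0, mcm_yield * 100.0);
    return 0;
}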
 
Why not just keep the 8 SPEs while stocking up on regular cores? They are tiny - or certainly will be at 32nm. They could prove useful in a future PS4 CPU as the auxiliary DSPs they really are, while providing full BC at the same time.

Cheers

Uh oh you did not go the DSP route! :p

Anyway yeah, they could do that. In fact technically they'd only need 6 SPEs I think, since 'reserved for OS' considerations could be handled external to the SPEs.

But even though it makes some sense, I just don't see Sony doing it on a completely foreign architecture. No good reason per se, but just a feeling stemming from the present corporate culture at SCE, which seems more passive and less proactive now from a technology standpoint.
 
The SCE-specific fabs were sold to Toshiba (the Nagasaki SOI lines and the OTSS JV lines), and a joint venture was set up to manage them. Sony still holds an output interest at East Fishkill, I think, though that interest probably wouldn't be a leading consideration in what they do.
 
I am a bit out of touch: does Sony still own their own fabs?

What did you vote for in the survey? To maintain Cell? To maintain the SPEs? To switch to x86 cores? ;)

Sony going with Intel for the GPU is the smartest thing they could do to compete against Microsoft and avoid this generation's costs. Intel is the best at manufacturing technology, so they could ensure a 32 nm chip by the time the PS4 comes out, while ATI will struggle much more with that process, and Intel may be the only one to reach 22 nm in 5-7 years.
Besides, Intel is willing to put Larrabee in a console, so they might even be willing to sell it without profit.

Talking about the CPU, it would be a pity to trash Cell; after all, developers are building tools and engines based on it that, as seen in games like Killzone 2 and Uncharted 2, would deliver in spades in a PS4 with a Larrabee inside instead of RSX.

The problem is, would the rest of the developers, the ones that develop multiplatform games, go on making worse versions on PS4 than on Xbox 720 because of sticking with Cell?... Then the only solution would be to trash Cell...

In that case I am sure Sony would go with Intel for everything (CPU + GPU).

On the other hand, for tech geeks like many of us (I won't include all the developers that post on Beyond3D), a Cell v2 + Larrabee would be a dream machine ;).
 
Had the rumor been that Larrabee is taking the place of both the CPU and GPU for the PS4, I could perhaps see a little sense in it. Larrabee as a GPU with another arch for a CPU just doesn't make any sense to me. Why even bother? This detail alone makes me toss this rumor into the round file, or at least think that if part of it is true, the details are all wrong.

If Sony did initially think about a CELL only PS3 but ultimately added the RSX anyway because the CELL wasn't a good target for texture operations, a Larrabee only PS4 doesn't seem all that far away from the goal of a fully unified massively parallel software rendering platform.

Wasn't it posted somewhere that Larrabee's single-thread scalar performance is about 1/3 that of an equivalently clocked x86 core with all the fancy out-of-order execution, register renaming, etc.? Likely Larrabee could be clocked higher and make up some of that difference. Could Larrabee's scalar performance on just one core be around, or likely better than, what we have now with the PPE? Why even bother with a separate CPU? Game developers are going to have to get used to (some are already there) single-thread performance being nearly maxed out soon; the only future is parallel programming.

Why would anyone want a separate CPU, when you then have the same limitations we do now? Larrabee, with a custom software renderer, has the possibility of removing the issues with CPU-side visibility and display-list traversal, limits on the number of draw calls, etc. Why go through the mess of SPUs (or something else) doing culling, skinning and more, all of which has to make a round trip through main memory, when with something like Larrabee you simply do the scene traversal effectively GPU-side and everything stays in the cache? All this complexity and overhead goes away with Larrabee as both the CPU and GPU.
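As a rough illustration of what "GPU-side scene traversal" buys you, here's a minimal sketch (standard sphere-versus-frustum culling; all names are made up) of the per-object test that today gets farmed out to SPUs and round-tripped through main memory. On a single Larrabee-style chip, the resulting visible list could be consumed by the renderer straight out of cache.

Code:
// Illustrative only: the kind of per-object culling the post argues could run
// on the same chip that rasterizes, so the visible set never has to round-trip
// through main memory. Standard plane/sphere math; all names are made up.
#include <cstddef>
#include <vector>

struct Plane  { float nx, ny, nz, d; };       // n.p + d >= 0 means "inside"
struct Sphere { float cx, cy, cz, radius; };  // object bounding sphere

bool inside_frustum(const Sphere& s, const Plane* planes, int count) {
    for (int i = 0; i < count; ++i) {
        const Plane& p = planes[i];
        float dist = p.nx * s.cx + p.ny * s.cy + p.nz * s.cz + p.d;
        if (dist < -s.radius)
            return false;                     // fully outside one frustum plane
    }
    return true;
}

// Builds the visible-object list; on a unified CPU/GPU this could feed the
// renderer directly from cache instead of being written out to main memory.
std::vector<std::size_t> cull(const std::vector<Sphere>& objects,
                              const Plane frustum[6]) {
    std::vector<std::size_t> visible;
    for (std::size_t i = 0; i < objects.size(); ++i)
        if (inside_frustum(objects[i], frustum, 6))
            visible.push_back(i);
    return visible;
}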
 
Talking about the CPU, it would be a pity to trash Cell; after all, developers are building tools and engines based on it that, as seen in games like Killzone 2 and Uncharted 2, would deliver in spades in a PS4 with a Larrabee inside instead of RSX.

Same goes for a PS4 with whatever NVidia is doing for the GT3xx.

On the other hand, for tech geeks like many of us (I won't include all the developers that post on Beyond3D), a Cell v2 + Larrabee would be a dream machine ;).

Speaking from a developer's perspective, Cell v2 + Larrabee would be a nightmare machine. You'd have two vector machines, each with a completely different architecture and a completely different memory/cache model, doing nearly the same type of workload. You would have to carefully engineer and design separate systems to keep each (Cell v2 and Larrabee) fully busy, with a huge amount of re-programming and re-engineering to transfer work between the two types of processor. And if you are not a Sony studio, a lot of that PS4 code isn't going to get used on the 720.
 
Cell2 + Larrabee makes absolutely no sense. None at all.

If memory serves, Sony signed a contract to collaborate with Nvidia going forward. Correct me if I am wrong. We should also temper any Larrabee predictions with whether it would actually perform better in the system at its primary task than something from Nvidia or AMD/ATI.
 
fudzilla said:
Fortunately, Sony Computer Entertainment has moved quickly to deny this latest rumor that Intel will be producing the GPU for the PlayStation 4. To further correct the given claims, TechRadar spoke with a Sony Computer Entertainment Europe representative who clearly stated, "it's nonsense, and is quite possibly the best work of fiction I've read, since Lord of the Rings."

http://www.fudzilla.com/index.php?option=com_content&task=view&id=11885&Itemid=1
 