Predict: The Next Generation Console Tech

Status
Not open for further replies.
Will they offer backwards compatibility from the get-go? What options do they have here? If they go to another architecture, could they simply sell the PS3 as an SoC on an add-in card, much the same as how Microsoft offers HDD swapping?
 
Would you give up, say, a bank of 200 shaders (AMD style) in order to have, say, another 20MB of EDRAM on top of everything else? Sure it's nice to have, but how much of a tradeoff would you be willing to make to get it, assuming you're designing a system for yourself?
I am not even sure I would personally want any MSAA hardware in the next-generation console. It of course depends on how much die space all that extra hardware takes (we have to assume MSAA would be at least 8x). I would take double ROPs (double fillrate) any day over 8xMSAA hardware; double fillrate also requires less bandwidth than 8xMSAA. Without MSAA the EDRAM would offer less benefit.

Hierarchical Z-buffering and Z/color compression techniques have evolved since the last generation of consoles. With all the new bandwidth-saving technologies in place, it would likely be more cost effective to simply have separate graphics memory instead of EDRAM. However, having UMA (shared GPU & CPU memory and memory bus) without EDRAM sounds like suicide. And without UMA you will likely have more GPU->CPU latency, since you have to transfer data from graphics memory to main memory (mixed CPU/GPGPU calculation would suffer). As long as the extra latency is under one frame, it doesn't matter for most algorithms (virtual texturing, for example).
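The double-ROPs-vs-8xMSAA bandwidth tradeoff above can be made concrete with a back-of-the-envelope sketch. All numbers here are illustrative assumptions (resolution, frame rate, bytes per sample), and the flat cost model deliberately ignores the Z/color compression that real GPUs apply:

```python
# Raw render-target write traffic, with illustrative assumptions:
# 1280x720 target, 60 fps, 4-byte color + 4-byte depth per sample,
# no compression (real GPUs compress both color and Z).
W, H, FPS = 1280, 720, 60
BYTES_PER_SAMPLE = 4 + 4  # RGBA8 color + 32-bit depth/stencil

def target_gbps(samples_per_pixel):
    """Write bandwidth of one full pass over the render target, in GB/s."""
    return W * H * samples_per_pixel * BYTES_PER_SAMPLE * FPS / 1e9

no_msaa = target_gbps(1)  # ~0.44 GB/s per pass at 1 sample/pixel
msaa_8x = target_gbps(8)  # 8x the raw traffic; doubling ROPs instead
                          # doubles fillrate for only ~2x this figure
```

Even with compression shrinking the real numbers, the ratio is the point: 8xMSAA multiplies worst-case render-target traffic by eight, while doubled ROPs only double it.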

DX11 hardware also has new compressed HDR texture formats (RGB9E5 shared exponent and BC6 block-compressed HDR). An 11f-11f-10f render target is also a good way to save bandwidth compared to 4x16f. These formats help with the GPU bandwidth bottleneck when rendering/sampling HDR data. With the current limited formats, we have to waste ALU instructions when storing/reading HDR data (LogLUV, RGBE/RGBM, etc.) or use a wider format that needs twice the bandwidth.
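As a concrete illustration of the shared-exponent idea, here is a minimal sketch of RGB9E5 packing and unpacking following the format's published definition (three 9-bit mantissas sharing one 5-bit exponent, bias 15, in 32 bits); the function names are mine:

```python
import math

# RGB9E5: three 9-bit mantissas sharing one 5-bit exponent (bias 15),
# packed into 32 bits. Constants follow the published format definition.
N, E_BIAS = 9, 15  # mantissa bits per channel, exponent bias
MAX_VAL = (2**N - 1) / 2**N * 2**(31 - E_BIAS)  # largest representable value

def encode_rgb9e5(r, g, b):
    """Pack three non-negative floats into a 32-bit RGB9E5 word."""
    r, g, b = (min(max(c, 0.0), MAX_VAL) for c in (r, g, b))
    maxc = max(r, g, b, 2.0**-16)               # avoid log2(0)
    exp = max(-E_BIAS - 1, math.floor(math.log2(maxc))) + 1 + E_BIAS
    scale = 2.0 ** (exp - E_BIAS - N)
    if int(maxc / scale + 0.5) == 2**N:         # rounding overflowed: bump exponent
        exp += 1
        scale *= 2.0
    rm, gm, bm = (int(c / scale + 0.5) for c in (r, g, b))
    return rm | (gm << 9) | (bm << 18) | (exp << 27)

def decode_rgb9e5(packed):
    """Unpack a 32-bit RGB9E5 word back to an (r, g, b) float tuple."""
    scale = 2.0 ** (((packed >> 27) & 0x1F) - E_BIAS - N)
    return tuple(((packed >> s) & 0x1FF) * scale for s in (0, 9, 18))
```

One 32-bit word thus stores an HDR RGB texel that would otherwise take 8 bytes as 4x16f, halving the bandwidth, at the cost of a shared exponent and no alpha channel.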
 
And without UMA you will likely have more GPU->CPU latency, since you have to transfer data from graphics memory to main memory
No, you don't. From what I know, in the PS3 both the GPU and CPU can directly access each other's memory banks. It's not as low-latency as their local memories, but still fast enough.
 
Part 3: it seems that Larrabee's failure pretty much convinced everybody that the idea is insane, but I'm still not convinced.

Well, I remember that when they talked about designing Cell, they were considering the candidate memory models for a massively multi-core chip, and the design team considered a cache hierarchy like the one proposed for Larrabee unsuitable. That's why Cell went with DMA and local store.

I honestly don't know why the Larrabee project was canned, but Cell is working in the PS3 and Larrabee is vaporware, so I guess the Cell team made the right choice by going with DMA and local store at the time. Maybe something like Larrabee just doesn't work today and is still a decade away.
 
Indeed, I could see flash being a viable medium for the generation after as physical storage.

Flash is perhaps viable as the medium of distribution as well. Imagine quieter operation, little or no loading time, and no more scratched discs. It will add a few bucks to each game, but flash memory's fast random access means the console can make do with less RAM. When backward compatibility is no longer an issue, the optical drive can be omitted too.
 
I don't think the comparison between Cell and Larrabee is valid; they did not pursue the same goals, and there was an order-of-magnitude difference in transistor budget between the two chips.
Cell went with the cheapest solution possible: it puts the entire burden of moving data on the programmers. It wasn't a successful bet either, as the Cell roadmap has been terminated. OK, it actually shipped and sold a bit, but not on the PC market; had it aimed at the PC market (just replace the PPU with an x86 core), it most likely would not have shipped either. Larrabee wanted to enter an existing market, high-end GPUs, and that's tougher than existing "in isolation" for a while.

As you say, Larrabee-like products may not exist for another decade, but that doesn't mean such a product could not exist in isolation (like in a console). The platform is at a performance deficit, and it's indeed suicidal to want to emulate a specific API like DirectX on it; that simply goes against the system's strengths.
In my proposal I tried to make clear that if you compare one such chip to a GPU of the same size, it's a non-match: the many-core chip will get crushed. It's suicidal. But my idea is that if you use two chips, the situation differs a bit. The difference in power between a CPU+GPU and two chips like the one I describe shrinks considerably.
There is also the raw power the chip would aim at: it's way lower than Larrabee's, with a lot fewer cores, a lot of room for optimizations, etc. Intel aimed at 2 TFLOPS on a single chip and failed.
I have no hope that such a chip will ship, but I'm far from sure the concept is doomed, even on today's lithography processes.
 
No, you don't. From what I know, in the PS3 both the GPU and CPU can directly access each other's memory banks. It's not as low-latency as their local memories, but still fast enough.
Yes, you will get more latency. The amount of extra latency of course depends on the memory architecture. I doubt we will get latency as high as PC hardware does (where data needs to be copied over the PCI Express bus), but if the new console uses more mainstream hardware (with completely separate graphics memory), we have to expect higher CPU->GPU->CPU latency as well.

And nothing really beats the latency of sending commands directly from CPU L2 cache to GPU. I really wish future fusion chips would support something like that also. Fast CPU<->GPU callbacks would offer a huge array of new options.
 
What about Series 6? Will it?
That's a good question, and we have only a little information*, e.g. one PowerVR Series 6 core has almost the same performance as two PowerVR Series 5 SGX543 cores (MP2) at the same clock (generally 200MHz for smartphones), plus a new shader architecture called USSEx**. They also have the Caustic ray-tracing software and hardware engine ( http://www.caustic.com/ ).


* http://www.electronista.com/articles/11/02/18/imagination.slips.out.first.powervr.sgx600.details/

**
http://en.wikipedia.org/wiki/PowerVR

Some info here too:
http://www.anandtech.com/Show/Index/4225?cPage=6&all=False&sort=0&page=5&slug=the-ipad-2-review

A very interesting interview about the PowerVR architecture:

http://www.gamesindustry.biz/articles/digitalfoundry-powervr-tech-interview

Same Interview here
http://www.neogaf.com/forum/showthread.php?p=27128124

Dream mode here... I think a customized PowerVR Series 6 SGX600 MP16 (800MHz, 60-80 watts, 1.68 TFLOPS, 150-200 mm² at 28nm) could be a very interesting option for the PS4.
 
Cell went with the cheapest solution possible: it puts the entire burden of moving data on the programmers. It wasn't a successful bet either, as the Cell roadmap has been terminated. OK, it actually shipped and sold a bit, but not on the PC market; had it aimed at the PC market

Cell doesn't need a roadmap; all it needs is a mass-market device to be put in.

Once Kutaragi left, it no longer had to be aimed at the PC market. The PS3 could possibly sell another 50 million consoles in the next 5 to 10 years and have an install base of 100 million, which would make it very successful. As for the case of 'not worth the investment': if it were in the PS4, they could recoup the costs over two generations and an install base of possibly 200 million.

As with the PS3, Sony "could" throw whatever they want into the PS4 and know that devs are going to have to learn it, just like the PS3; they might as well throw in something devs already know. Porting between PS3 and PS4 would also be made easier with Cell (along with backwards compatibility).

Also, your "cheapest solution possible" is not the case at all: silicon used for managing data is silicon that could have been used for execution. Cell is about processing power; at the time, Intel was still releasing Core Duos.
 
Cell doesn't need a roadmap; all it needs is a mass-market device to be put in.

Once Kutaragi left, it no longer had to be aimed at the PC market. The PS3 could possibly sell another 50 million consoles in the next 5 to 10 years and have an install base of 100 million, which would make it very successful. As for the case of 'not worth the investment': if it were in the PS4, they could recoup the costs over two generations and an install base of possibly 200 million.

As with the PS3, Sony "could" throw whatever they want into the PS4 and know that devs are going to have to learn it, just like the PS3; they might as well throw in something devs already know. Porting between PS3 and PS4 would also be made easier with Cell (along with backwards compatibility).

Also, your "cheapest solution possible" is not the case at all: silicon used for managing data is silicon that could have been used for execution. Cell is about processing power; at the time, Intel was still releasing Core Duos.
Blah blah; the point is that IBM cut it, Toshiba cut it, and for Sony it's still unclear. No roadmap, no further development, that's it; Linux development stalled really early, etc.
Sony sells 50 million PS3s; how is that related to Cell? Do you have any proof that with something else they would not have sold more units? Actually, looking at early PS3 games, one may wonder if they could indeed have sold more.
They will sell some more Cells? They will sell some more PS3s, and that's quite different from selling Cell to customers. On top of that, it's not like Sony can go with something else and still sell a PS3 as a PS3.
Are you sure IBM, Sony and Toshiba managed to offset the R&D cost of Cell? My best-case scenario is that they'll break even. Sony doesn't make money by selling Cell; they make money from the PS3 as a whole, and actually it's more like they lose a lot of money. Toshiba most likely loses money too. I'm not sure IBM manages to break even by selling tens of thousands of chips for HPC.
And yes, it is the cheapest solution; you say it too: they put in more execution units and left all the burden on developers. It's cheap in silicon, it's power efficient, it's powerful; the point is the market rejected it.

You take one sentence of my post in isolation and push more-than-iffy arguments. My point is that, like Cell, a many-core chip may find a place in a console, which is really different from shipping the part on its own into an already established market (like the CPU and GPU markets). Is it worth the R&D? That's pretty questionable. Is it doable? That's questionable too. But jumping to the conclusion that Cell is successful is a joke: where are the AMD, Intel, etc. parts to counter such a threat? Why has nobody copied the design? Why is there no further development?
 
Now that's rich... I'm supposedly the one stuck in a 2006 stance, when you're the one jumping the gun because in one sentence of my post I dared to criticize "Da Cell". Sorry, but that's how YOU sound.

Cell was not intended for the PS3 only, go figure; whether it makes sense or delivers in the PS3 is not relevant to what I was saying. Larrabee was not going to ship "in isolation" either.
The EE+GS was successful because it allowed pretty outstanding things, ended up cheap, and Sony made money. How does Cell compare to that, taking into account the huge R&D budget, the partners, etc.?
I guess for you the answer is "it's great, and I'll jump the gun on anybody saying anything about it"?

Now can we please get back to many-core designs, PowerVR products, or anything relevant to the discussion? Even an evolution of Cell that could make sense for Sony in the PS4?
 
Definitely. I've never understood why people suggest Power7 when it's a server processor. Well, I guess it's a higher number than other Power cores, so must be better. ;)

At least (7/5) times faster per core than the 360 dev kits!

I also wonder if ARM can't be wielded. As long as they have a scalable system where a console could use loads of cores, it could do the job: effectively being what Cell set out to be, a scalable architecture with portable code between devices, but starting from the power-efficient end and scaling up, instead of Cell, which started from the high-performance end and never managed to scale down competitively. ARM cores are suitably titchy; they'd just need the right topology. This would be the best solution IMO.

The same cores could be used in a mobile, tablet, netbook, laptop and console, running exactly the same code, just scaled accordingly, without needing middleware getting in the way of code efficiency. One architecture to rule them all! ;) Combine it with SGX's scalability, and the final console may not be the most powerful, but if it was bytecode compatible with the handheld and tablet versions, it'd offer superb value. Buy one game, play it on all devices you own. Stream content from one device to another. The requirement for a single set of service and application code would make maintaining complex device interactions nice and easy, unlike maintaining a middleware like PSS across very varied hardware.

Well, it'd save on running everything for the handheld and tablet as managed code, like XNA and Windows Phone 7. And the NGP is ARM 9, and the PS4 could be ARM 15, and 15 is higher than 9, so it could still be a lot faster ...

If Sony doesn't go with the likely candidate of IBM, then maybe ARM is a possibility after all ...

OOOE will fit in better with big developers with a mix of abilities, allowing junior programmers to churn out functional, if not pretty, code that the processor does a reasonable job of running, with critical functions given over to the senior programmers to write. Given the complexity of modern development, a console that can offer cheaper, easier development is going to be attractive.

You never know where the next killer app is going to come from, and no-one can guarantee always being the lead platform next gen, given the way this gen is shaping up. If less, more complex cores offered less risk (and more chance to focus on network features and library) maybe that'd be worth it.
 
I've been wondering how high one can actually clock those newer ARMs. I know the A8 in my N900 has been overclocked from 600 to 1700MHz. With decent cooling, I don't think it's out of the question to see ARMs at well over 3GHz.

The stuff I was reading yesterday was talking about A15s going up to 2.5GHz. A couple of years, a few tens of millions of dollars and a big heatsink may well be able to extend that. :D
 
Sony Banking On New Super Fast Cell Processor For PS4 & Bravia TV
http://www.smarthouse.com.au/Gaming/Industry/F5C6F8A6?page=1
IBM sources claim that the new multi-core PowerPC processor, which Big Blue has spent several years developing, is now part of a joint development project with the Japanese company.

Sony recently moved to buy back the Toshiba Cell factory in Nagasaki for $600 million, with some tipping that this will be used to manufacture the processor for Sony, Toshiba and IBM devices.

The new 32nm Cell processor is tipped to be capable of up to 16 SPEs, which would make it twice as fast as the current Cell processor, according to IBM leaks.

Japanese sources claim that Sony is gearing up to manufacture the Cell processor in bulk, with some analysts tipping that the new processor will also appear in Sony notebooks and be built into new Sony Bravia TVs.
 
The stuff I was reading yesterday was talking about A15s going up to 2.5GHz
Was it inside a tablet or netbook that has nearly non-existent cooling and most likely <<20W TDP? Allow it to eat some 100W on the initial release and then it gets interesting :)
 
Sony Banking On New Super Fast Cell Processor For PS4 & Bravia TV
http://www.smarthouse.com.au/Gaming/Industry/F5C6F8A6?page=1

Well, good. I remember reading that IBM got the Sony contract for PS4 after Larrabee fell through, and a new Cell chip makes a very great deal of sense to my way of thinking.

I have no idea how reliable this site is, though. It would be really great to see something official from IBM, perhaps at next year's Hot Chips.
 
I'm pretty sure the part about notebooks is BS, as there aren't that many OSes that run on PowerPC.
I also wonder about the utility of a better Cell in a TV; Cell already delivers enough power for that purpose.
16 SPUs could make sense, but on bulk silicon? Have we seen CPUs running at 3.2GHz made on bulk?
As for the fab, isn't it the one that was delivering the SOI process for Cell?
Waiting for some people to dig further, as I have some paperwork to do now; some of this smells funny.
 