Predict: The Next Generation Console Tech

If, for example, to achieve 100GB/s with XDR2 you need only a 64-bit bus but 200 traces, and with GDDR5 you need a 128-bit bus but only 160 traces, then the best option is GDDR5 because a wider XDR2 bus is out of budget.

It is not as though more traces on the PCB = higher price.
The price of the PCB is determined by total surface area, number of layers, and the design/construction class (if that is the right term in English; a higher design class means narrower traces and insulating gaps).
So whichever memory needs more traces, that doesn't automatically mean a higher price, and there are probably other, more important factors in the decision process.
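For what it's worth, here is a rough back-of-envelope check of the quoted example. The 100GB/s target and the 64-bit vs. 128-bit widths are the post's own figures, not vendor specs; the sketch just shows what each interface would have to sustain per data pin:

```python
# Back-of-envelope check of the quoted example: per-pin data rate needed
# to reach 100 GB/s over a 64-bit (XDR2-style) vs. 128-bit (GDDR5-style) bus.
# The widths and the 100 GB/s target are the post's figures, not vendor specs.

def required_per_pin_gbps(target_gbytes_per_s: float, bus_width_bits: int) -> float:
    """Data rate each pin must sustain to hit the target bandwidth."""
    total_gbits_per_s = target_gbytes_per_s * 8
    return total_gbits_per_s / bus_width_bits

TARGET = 100  # GB/s, from the quoted post

for name, width in [("64-bit bus (XDR2 in the example)", 64),
                    ("128-bit bus (GDDR5 in the example)", 128)]:
    print(f"{name}: {required_per_pin_gbps(TARGET, width):.2f} Gbps per pin")

# Output:
# 64-bit bus (XDR2 in the example): 12.50 Gbps per pin
# 128-bit bus (GDDR5 in the example): 6.25 Gbps per pin
```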
 
Sony basically paid the cost of making XDR mass-production ready. Regardless of whether the consoles themselves are loss leaders, I don't think that much money will be spent on R&D this time... so unless Rambus does a U-turn and actually spends some money on getting its products used instead of getting its patents used, XDR2 seems right out.
 
I don't understand why some of you gang up on Elan; he just wants the new PlayStation to be as powerful as possible, and clever use of available tech and manufacturing processes is the way to go. I still have hope for something interesting from Sony, as CTO Masaaki Tsuruta was talking about through-silicon vias and through-chip interfaces.
Off-the-shelf PC parts will be outright boring.

@Brit: XDR2 is real, so why diss Elan?
 
There's nothing wrong with dreaming, except it doesn't fit in with this Prediction thread. This is not a dream or wish list thread. The predictions are supposed to have some basis in reality.

Thank you, move along now.
 
When I talk about generic stuff I mean things like CPU design, memory design, etc. Pointless, really.

The only thing that is even remotely a PC part in the PS3 is the RSX.
You mean the only things that are even remotely not PC parts are the CPU and the split RAM with XDR. Everything else (ports, GPU, IO devices, optical) is conventional hardware design.

And nVidia shafted Sony in that aspect.
Yeah, 7xxx was only the most powerful GPU architecture available at the time PS3 was being put together... Or rather, to people with a more realistic view of what happened, nVidia didn't shaft anyone, but AMD pulled out a gorgeous architecture that worked extremely well. Credit to AMD, and not criticism of nVidia.

That's why Sony should go for something more powerful and unique that is their signature trademark. Something that would take the PC ages to catch up to. Even now, Cell trumps everything in processing power in terms of cost and wattage.
Unique isn't always better. The rate of GPU progress by those dedicated to its development is massive. Why stick in some random freako GPU like GS^2 if it won't be competitive in features or performance? Or stick in a freako CPU that makes your developers' lives a misery? That's just poor design, putting in something odd just for the sake of being different.

The PS4 must last another 7-10 years.
Why?

Don't understand why some of you gang up on Elan...
Because he's not applying reason to his arguments, but emotional passion and a desire to see his dream hardware made, regardless of how economically poor or technically difficult that may be. The thread isn't supposed to be a dream list, but a considered investigation into what might actually be appearing in the next boxes.
 
There is no point in dreaming about tech that just can't be put in a box that can only cost a couple of hundred bucks.

I wonder how cost-efficient it would be for Sony to design their own CPU/GPU. They would need to invest hundreds of millions in designing a chip they can only sell 50~100 million times, and when they do, they can't even make any real profit on it, as the console needs to be as cheap as possible. Not to mention that it would probably be hard to design a chip out of nowhere when nVidia/AMD are steadily evolving their designs. I really think that if you want highish-end hardware there is no way of getting around going to AMD/nVidia/Intel/IBM and having them design your stuff. Much cheaper, and I doubt it would make much of a difference performance-wise compared to designing your own chips.
 
I wonder how cost-efficient it would be for Sony to design their own CPU/GPU... Not to mention that it would probably be hard to design a chip out of nowhere when nVidia/AMD are steadily evolving their designs.
Exactly. What exactly is a custom part supposed to do that the cheaper options from the experts won't be able to handle? Real-time raytracing? Hardware voxel rendering? We have the likes of IMG who have been in the business as long as anyone fighting to grow into the large GPU market. Sony are supposed to come up with a magic solution from nowhere without prior expertise that trumps everyone else? That's plain unrealistic thinking.
 
I don't understand why some of you gang up on Elan; he just wants the new PlayStation to be as powerful as possible, and clever use of available tech and manufacturing processes is the way to go. I still have hope for something interesting from Sony, as CTO Masaaki Tsuruta was talking about through-silicon vias and through-chip interfaces.
Off-the-shelf PC parts will be outright boring.

@Brit: XDR2 is real, so why diss Elan?


I can only surmise the majority of the folks here are mundane and conservative in nature. Please don't jump on me. I was only joking.


Sony should go big, unique, and separate from the rest. That's what makes them special as a hardware manufacturer. When they copy they become very bad.

Take PS3's graphical power/specs, quadruple them, quadruple them some more, and that's your PS4. Enough to last another 10 years.
 
Sony should go big, unique, and separate from the rest. That's what makes them special as a hardware manufacturer.
You need to extend with an explanation of how, with what design, and how they are going to run that effectively as a business. Your finishing remark that they should release 16x PS3 shows your wishes aren't at all grounded in a business reality.
 
Take PS3's graphical power/specs, quadruple them, quadruple them some more, and that's your PS4. Enough to last another 10 years.

And yet that kind of power is available right now, today from those "generic out of shelf crap parts".

So it begs the question: if that's the kind of power you're looking for, what's your problem with off-the-shelf parts (modified appropriately to work better in a console environment)?
 
I can only surmise the majority of the folks here are mundane and conservative in nature. Please don't jump on me. I was only joking.


Sony should go big, unique, and separate from the rest. That's what makes them special as a hardware manufacturer. When they copy they become very bad.

Take PS3's graphical power/specs, quadruple them, quadruple them some more, and that's your PS4. Enough to last another 10 years.
The question you have to ask yourself is: what did Sony gain this gen from the design of PS3? There are several answers to that question, all of them are bad, and all of them steadily led Sony (along with decisions in other parts of the business) into serious decline.

One more generation like this and they are out. A generation of complicated and expensive hardware that in real-life situations only means losing money, not better-looking games, is not what they need, and they finally got that with Vita, so they are obviously not so dumb.
 
I don't agree with Elan in general (I am taking a shower after this post), but I don't think the problem with XDR2 is manufacturing. Rambus is an IP company. XDR1 sampled later in its development cycle, relative to the PS3, than XDR2 has relative to a potential new console in 2013+. It would be a very similar situation, *if* a company wanted XDR2, to what Sony did with XDR1 -- which is not all that different from MS going to Chartered, TSMC, etc. and having them fab Xenon. And for whatever price premium XDR1 is costing in the PS3, it hasn't slowed cost reduction. So in theory I don't think getting XDR2 to market is the big challenge--no more so than with the PS3, and XDR really wasn't what was killing PS3 component costs.

No, the problem I see is: how would they even use XDR2? AMD has never made an XDR memory controller for an x86 or x64 processor line. Nor has nVidia for their GPUs or ARM line. Nor ATI (AMD Graphics) for their GPU line. IBM's PPC models don't have one, and they had to invest a lot of work to get the EIB for Cell with XDR (a totally custom design). ARM in general doesn't. Memory controllers are NOT trivial. There really is no indication that Sony is doing custom chips, so they are pretty much left contracting customizations of current products. So while a Piledriver x64 core with a different caching architecture and core count is *likely*, a Piledriver with an XDR2 memory interface is *not likely*.

And if I were Sony, why would I even be looking at XDR2? If you are going to invest all the resources into XDR2 development for production and memory interfaces, why not use the same effort to work with one of the DRAM makers to bring in stacked memory? Stacked memory is going to address a) size, b) bandwidth, c) power, d) PCB footprint, etc. issues to varying degrees.

If I had my 300 engineers to allocate to a design problem (XDR2 versus stacked memory for the PS4), I think the one you'd look to push for in the 2013-2014 time window is obvious. XDR2 is living in the past, IMO, and really is inferior to the benefits of stacked memory.
 
You need to extend with an explanation of how, with what design, and how they are going to run that effectively as a business. Your finishing remark that they should release 16x PS3 shows your wishes aren't at all grounded in a business reality.

Is it really so unreasonable to expect 16X? It actually fits very well with Moore's law for a 2014 PlayStation or a 2013 Xbox (8 years, doubling every 2).

Teraflops-wise, if that's anything to go by, it's also at least vaguely possible. If PS3/360 are a quarter TF, Tahiti is near 4TF. I realize we've been ingrained to think Tahiti is unimaginable in an upcoming console, but I'm not 100% sure it is (especially if we get to 22nm by then). It's a ~350mm², 250-watt chip on 28nm. That's not obscene, especially on 22nm.
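As a quick sanity check of that claim, here is the arithmetic spelled out using the post's own rough assumptions (a two-year doubling period, ~0.25 TFLOPS for PS3/360, ~4 TFLOPS for Tahiti; estimates, not exact specs):

```python
# Sanity check of the "16x in 8 years" claim using the post's own rough
# numbers: a doubling every two years, PS3/360 at ~0.25 TFLOPS, Tahiti at ~4 TFLOPS.

years = 8
doubling_period = 2                        # years per doubling (assumption)
moore_factor = 2 ** (years / doubling_period)
print(f"Doubling every {doubling_period} years over {years} years: {moore_factor:.0f}x")

ps360_tflops = 0.25                        # rough figure from the post
tahiti_tflops = 4.0                        # rough figure from the post
print(f"Tahiti vs. PS3/360: {tahiti_tflops / ps360_tflops:.0f}x")

# Output:
# Doubling every 2 years over 8 years: 16x
# Tahiti vs. PS3/360: 16x
```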

Note I don't expect 8GB of RAM, but I also don't think it's totally out there, especially if they go with some DDR3 main RAM/cache + GDDR5 video RAM type setup, as some suggest.

I do think this thread in general tends to be overly pessimistic. If you look at all the UE4 talk Mark Rein is doing out of GDC, nothing there suggests we're looking at less than some powerful machines.
 
The issue with 8GB of memory is that current memory chips top out at 4Gb (i.e. 512MB), so hitting 8GB of DDR3 means 16 chips. But then again, who says it has to be on the PCB? With how cheap DRAM modules are on the PC, they could always look in that direction (crazy, I know) if someone felt that size was necessary. I didn't see any info on DDR4's density, so it probably isn't above 4Gb either. But it begins production at the end of 2012 with 2013 availability. It should offer some speed improvements but, more importantly, lower power.
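For reference, here is the chip-count arithmetic behind that remark, with the density ceiling spelled out (4Gb parts as the densest available is the post's premise, not a guarantee):

```python
# Chip-count arithmetic behind the 8 GB remark: assuming 4 Gbit (512 MB)
# devices are the densest DDR3 parts available (the post's premise),
# how many discrete chips each capacity target requires.

def chips_needed(total_gbytes: int, chip_gbits: int = 4) -> int:
    chip_gbytes = chip_gbits / 8           # 4 Gbit = 0.5 GB per chip
    return int(total_gbytes / chip_gbytes)

for total in (2, 4, 8):
    print(f"{total} GB with 4 Gbit chips: {chips_needed(total)} chips")

# Output:
# 2 GB with 4 Gbit chips: 4 chips
# 4 GB with 4 Gbit chips: 8 chips
# 8 GB with 4 Gbit chips: 16 chips
```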

Personally, it looks like late 2012/early 2013 is an inflection point for a LOT of new technology (stacked memory, stacked chips to a degree on silicon interposers, FinFETs) that addresses a lot of power and performance issues. 2013 seems like a bad time to try to push out a new platform, especially if you are going to be relegated to 28nm (which will be a two-year-old process by then). I am sure there are strategic reasons not to aim for 2014, but it seems there would be a lot more clarity then in regards to memory and other problems. And some of the nice toys, like SSDs, only become more beneficial over time.
 
Is it really so unreasonable to expect 16X? It actually fits very well with Moore's law for a 2014 PlayStation or a 2013 Xbox (8 years, doubling every 2).

Teraflops-wise, if that's anything to go by, it's also at least vaguely possible. If PS3/360 are a quarter TF, Tahiti is near 4TF. I realize we've been ingrained to think Tahiti is unimaginable in an upcoming console, but I'm not 100% sure it is (especially if we get to 22nm by then). It's a ~350mm², 250-watt chip on 28nm. That's not obscene, especially on 22nm.

Note I don't expect 8GB of RAM, but I also don't think it's totally out there, especially if they go with some DDR3 main RAM/cache + GDDR5 video RAM type setup, as some suggest.

I do think this thread in general tends to be overly pessimistic. If you look at all the UE4 talk Mark Rein is doing out of GDC, nothing there suggests we're looking at less than some powerful machines.

To the bolded: it is, when the 90nm RSX was somewhere around 70W, based on Cell being ~50W at 90nm according to this link (if my maths is right).

http://realworldtech.com/page.cfm?ArticleID=RWT022508002434&p=1

45nm Cell is under 20W, with a ~60% power consumption reduction from 90nm, which I make ~50W at 90nm.
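Spelling out that back-calculation (the ~20W and ~60% figures are as read from the linked article above; the result is an estimate, not an official spec):

```python
# Back-calculating 90nm Cell power from the figures cited above:
# ~20 W at 45nm, said to be a ~60% reduction from the 90nm part.
# These are the post's estimates, not official specs.

cell_45nm_watts = 20.0
reduction_from_90nm = 0.60                 # fractional power reduction, 90nm -> 45nm

cell_90nm_watts = cell_45nm_watts / (1.0 - reduction_from_90nm)
print(f"Estimated 90nm Cell power: ~{cell_90nm_watts:.0f} W")

# Output:
# Estimated 90nm Cell power: ~50 W
```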

I can't see any next gen system going for a 200-250W GPU.
 
The issue with 8GB of memory is that current memory chips top out at 4Gb (i.e. 512MB), so hitting 8GB of DDR3 means 16 chips. But then again, who says it has to be on the PCB? With how cheap DRAM modules are on the PC, they could always look in that direction (crazy, I know) if someone felt that size was necessary. I didn't see any info on DDR4's density, so it probably isn't above 4Gb either. But it begins production at the end of 2012 with 2013 availability. It should offer some speed improvements but, more importantly, lower power.

If they can achieve 1GB of ultra-fast memory on an interposer next to the GPU, and 4GB of DDR4 on a 64-bit DDR4 channel for the CPU, I believe you've got a hell of a great system there.

If it can all fit on an interposer, with the DDR4 off-die / off-chip, you can get ultra-fast communication between the CPU and GPU, but that requires concerted design and a not-so-off-the-shelf CPU.
 
The question is: can they get it in 2013? But then again, maybe we should be looking more toward 2014? Because, yes, based on the "tech that would be available", your first scenario would really fit the traditional mold for a console, and would be something that could be exploited in neat ways in a closed-box system for years.
 