Predict: The Next Generation Console Tech

Parts of a console are always outdated in no time. It usually takes a pretty exotic architecture choice, like Cell, for it not to be outdated within the first few years of release. The way the PC upgrade cycle moves, everything will be outdated pretty quickly. So it's pretty pointless to try to design a console that won't be outdated by PC tech quickly; it's far more effective to pick the price you want to sell at and then design for the most bang for your buck.
 
Remember when new consoles actually rivaled if not surpassed cutting edge PCs?

The PS1 did for a while. It came out in December 1994 in Japan and wasn't matched until mid/late 1996.

The N64 surpassed PCs for a very short time.

The Dreamcast surpassed PCs in 1998.

The PS2 surpassed PCs in 2000.

The original Xbox surpassed/rivaled PCs in 2001.

It wasn't until the GPU market consolidated down to just Nvidia and ATI/AMD, and GPU development exploded, that PCs started surpassing consoles at launch. That only happened with the current generation of consoles.
 
Do you remember when console performance wasn't hindered almost entirely by how much power the box could use? Efficiency makes up for brute force only up to a point, and consoles these days have no hope of even matching one cutting-edge PC graphics card on power consumption grounds alone.

Case in point:

HD 6990 can use >300W

Xbox 360, excluding the power supply, probably used <150W at launch.
 
The 360 was just a bit over 200 watts at launch. Power consumption has become more of a concern in the desktop space lately as well. Perhaps not so much with the 6990, but you can see it was an issue for Nvidia and the 590. I'm sure it's going to be an ongoing concern going forward.
 
Excluding the power supply it's probably ~150W, or close enough. It's better to compare post-PSU numbers, since that makes it an apples-to-apples comparison.
 

What do you mean "excluding the power supply"? Are you talking about efficiency?

If you want to make apples-to-apples comparisons, it's best not to use some number you pulled from your ass, but to find a useful comparison point. Under 75% efficiency seems unlikely even for 2006.
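
To make the wall-power versus post-PSU comparison concrete, here is a quick back-of-the-envelope. The ~200W wall figure for the launch 360 comes from the discussion above; the exact efficiency of the 2006 PSU is an assumption.

```python
# Rough wall-power vs. post-PSU (DC) comparison for the launch Xbox 360.
# The ~200 W wall figure comes from the thread; the efficiency values are
# assumptions bracketing plausible 2006-era PSUs.

wall_watts = 200.0  # measured at the outlet, launch 360 (approx.)

for efficiency in (0.70, 0.75, 0.80, 0.85):
    dc_watts = wall_watts * efficiency  # what the console actually draws post-PSU
    print(f"{efficiency:.0%} efficient PSU -> ~{dc_watts:.0f} W DC")

# 75-80% efficiency puts the post-PSU draw at roughly 150-160 W, which is
# where the "~150 W excluding the power supply" estimate comes from.
```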
 
I hope so. Half the fun of new consoles is the exotic architectures. I hope we don't get bog-standard CPUs and GPUs this gen; that'd be boring.

It's coming down to a CPU and a GPU everywhere, even on cellphones and handhelds now.

Also, when there is funny stuff (vector processing, image processing, decoding/encoding coprocessors, encryption, etc.), it's there but silently buried in the general or graphics processor rather than on separate chips. So you don't get a funny SNES, Saturn or Amiga anymore with various RAM pools and dedicated chips. Welcome to the era of the ARM SoC, Intel Sandy Bridge and AMD Fusion.

There ought to be exotic architecture, a "sea of many cores": Nvidia Denver, Intel's experiments? But I won't be surprised if it's for the gen after next gen, around 2020.
You need the transistor, power and bandwidth budget.
 
Yeah, in the Wii either Hollywood or Broadway has an integrated ARM CPU (aka Starlet) that runs the I/O and some OS and security features. ;) The Wiibrew people call it the chip that most makes the Wii different from the Cube. Not many people seem to be aware of this little middleman in there. Understanding it was instrumental for homebrew.

Honestly, I would be surprised if Nintendo went with an off-the-shelf chip. They would most likely lose the eDRAM that they've had since the Cube. Backward compatibility would be a headache, and it's obvious they took that very seriously with the Wii. So I'm expecting something custom, and maybe not all that powerful. Who knows, though.

With the Wii, people were speculating on an X1600, but that sure didn't happen.
 
It's coming down to a CPU and a GPU everywhere, even on cellphones and handhelds now.

Also, when there is funny stuff (vector processing, image processing, decoding/encoding coprocessors, encryption, etc.), it's there but silently buried in the general or graphics processor rather than on separate chips. So you don't get a funny SNES, Saturn or Amiga anymore with various RAM pools and dedicated chips. Welcome to the era of the ARM SoC, Intel Sandy Bridge and AMD Fusion.

There ought to be exotic architecture, a "sea of many cores": Nvidia Denver, Intel's experiments? But I won't be surprised if it's for the gen after next gen, around 2020.
You need the transistor, power and bandwidth budget.

What do you think the transistor budgets are (or in what time frame will the consoles debut, as dictated by manufacturing process)? IMO this is the only discussion that makes sense, especially if someone goes for custom hardware. I think any predictions as to architecture are venturing too far into the unknown.

I also disagree with the statement that we are evolving towards a CPU and GPU only; I think vector co-processors will evolve alongside them, and maybe local eDRAM-type storage as well.
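
On the transistor-budget question, a crude way to frame it is to scale density with the square of the feature size. This is only a rough sketch: real densities don't track feature size this cleanly, and the node choices below are assumptions for illustration.

```python
# Crude transistor-budget scaling: density is assumed to go roughly with
# 1/(feature size)^2, which is optimistic but fine for a ballpark.

def density_scaling(old_nm, new_nm):
    return (old_nm / new_nm) ** 2

# Relative to the 90 nm chips the PS3/360 launched on:
for node in (45, 40, 32, 28):
    print(f"90 nm -> {node} nm: ~{density_scaling(90, node):.1f}x the transistors "
          f"in the same die area")
```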
 
What do you mean "excluding the power supply"? Are you talking about efficiency?

If you want to make apples-to-apples comparisons, it's best not to use some number you pulled from your ass, but to find a useful comparison point. Under 75% efficiency seems unlikely even for 2006.

All I was saying is that a high-end GPU like the 6990, which is rated at twice the power of the Xbox 360 at launch, is the reason why consoles will never again approach the performance of high-end PC parts, though even discussing this is kinda stupid.
 
And power use as well.
Graphics cards used to have no heatsinks :LOL:

Before the R9700 was introduced in the autumn of 2002, consumer graphics cards kept within the 25W limit imposed by the AGP standard. In the 8-9 years since, the power draw of desktop graphics cards has skyrocketed by more than a factor of ten in order to drive expanded feature sets, along with improvements in performance.

The Wii, at introduction and including power supply losses, drew less than 20W for the whole system going full blast. If the rumoured pictures of the Wii2 system aren't faked (they may well be), Nintendo is again building something with very modest cooling needs. Many have speculated on AMD HD4770-level graphics, but that card had a TDP of around 75 watts at 40nm, and moving to 28nm won't drop that down to Wii levels, even for the graphics alone. Speculation is difficult without any idea of Nintendo's power draw design targets, but it's difficult to see how they could manage even half a HD4770 and keep within the Wii power draw envelope for the console as a whole.

Power draw is really the number one constraining design target from a technological point of view.
It is actually remarkable how little the game performance of graphics cards, at the same level of power draw, has changed in the last decade. Pretty much all of the lithographic advances have been spent on advancing the feature set of the GPUs (partly in an attempt to give the graphics IHVs another business leg to stand on outside gaming). If Nintendo manages to match the current PS360 in performance at less than twice the power draw of the Wii, they've done a great job, because I'm not convinced it's possible to do within the same power envelope as the Wii even at 28nm. For instance, AMD's projected low-end "Thames" mobile part on 28nm is targeted at the 15-25W power bracket, and even at twice the projected performance of the current low end (Seymour, with 160 SPUs) it is no speed demon, and it is still a factor of two too high to fit within the limits of the Wii. The Wii never got much credit around here for its low power draw, even after the problems of the PS3 and 360. Nor will the Wii2, I guess. But when you sit down and try to fit the whole device into 15W DC, current PC tech is not your friend.
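
To put rough numbers on the factor-of-ten growth and the Wii-envelope argument, here is a small sketch using the figures quoted above. The 28nm shrink factor is an assumption, not a measured value.

```python
# Back-of-the-envelope for the power-draw argument above.
# Figures are the ones quoted in the post; the 28 nm shrink factor is an
# assumed ~0.6x power at the same performance, which is generous.

agp_limit_w   = 25.0    # pre-R9700, AGP-era ceiling
high_end_w    = 300.0   # HD 6990 class
wii_system_w  = 18.0    # whole Wii, at the wall
hd4770_tdp_w  = 75.0    # 40 nm desktop card

print(f"Desktop GPU power growth since the AGP era: ~{high_end_w / agp_limit_w:.0f}x")

shrink_factor = 0.6     # assumed power reduction from 40 nm -> 28 nm
hd4770_at_28nm = hd4770_tdp_w * shrink_factor
print(f"HD4770-class GPU shrunk to 28 nm: ~{hd4770_at_28nm:.0f} W (GPU alone)")
print(f"Wii power budget for the WHOLE console: ~{wii_system_w:.0f} W")
# Even half of that shrunk GPU (~22 W) would exceed the Wii's total budget,
# which is the point being made above.
```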
 
^ Wouldn't it be better to compare against laptop GPUs, where power/performance matters significantly more and the form factor and packaging considerations are similar, rather than against desktop GPUs, where overall power consumption and the ability to cool the chips are both in luxurious abundance compared to consoles?
 
Before the R9700 was introduced in the autumn of 2002, consumer graphics cards kept within the 25W limit imposed by the AGP standard. In the 8-9 years since, the power draw of desktop graphics cards has skyrocketed by more than a factor of ten in order to drive expanded feature sets, along with improvements in performance.

The Wii, at introduction and including power supply losses, drew less than 20W for the whole system going full blast. If the rumoured pictures of the Wii2 system aren't faked (they may well be), Nintendo is again building something with very modest cooling needs. Many have speculated on AMD HD4770-level graphics, but that card had a TDP of around 75 watts at 40nm, and moving to 28nm won't drop that down to Wii levels, even for the graphics alone. Speculation is difficult without any idea of Nintendo's power draw design targets, but it's difficult to see how they could manage even half a HD4770 and keep within the Wii power draw envelope for the console as a whole.

Power draw is really the number one constraining design target from a technological point of view.
It is actually remarkable how little the game performance of graphics cards, at the same level of power draw, has changed in the last decade. Pretty much all of the lithographic advances have been spent on advancing the feature set of the GPUs (partly in an attempt to give the graphics IHVs another business leg to stand on outside gaming). If Nintendo manages to match the current PS360 in performance at less than twice the power draw of the Wii, they've done a great job, because I'm not convinced it's possible to do within the same power envelope as the Wii even at 28nm. For instance, AMD's projected low-end "Thames" mobile part on 28nm is targeted at the 15-25W power bracket, and even at twice the projected performance of the current low end (Seymour, with 160 SPUs) it is no speed demon, and it is still a factor of two too high to fit within the limits of the Wii. The Wii never got much credit around here for its low power draw, even after the problems of the PS3 and 360. Nor will the Wii2, I guess. But when you sit down and try to fit the whole device into 15W DC, current PC tech is not your friend.

But the Wii at its launch had tech more than 7 years old. And what is the reason to keep the power limit so low? Higher power brings with it higher costs (cooling, packaging, a bigger PSU), but also much higher performance, and performance was the problem with 3rd-party software. I'm not saying they have to launch a 150-200W system, but 100W is reasonable. There are 100W fanless, single-slot cards out there. A tweaked/revised RV740 may hit 50W (at 40nm, because I don't think they will use 28nm) and the entire system could fit into 100W.
But it's true that the devkit doesn't seem power hungry at all. The PSU may even be internal.
[Image: Leaked-3.jpg — rumoured devkit photo]


Trying to guess its size: it seems around 26-30 cm tall (the optical slit in my iMac is 12.5 cm) and 5 to 8 cm thick, which is not far from the original Xbox 360: http://news.teamxbox.com/xbox/8417/Xbox-360-Physical-Dimensions/
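
Purely to illustrate the ~100W budget floated above, here is a hypothetical breakdown; every component figure is a guess for the sake of the arithmetic, not a spec.

```python
# Hypothetical ~100 W console power budget, using the RV740-class ~50 W GPU
# guess from the post above. All component figures are assumptions.

budget_w = {
    "GPU (tweaked RV740-class, 40 nm)": 50,
    "CPU (multicore PowerPC-class)":    20,
    "RAM":                               8,
    "Optical drive, I/O, wireless":      8,
    "PSU losses (~85% efficient)":      13,
}

total = sum(budget_w.values())
for part, watts in budget_w.items():
    print(f"{part:36s} {watts:3d} W")
print(f"{'Total (at the wall)':36s} {total:3d} W")
```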
 
I think it's pretty obvious that the power draw of this thing will be quite modest, but still way more than what the Wii had. Of course more details are needed to know for sure, but the rumoured performance and parts used mean that something like 40W doesn't seem possible. Perhaps something like 70-80W?
 

At the risk of overinterpreting an image that may be fake in the first place, the really small fan exhaust on that box suggests that the first figure you gave, 40W, is a reasonable upper bound. If it needed better ventilation than that little hole can provide with living-room-friendly fan speeds from a correspondingly sized fan, we would have seen a larger exhaust, and possibly intake vents.
Still, the box is substantially larger than the Wii, which implies that greater-than-Wii power draw is accommodated.

The GPU wouldn't have to follow the desktop balance of resources. For instance, it could favour fill rate in order to more comfortably accommodate 1080p, but skimp a bit on shader power. The rumours that it can render to four handheld screens at once imply decent fill rate, in line with 1080p requirements.
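
The four-screens-versus-1080p fill-rate point is easy to sanity-check. Assuming 854×480 handheld screens at 60fps (the resolution is purely an assumption), the pixel throughput lands in the same ballpark as a single 1080p60 stream:

```python
# Pixel throughput: one 1080p60 stream vs. four handheld screens at 60 fps.
# The 854x480 handheld resolution is an assumption for illustration.

def mpix_per_s(w, h, fps, screens=1):
    return w * h * fps * screens / 1e6

main_1080p60   = mpix_per_s(1920, 1080, 60)
four_handhelds = mpix_per_s(854, 480, 60, screens=4)

print(f"1080p @ 60 fps:       ~{main_1080p60:.0f} Mpix/s")
print(f"4x 854x480 @ 60 fps:  ~{four_handhelds:.0f} Mpix/s")
# ~124 vs ~98 Mpix/s -- roughly the same ballpark, which is why decent
# fill rate for four remote screens lines up with 1080p requirements.
```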
 
Yeah, in the Wii either Hollywood or Broadway has an integrated ARM CPU (aka Starlet) that runs the I/O and some OS and security features. ;) The Wiibrew people call it the chip that most makes the Wii different from the Cube. Not many people seem to be aware of this little middleman in there. Understanding it was instrumental for homebrew.
It's an ARM926EJ inside Hollywood, running at the same clock as the GPU (246MHz).
While it did offload the main CPU for lots of stuff (mainly waiting latencies, as I don't see the ARM9 doing much actual work), there's not much you can do with a PowerPC 750 @ 750MHz anyway. Not since the mid-2000s, at least.
Even web browsing feels jerky, even without Flash and at 800×480. Both my Defy and my N8 offer a better web-browsing experience, with all platforms using Opera.



Honestly, I would be surprised if Nintendo went with an off-the-shelf chip. They would most likely lose the eDRAM that they've had since the Cube. Backward compatibility would be a headache, and it's obvious they took that very seriously with the Wii. So I'm expecting something custom, and maybe not all that powerful. Who knows, though.

Looking at the Dolphin forums, you'll see that even a Core 2 Duo @ 2GHz with a GMA 4500 can emulate many games at full speed and native resolution, and an Athlon X2 @ 2.5GHz with a Juniper can render the most demanding games (Mario Galaxy 1 & 2) @ 1080p with 2xMSAA at a rock-solid 60fps.

The latest version of Dolphin uses a JIT recompiler, a DirectX 11 renderer, and OpenCL for a few functions.
Nintendo, with full knowledge of the Wii's hardware and software, should be able to make an at least equally performing emulator for their next console.

So I don't think the lack of eDRAM would be such a deal breaker.
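
For anyone unfamiliar with what a JIT recompiler buys an emulator: instead of decoding every guest instruction on every pass, it translates each block of guest code once, caches the result, and reuses it on later executions. A toy sketch of that dispatch loop follows; this is the general shape of the technique, not Dolphin's actual code, and the "guest program" is entirely made up.

```python
# Toy sketch of a block-caching JIT dispatch loop -- the general technique,
# not Dolphin's implementation. Each guest "block" is modelled as
# (work function, next pc); a next pc of None ends the run.

GUEST_BLOCKS = {
    0x100: (lambda s: s.update(r1=s["r1"] + 1), 0x200),
    0x200: (lambda s: s.update(r1=s["r1"] * 2), 0x300),
    0x300: (lambda s: None, None),
}

block_cache = {}  # guest pc -> "compiled" block (a host callable)

def compile_block(pc):
    """Translate one guest block once. A real JIT emits host machine code;
    here we just wrap the guest block in a Python closure."""
    work, next_pc = GUEST_BLOCKS[pc]
    def compiled(state):
        work(state)
        return next_pc
    return compiled

def run(state, pc=0x100):
    while pc is not None:
        block = block_cache.get(pc)
        if block is None:              # cold block: translate once and cache it
            block = compile_block(pc)
            block_cache[pc] = block
        pc = block(state)              # warm blocks skip translation entirely

regs = {"r1": 1}
run(regs)
print(regs)  # {'r1': 4} -- and the cache now holds all three translated blocks
```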



With the Wii, people were speculating on an X1600, but that sure didn't happen.
I don't really think people were speculating; I think people were hoping that the Wii would be able to bring the same visuals as the X360 but at 480p, which an X1600 would have been able to comfortably achieve (paired with a ~1GHz PowerPC CPU).
It would probably have increased the console's lifespan by 1 or 2 years.
 
Entropy said:
but it's difficult to see how they could manage even half a HD4770 and keep within the Wii power draw envelope for the console as a whole.
Clock speed plays a large part... The 4770 was 750MHz; I'd guess something in the 400-500MHz range for Nintendo (probably closer to 400, imho).
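
Clock speed matters disproportionately because a lower clock usually allows a lower voltage, and dynamic power goes roughly as f·V². A rough sketch, with the voltages being illustrative assumptions rather than HD4770 specs:

```python
# Rough dynamic-power scaling: P is proportional to f * V^2.
# The voltages below are illustrative assumptions, not real HD 4770 figures.

f_desktop, v_desktop = 750e6, 1.10   # desktop-style clock/voltage
f_console, v_console = 400e6, 0.95   # hypothetical down-clocked console part

relative_power = (f_console / f_desktop) * (v_console / v_desktop) ** 2
print(f"Relative dynamic power at 400 MHz: ~{relative_power:.0%} of the 750 MHz part")
# Roughly 40% -- which is why a down-clocked derivative of a 75 W desktop GPU
# can plausibly land in a much smaller console power envelope.
```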
 