"Of course it'll be a big jump compared to current consoles. But at a $400 price point, they will be outdated in no time."
You're delusional if you think they need to be that expensive in 2014 to be a big jump.
Excluding the power supply it's probably around 150W, or close enough. It's better to compare post-PSU numbers, since that makes it an apples-to-apples comparison.
I hope so. Half the fun of new consoles is the exotic architectures. I hope we don't get bog-standard CPUs and GPUs this gen; that'd be boring.
It's coming down to a CPU and a GPU everywhere, even on cellphones and handhelds now.
Also, when there's funny stuff (vector processing, image processing, decoding/encoding coprocessors, encryption, etc.) it's there, but silently buried in the general or graphics processor rather than on separate chips. So you don't get a funny SNES, Saturn or Amiga anymore, with various RAM pools and dedicated chips. Welcome to the era of the ARM SoC, Intel Sandy Bridge and AMD Fusion.
There ought to be exotic architecture, a "sea of many cores": Nvidia Denver, Intel's experiments? But I won't be surprised if it's for the gen after next gen, around 2020.
You need the transistor, power and bandwidth budget.
It wasn't until the GPU market settled down to just Nvidia and ATI/AMD, and GPU development exploded, that PCs surpassed consoles at launch; that didn't happen until the current generation of consoles.
What do you mean, excluding the power supply? Are you talking about efficiency?
If you want to make apples-to-apples comparisons, it's best not to use some number you pulled from your ass, but to find a useful comparison point. Under 75% efficiency seems unlikely even for 2006.
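Just to put numbers on the pre- vs post-PSU distinction, here's a minimal sketch of the arithmetic; the ~200W wall reading and the 75-85% efficiency range are picked purely as illustrative assumptions, not measurements of any particular console:

# Rough sketch: converting a wall-socket (AC) reading into the DC power the
# components actually get, for an assumed PSU efficiency. Numbers are illustrative.
def dc_power(ac_watts, psu_efficiency):
    """DC power delivered = AC power drawn at the wall * PSU efficiency."""
    return ac_watts * psu_efficiency

wall_draw = 200.0  # assumed wall-socket reading in watts (illustration only)
for eff in (0.75, 0.80, 0.85):
    print(f"{wall_draw:.0f}W AC at {eff:.0%} efficiency -> ~{dc_power(wall_draw, eff):.0f}W DC")

At 75% efficiency a 200W wall figure works out to the ~150W post-PSU number mentioned earlier, which is the whole point of comparing after the power supply.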
And power use as well.
Graphics cards used to have no heatsinks.
Before the R9700 was introduced in the autumn of 2002, consumer graphics cards kept within the 25W limit imposed by the AGP port standard. In the 8-9 years since, the power draw of desktop graphics cards has skyrocketed by more than a factor of ten in order to drive expanded feature sets, along with improvements in performance.
The Wii, at introduction and including power supply losses, drew less than 20W for the whole system going full blast. If the rumoured pictures of the Wii2 system aren't faked (they may well be), Nintendo is again building something with very modest cooling needs. Many have speculated on AMD HD4770-level graphics, but that card had a TDP around 75 watts at 40nm, and moving to 28nm won't drop that down to Wii levels, even for the graphics alone. Speculation is hard without any idea of Nintendo's power draw design targets, but it's difficult to see how they could manage even half a HD4770 and keep within the Wii power draw envelope for the console as a whole.
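A back-of-the-envelope version of that argument, taking the ~75W and ~20W figures from the post above and treating the shrink saving and the "half a chip, roughly half the power" step as rough assumptions:

# Back-of-the-envelope check: does half an HD4770, shrunk to 28nm, fit a
# Wii-like power envelope? The shrink saving factor is an assumption, not data.
hd4770_tdp_40nm = 75.0   # watts, figure quoted in the post above
shrink_saving   = 0.40   # assume a 40nm -> 28nm shrink cuts power by ~40% (guess)
half_chip       = 0.5    # "half a HD4770": half the units, roughly half the power

gpu_estimate = hd4770_tdp_40nm * (1 - shrink_saving) * half_chip
wii_system_budget = 20.0  # watts for the whole Wii at the wall, per the post above

print(f"Estimated GPU power: ~{gpu_estimate:.0f}W")
print(f"Whole-Wii budget:    ~{wii_system_budget:.0f}W")
print("GPU alone exceeds the envelope" if gpu_estimate > wii_system_budget
      else "GPU fits (before CPU, RAM, drive, PSU loss)")

Even with a generous shrink assumption, the half-4770 estimate lands above the whole-console Wii figure, before counting anything else in the box.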
Power draw is really the number one constraining design target from a technological point of view.
It is actually remarkable how little the game performance of graphics cards, at the same level of power draw, has changed in the last decade. Pretty much all of the lithographic advances have been spent on advancing the feature set of the GPUs (partly in an attempt to give the graphics IHVs another business leg to stand on outside gaming). If Nintendo manages to match the current PS360 in performance at less than twice the power draw of the Wii, they've done a great job, because I'm not convinced it's possible to do within the same power envelope as the Wii even at 28nm. For instance, AMD's projected low-end "Thames" mobile processor on 28nm is targeted at the 15-25W power bracket, and even at twice the projected performance of the current low end (Seymour, with 160 SPUs) it is no speed demon, and still a factor of two too high to fit within the limits of the Wii.

The Wii never got much credit around here for its low power draw, even after the problems of the PS3 and 360. Nor will the Wii2, I guess. But when you sit down and try to fit the whole device into 15W DC, current PC tech is not your friend.
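One way to read the "factor of two" remark, as a rough sketch; the 15W whole-device DC figure comes from the post above, while the share of that budget the GPU gets is an assumption of mine:

# Sanity check of the "factor of two too high" remark. The GPU's slice of the
# whole-device DC budget is an assumed split for illustration only.
thames_low, thames_high = 15.0, 25.0   # watts, quoted Thames bracket at 28nm
wii_dc_budget = 15.0                    # watts DC for the entire console, per the post
gpu_share = 0.5                         # assume roughly half the budget goes to graphics

gpu_budget = wii_dc_budget * gpu_share
print(f"Quoted Thames bracket: {thames_low:.0f}-{thames_high:.0f}W for the GPU alone")
print(f"Assumed GPU slice of a {wii_dc_budget:.0f}W device budget: ~{gpu_budget:.1f}W")
print(f"Even the low end of the bracket is {thames_low / gpu_budget:.1f}x that slice")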
I think it's pretty obvious that the power draw of this thing will be quite modest, but still way more than what the Wii had. Of course more details are needed to know for sure, but the rumoured performance and parts used mean that something like 40W doesn't seem possible. Perhaps something like 70-80W?
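For what it's worth, here is one hypothetical way a figure in that range could be assembled; every component number below is an invented placeholder, not a known or leaked spec:

# Hypothetical system power budget adding up to roughly the 70-80W guess above.
# All component figures are invented placeholders, not real Wii2 numbers.
dc_budget = {
    "GPU":                30.0,  # watts, guess
    "CPU":                12.0,  # watts, guess
    "RAM + eDRAM":         5.0,  # watts, guess
    "Optical drive":       5.0,  # watts, guess
    "I/O, wireless, misc": 6.0,  # watts, guess
}

dc_total = sum(dc_budget.values())
psu_efficiency = 0.80            # assumed PSU efficiency
wall_draw = dc_total / psu_efficiency

print(f"DC total: ~{dc_total:.0f}W, at the wall: ~{wall_draw:.0f}W")

With these made-up numbers the DC side comes to ~58W, which an 80%-efficient supply turns into roughly 73W at the wall, i.e. in the ballpark being guessed at.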
"It's an ARM926EJ inside Hollywood, running at the same clock as the GPU (246MHz)."
Yeah, in Wii either Hollywood or Broadway has an integrated ARM CPU (aka Starlet) that runs the I/O, some OS and security features. The Wiibrew people call it the chip that most makes Wii different from Cube. Not many people seem to be aware of this little middleman in there. Understanding it was instrumental for homebrew.
Honestly, I would be surprised if Nintendo went with an off-the-shelf chip. They would most likely lose the eDRAM that they've had since the Cube. Backward compatibility would be a headache, and it's obvious they took that very seriously with Wii. So I think I'm expecting something custom, and maybe not all that powerful. Who knows, though.
"With Wii people were speculating on an X1600 but that sure didn't happen."
I don't really think people were speculating; I think people were hoping that the Wii would be able to bring the same visuals as X360 but at 480p, which an X1600 would be able to comfortably achieve (paired with a ~1GHz PowerPC CPU).
Entropy said: "but it's difficult to see how they could manage even half a HD4770 and keep within the Wii power draw envelope for the console as a whole."
Clockspeed plays a large part... The 4770 was 750MHz; I'd guess something in the 400-500MHz range for Nintendo (probably closer to 400, imho).
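As a rough illustration of why the clock matters so much: dynamic power scales roughly with frequency times voltage squared. Assuming the 750MHz reference above, a hypothetical ~400MHz target, and an illustrative voltage drop (the voltages here are pure guesses, not real HD4770 figures):

# Rough dynamic-power scaling: P ~ f * V^2 (same chip, lower clock and voltage).
# Voltages below are illustrative assumptions only.
def relative_power(f_new, v_new, f_ref, v_ref):
    return (f_new / f_ref) * (v_new / v_ref) ** 2

f_ref, v_ref = 750e6, 1.10   # HD4770-style clock, assumed reference voltage
f_new, v_new = 400e6, 0.95   # speculated Nintendo clock, assumed lower voltage

ratio = relative_power(f_new, v_new, f_ref, v_ref)
print(f"Roughly {ratio:.0%} of the reference dynamic power at 400MHz")

Under those assumptions the downclocked part lands around 40% of the reference dynamic power, which is why a lower clock target changes the power picture so dramatically.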