Wii U hardware discussion and investigation *rename

Well, I hadn't been following the rumors lately, but before VGleaks there were two guys who mentioned Broadway.

One was Arkam from GAF; he said the term "enhanced Broadway" came from Nintendo themselves. Another guy in this thread, username espresso, said:


If those guys were really devs or had connections, the term "enhanced Broadway" was probably not speculation but came directly from Nintendo. I also don't believe it came from a document/manual, but nobody said it did. I think it's just a brief description of the Wii U on the Warioworld page.

Just FYI (not disagreeing or agreeing), but it seems Arkham was responsible for the VGleaks leak as well. He said as much in a GAF thread about those very leaks. So it wasn't another dev corroborating his information.

Espresso, however, may well be a different person. He seemed to have the exact same information either way, though.
 
That is why I don't think it is Power7-based. Too much work and little "evidence" to support it. I believe the only times Power7 came up were from the Twitter account (PR spin). "Watson" came from Engadget, and we all know how reliable they are.
I also lean more toward Broadway than the 476FP because of BC, perhaps fewer changes in their tools, and their huge experience with it. Those may translate to faster adaptation and lower cost.
 
That is why I don't think it is Power7-based. Too much work and little "evidence" to support it. I believe the only times Power7 came up were from the Twitter account (PR spin). "Watson" came from Engadget, and we all know how reliable they are.
I also lean more toward Broadway than the 476FP because of BC, perhaps fewer changes in their tools, and their huge experience with it. Those may translate to faster adaptation and lower cost.


Am I right in thinking that Broadway CPUs cannot be clocked above 1 GHz? Is it possible it could be clocked higher, and feature OoOE, and still be considered a Broadway CPU? This is a similar query to those asking whether a Power7 could be in there without alterations so drastic that it would no longer really be a Power7. Where's the cutoff?
 
I can't take it anymore. I've been reading this thread for months now, reading people's thoughts, waiting for leaks... I've thoroughly enjoyed the speculation here.

However, I am starting to go a little insane. You'd think that if IBM said they provided Nintendo with Power 7 processors, and then an IBM engineer confirmed it, that no one would be speculating that it's an upgraded Broadway tri-core. Apparently that is not the case!
 
Is it possible it could be clocked higher, and feature OoOE, and still be considered a Broadway CPU?
Yes and no. The names we give things are just based on conventions. The people that name the chip can use whatever name and justification they choose. It's not like it's a public name used to market the chip. If it was, and they called it a POWER7GT but in architecture it was a PPC740, people could complain about being misled. But internally they can call the thing whatever they want, and externally we can call it whatever we want. As we'll likely never know the architecture though, picking a name to mean something to us won't help.

Incidentally Broadway is OoOE AFAIK.
 
However, I am starting to go a little insane. You'd think that if IBM said they provided Nintendo with Power 7 processors
Except they didn't. The tweets are very ambiguous. As I trust you've seen in this thread, trying to untangle the mess of conflicting info isn't straightforward, but looking at all the examples and references and results, don't you agree that it's far more plausible that the rumours of it being a Broadway-type CPU x3 hold a lot more weight than a heavily changed POWER7?
 
CPU technology isn't just cores.
Today, CPU technology is internal bus tech (ring bus...), multi-core tech, complex memory cache tech, multiple I/O and controller tech (RAM, network...), hardware accelerator tech (crypto, compression...), thermal sensor tech, dynamic power scaling...
and process technology (45nm, SOI, eDRAM...).

Maybe IBM drew from the Power7 ecosystem (Power7 is probably the latest modern IBM CPU) for all (or part) of these modern techs, but with PPC476 cores instead of Power7 cores. A Power7 core is too big and inadequate for a Wii U CPU. "Enhanced Broadway" or PPC476 cores is more coherent and logical.
I think everyone is right but isn't talking about the same thing.

For example, Cell technology isn't just the core CPU technology (a PPE like the X360's) but the EIB ring bus technology, the FlexIO I/O technology and XIO RAM controller, the SPU technology, etc.
 
No, I don't, and I also don't think the Power 7 has been changed that much. I also think modifying the Gamecube CPU yet again to be a multicore processor that's competitive with current HD consoles is no less of an engineering challenge than making a Power 7 with fewer cores at a lower clock that consumes less energy.

I think that in a press release IBM is going to be intentionally vague because Nintendo doesn't like to talk about specs, but being vague and being intentionally misleading are very different.

I think people only pay attention to those rumors that support their own internal expectations about the system and ignore the rest.
 
People are trying to find what's true and wondering how things work with what has been leaked.

The fact that the Wii U is backward compatible, for example, leads some people to think there must be hardware support; there's no proof it's not an emulator, though, nor is there proof that the Wii CPU & GPU are embedded in the system.
It's all guesswork.
 
Dunno about that but IBM did make some grand statements in their press release:

The all-new, Power-based microprocessor will pack some of IBM's most advanced technology into an energy-saving silicon package that will power Nintendo's brand new entertainment experience for consumers worldwide. IBM's unique embedded DRAM, for example, is capable of feeding the multi-core processor large chunks of data to make for a smooth entertainment experience.

That may not make it a full-blown Power7, but it hardly sounds like an updated Broadway either.
 
No, I don't, and I also don't think the Power 7 has been changed that much. I also think modifying the Gamecube CPU yet again to be a multicore processor that's competitive with current HD consoles is no less of an engineering challenge than making a Power 7 with fewer cores at a lower clock that consumes less energy.

I think that in a press release IBM is going to be intentionally vague because Nintendo doesn't like to talk about specs, but being vague and being intentionally misleading are very different.

I think people only pay attention to those rumors that support their own internal expectations about the system and ignore the rest.
I mean, I wish they had something better than an enhanced Broadway or custom PPC 470s.
Thing is, searching existing CPUs, the closest thing on the same process (minus eDRAM) is a Phenom II X3 P820 (tri-core @ 1.8 GHz). Its TDP is 25 watts; I don't know how much it pulls at peak performance.
Say IBM really gets it down to 20/25 watts by capping the thermal dissipation; that still doesn't leave much for the rest of the system. So how do you fit a GPU, the RAM, and the optical drive?
I think the HD 6450 has a TDP of 27 watts (that's with GDDR5), and an HD 5450 (DDR3) 19 watts (all data courtesy of AnandTech).
That could fit, but it's a trade-off (it could fit because there is redundancy in taking existing parts, like memory controller power draw), and those GPUs provide nowhere near the performance some people are expecting from the Wii U. Either way, it's trading CPU power for GPU power.
Not to mention that even if IBM can set the TDP of a Power7 (I don't know to what extent they can) to reach such low power dissipation, I guess we're talking about high-bin parts with low leakage that run at pretty low voltage.

It can't be good either way, IMO.

For the part I put in bold, I think you are wrong; look at the investment Intel makes to get power consumption down on their high-performance CPUs. And even with that, the parts that reach really low TDPs have to be carefully selected.
The PPC 476 cores don't look that sucky if you put aside FP performance. I think the problem could have been alleviated with a greater number of cores. It's not like those cores are big (~4 mm^2, per IBM's own data), and implementing more L2 thanks to eDRAM wouldn't have been costly in either silicon or power.

Either way, IBM may sell high-bin parts to Nintendo for real cheap; as they've started transitioning to 32nm, they may plan to have extra capacity soon. Point is, a Power7 will come at a price for the system no matter what. A 20/25-watt best-case scenario for such a CPU is quite a burden when the PSU supports at most 75 watts.
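To put that budget argument in numbers, here's a rough sketch using the example parts above (the wattages are those parts' quoted TDPs, not measured Wii U figures, so treat it as purely illustrative):

```python
# Back-of-the-envelope power budget using the example parts quoted above.
# All wattages are TDPs of off-the-shelf PC parts, NOT measured Wii U numbers.

psu_max_w = 75  # Nintendo's stated maximum, covering the console plus USB accessories

budget_w = {
    "CPU (Phenom II X3 P820-class, tri-core @ 1.8 GHz)": 25,
    "GPU (HD 6450-class, GDDR5)": 27,
}

remaining_w = psu_max_w - sum(budget_w.values())
print(f"Left for RAM, optical drive, USB, Wi-Fi, PSU losses: {remaining_w} W")  # -> 23 W
```

And remember the 75 W figure also has to cover power delivered to USB accessories, so the actual silicon budget is tighter still.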
 
This thread needs some more GPU discussion IMHO.

I did some googling and came up with 400 gflops for PS3's RSX GPU. It should actually be a bit less than that, because it was assuming a 550 MHz clock and we know RSX runs at 500 MHz. Still, that's significantly more than Xenos' 240 gflops. However, as I've learned (correct me if I'm wrong), comparing a flops figure from one architecture to another is often like comparing apples and oranges. So whatever the Wii U's GPU does in terms of gflops may not be comparable to Xenos and RSX.
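For reference, those headline numbers are usually just peak-rate arithmetic. A rough sketch of where the ~400 and ~240 figures come from (the flops-per-clock counts are the commonly quoted marketing/derived values, not confirmed specs, so take them as assumptions):

```python
# Peak-FLOPS arithmetic behind the commonly quoted RSX and Xenos figures.
# The flops-per-clock values are the usual marketing/derived numbers,
# not confirmed specs -- treat them as assumptions.

def peak_gflops(flops_per_clock, clock_mhz):
    """Theoretical peak = flops issued per clock * clock rate."""
    return flops_per_clock * clock_mhz / 1000.0

# RSX as usually counted: 24 pixel shaders * 27 flops + 8 vertex shaders * 10 flops
rsx_flops_per_clock = 24 * 27 + 8 * 10            # = 728
print(peak_gflops(rsx_flops_per_clock, 550))      # ~400 GFLOPS (Sony's 550 MHz figure)
print(peak_gflops(rsx_flops_per_clock, 500))      # ~364 GFLOPS at the real 500 MHz clock

# Xenos as usually counted: 48 unified ALUs * 5 MADs * 2 flops per MAD
xenos_flops_per_clock = 48 * 5 * 2                # = 480
print(peak_gflops(xenos_flops_per_clock, 500))    # = 240 GFLOPS
```

Even then, those are issue-slot counts, not sustained rates.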

Or am I totally wrong?
 
This thread needs some more GPU discussion IMHO.
that's if the thread name is any hint :LOL:
I did some googling and came up with 400 gflops for PS3's RSX GPU. It should actually be a bit less than that, because it was assuming a 550 MHz clock and we know RSX runs at 500 MHz. Still, that's significantly more than Xenos' 240 gflops. However, as I've learned (correct me if I'm wrong), comparing a flops figure from one architecture to another is often like comparing apples and oranges. So whatever the Wii U's GPU does in terms of gflops may not be comparable to Xenos and RSX.

Or am I totally wrong?
FLOPS and sustained FLOPS are different for sure, but I remember Sebbbi stating more than once that, in his experience, the free blending on the 360, and overall the smart eDRAM daughter die, ended up being a greater advantage than Xenos' unified shader architecture (not that the latter doesn't provide an advantage too).
 
Yes and no. The names we give things are just based on conventions. The people that name the chip can use whatever name and justification they choose. It's not like it's a public name used to market the chip. If it was, and they called it a POWER7GT but in architecture it was a PPC740, people could complain about being misled. But internally they can call the thing whatever they want, and externally we can call it whatever we want. As we'll likely never know the architecture though, picking a name to mean something to us won't help.

Incidentally Broadway is OoOE AFAIK.


Understood (and :0 at Broadway OoOE!)

But I didn't mean "can it be called Broadway" in a moral sense; I meant would it be referred to as that in a technical document? Remember where this leak supposedly came from. Surely the only reason for Nintendo themselves to refer to it as such would be to imply compatibility and continuity for developers who have been using Wii dev kits?

I'm so interested to see what they've put in there. Whatever decision they've made will likely raise some eyebrows. Like someone above said, it would surely take as much of their budget/effort to turn Broadway into a tri-core, embedded-DRAM-using CPU as it would to take a Power7 and disable enough of it to make it fit their target TDP.

Interestingly, from what I'm reading, a tri-core Broadway doesn't sound that bad at all, and would potentially wipe the floor with the 360's Xenon and the PS3's Cell in quite a few areas (not so in others). Hmmm.
 
This thread needs some more GPU discussion IMHO.

I did some googling and came up with 400 gflops for PS3's RSX GPU. It should actually be a bit less than that, because it was assuming a 550 MHz clock and we know RSX runs at 500 MHz. Still, that's significantly more than Xenos' 240 gflops.

Some recent developer (possibly Epic in one of their many interviews) referred to Xenos and RSX as ~250 gflops, and that's what I go with. They are pretty close together anyway imo, so it seems likely their real, usable flop count is similar.
 
But I didn't mean "can it be called Broadway" in a moral sense; I meant would it be referred to as that in a technical document? Remember where this leak supposedly came from. Surely the only reason for Nintendo themselves to refer to it as such would be to imply compatibility and continuity for developers who have been using Wii dev kits?
Firstly, I thought the leak came from an artist? Secondly, you pick a name for something based on connotations and on communicating some useful info in a word, if you are going for continuity. When Sony picked the EE and RSX names, they were just marketing gimmicks, but when Intel picked 286, 386, 486, it was to show progression. For Nintendo to pick 'Broadway' in a technical document, that would have to be to reference the old design and convey a similarity, suggesting not a clean break in the architecture. But sadly that doesn't tell us a lot, as if Broadway is considered 'a CPU that runs Wii code', it can be considered Broadway regardless of internal architecture, similar to a processor being referred to as an 'x86'.
 
A full translation of the recent Japanese Nintendo Direct provides some insight:

http://www.neogaf.com/forum/showthread.php?t=492101


This is the Wii U.
The Wii U is Nintendo's first high-definition game console.
Imagery like what you see now that couldn't be done by the 6-year-old Wii is now possible.
Not only is the HD-standard 720p resolution supported, but also "Full HD" 1080p.
On top of the increased resolution, the graphics processor can be used to handle various tasks aside from just graphics.
This usage is called "GPGPU." [Translator's note: GPGPU stands for General-Purpose computing on Graphics Processing Units.]
The Wii U also has 1 gigabyte of memory for games and 1 gigabyte for the system, for a total of 2 gigabytes of main memory.
This is the greatest amount of memory ever in a game console, and over 20 times the main memory found in the Wii.
The large amount of memory allocated to the system allows for switching to an Internet browser, utilizing Miiverse, or other system without ending the game, providing for smooth transitions from the game to functionality to be used in your living room, and back again.
Also, excess system memory is reserved for future feature expansion.

Due to the large amount of main memory, game worlds can be expressed in rich, minute detail, and multiple scenes can be pre-loaded into memory to reduce disk loading times at scene transitions.
The Wii U utilizes a proprietary disk format, the capacity of which is 25 gigabytes.
To match the increased amount of main memory, the read speed can reach up to 22.5 megabytes per second.
Not only is HD game console performance increased, but the system also meets an energy-saving specification even while increasing performance for modern times.
The Wii U is rated at 75 watts of electrical consumption.
Please understand that this electrical consumption rating is measured at the maximum utilization of all functionality, not just of the Wii U console itself, but also the power provided to accessories connected via USB ports.
However, during normal gameplay, that electrical consumption rating won't be reached.
Depending on the game being played and the accessories connected, roughly 40 watts of electrical consumption could be considered realistic.
This energy-saving specification has made this compact form-factor possible.
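The "over 20 times" claim checks out with simple arithmetic, assuming the Wii baseline being counted is the commonly cited 64 MB GDDR3 + 24 MB 1T-SRAM (that breakdown is my assumption, not something stated in the Direct):

```python
# Sanity check on "over 20 times the main memory found in the Wii".
# Assumes the Wii baseline is the commonly cited 64 MB GDDR3 + 24 MB 1T-SRAM;
# that breakdown is an assumption on my part, not something stated in the Direct.

wii_main_mb = 64 + 24          # 88 MB
wiiu_main_mb = 2 * 1024        # 2 GB total (1 GB for games + 1 GB for the system)

print(wiiu_main_mb / wii_main_mb)   # ~23.3x, so "over 20 times" holds

# The quoted disc read speed against the game-side RAM:
print(1024 / 22.5)                  # ~45 s to fill the full 1 GB at 22.5 MB/s
```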
 
Firstly, I thought the leak came from an artist? Secondly, you pick a name for something based on connotations and on communicating some useful info in a word, if you are going for continuity. When Sony picked the EE and RSX names, they were just marketing gimmicks, but when Intel picked 286, 386, 486, it was to show progression. For Nintendo to pick 'Broadway' in a technical document, that would have to be to reference the old design and convey a similarity, suggesting not a clean break in the architecture. But sadly that doesn't tell us a lot, as if Broadway is considered 'a CPU that runs Wii code', it can be considered Broadway regardless of internal architecture, similar to a processor being referred to as an 'x86'.


As far as I'm aware, the leak on VGleaks claimed it was taken from Warioworld. Arkham then mentioned in the resulting NeoGAF thread that that was his info (he didn't sound too pleased with it coming out, so I'm not sure if he actually let it slip to VGleaks himself). It was widely accepted that Arkham was some kind of artist, as he'd either mentioned it before or had at least confirmed he was involved in game production but didn't have access to devkits first-hand.

Edit: Also, from the above Japanese translation:
Iwata said:
Depending on the game being played and the accessories connected, roughly 40 watts of electrical consumption could be considered realistic

This sheds a little more light on the power draw. "Depending on the game" could really imply anything. Considering they are pushing the energy efficiency of the console's hardware, my bet is that this is a very low-end estimate, to make it sound as efficient as possible. Maybe marketing spiel. That could have been measured while playing some eShop Virtual Console game which doesn't tax the GPU at all, and with nothing else connected.

Right now over at GAF there's a huge debate over what's possible on the hardware front given the "Wii U's ~40 W power draw". I really don't think that figure should be taken as a realistic average draw.
 