Wii U hardware discussion and investigation *rename

That would be...no. Unless they rebooted the project.

What makes you think so? These would be 768 ALUs with VLIW4; according to rumors, such a chip (HD7570) has a TDP of 50W at a clock of 750 MHz with 1GB of GDDR5. With further optimization and slightly lower clocks this could come in around 30-40W, and with a good amount of eDRAM/1T-SRAM this would be really competitive (really doubting XboxNext/PS4 will be more than twice as powerful)

EDIT: 768 ALUs × 729 MHz (3× the Wii GPU clock) × 2 FLOPs per ALU per cycle ≈ 1120 GFLOPS, which matches the rumor suggesting around 1 TFLOP
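For anyone wanting to check that arithmetic, here's a quick sketch; the 2 FLOPs per ALU per cycle figure assumes one fused multiply-add per slot, which is the usual convention for quoting these peak numbers:

```python
# Peak shader throughput estimate; assumes 2 FLOPs/ALU/cycle (one MADD),
# the usual convention for peak GFLOPS figures.
def gflops(alus, clock_mhz, flops_per_alu_per_cycle=2):
    return alus * clock_mhz * flops_per_alu_per_cycle / 1000.0

print(gflops(768, 729))  # ~1119.7 GFLOPS, roughly the rumoured 1 TFLOP
```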
 
Sigh, you know you're inevitably going to be proven wrong :LOL: Ahh well. No harm I guess. But I saw so many go through this whole denial routine with the Wii (huge firestorm directed at IGN when they revealed Wii was essentially an overclocked Gamecube).

Revolution VR Megaton revisited.:LOL:
 
What makes you think so? These would be 768 ALUs with VLIW4; according to rumors, such a chip (HD7570) has a TDP of 50W at a clock of 750 MHz with 1GB of GDDR5. With further optimization and slightly lower clocks this could come in around 30-40W, and with a good amount of eDRAM this would be really competitive (really doubting XboxNext/PS4 will be more than twice as powerful)

EDIT: 768 ALUs × 729 MHz (3× the Wii GPU clock) × 2 FLOPs per ALU per cycle ≈ 1120 GFLOPS, which matches the rumor suggesting around 1 TFLOP

From what I understand the architecture is locked down a couple of years before release. It takes months to years to finalise a chip design then many months to turn that into a production chip.

I'm not sure that you can just assume that power consumption for a Nintendo chip can be drastically lower than a PC part, and 30 - 40 W for GPU is almost certainly way over what the current WiiU case/cooling will be happy with.

Best you can hope for at this point is, IMO, Nintendo pushing for higher clocks on what they've already got. Maybe they could partially slide the motherboard out from under the Bluray drive and go for a larger heatsink and a different fan arrangement. Probably nothing will change though.
 
From what I understand the architecture is locked down a couple of years before release. It takes months to years to finalise a chip design then many months to turn that into a production chip.

But so is the architecture for the PC parts; AMD definitely knew what they would bring in 2012 at least a year in advance, and VLIW4 chips were released nearly a year ago. I don't know Nintendo's time frame, but they certainly know better than I do what will be available from AMD in the future, so I think they surely knew about the VLIW4 architecture by mid-2010 already.

I'm not sure that you can just assume that power consumption for a Nintendo chip can be drastically lower than a PC part, and 30 - 40 W for GPU is almost certainly way over what the current WiiU case/cooling will be happy with.

Well, this is the TDP for the desktop part; mobile and/or embedded chips will probably use even less. And I don't think even 40W would be too much, because the IBM CPUs normally aren't very power hungry (I don't think it will be clocked higher than 2.2 GHz), maybe another 20W of TDP.

Best you can hope for at this point is, IMO, Nintendo pushing for higher clocks on what they've already got. Maybe they could partially slide the motherboard out from under the Bluray drive and go for a larger heatsink and a different fan arrangement. Probably nothing will change though.

I think Nintendo proved that they know how to design power efficient packages.
 
But so is the architecture for the PC parts; AMD definitely knew what they would bring in 2012 at least a year in advance, and VLIW4 chips were released nearly a year ago. I don't know Nintendo's time frame, but they certainly know better than I do what will be available from AMD in the future, so I think they surely knew about the VLIW4 architecture by mid-2010 already.

At this point they can't change what they're planning to go into the machine though. So far, no rumours indicate any VLIW4 stuff.

Well, this is the TDP for the desktop part; mobile and/or embedded chips will probably use even less. And I don't think even 40W would be too much, because the IBM CPUs normally aren't very power hungry (I don't think it will be clocked higher than 2.2 GHz), maybe another 20W of TDP.

60W just for the CPU and GPU would probably be above what the 360S dissipates from its processors; Nintendo have the disadvantage of less room for a heatsink and a much smaller fan. I don't think that it's going to happen without an outrageously noisy little fan on the back, which is to say that I don't think it's going to happen.

I think Nintendo proved that they know how to design power efficient packages.

Nintendo have done nothing to prove they are sorcerers, but lots to prove that they are ruthless at keeping costs under control.
 
At this point they can't change what they're planning to go into the machine though. So far, no rumours indicate any VLIW4 stuff.

That's what I wanted to say, though; still, this doesn't mean it wasn't planned from the beginning. Using VLIW5 (or whatever) parts in early devkits doesn't really tell us anything.

60W just for the CPU and GPU would probably be above what the 360S dissipates from its processors; Nintendo have the disadvantage of less room for a heatsink and a much smaller fan. I don't think that it's going to happen without an outrageously noisy little fan on the back, which is to say that I don't think it's going to happen.

Probably not: The original XBox360 used around 180W (Source) in games with a power supply of 203W (List of Revisions). The current XBox360 uses a 115W power supply, so I would say it draws around 90W of power. I think 60-70W TDP would be bearable for the WiiU's size.

EDIT: XBox360 draws around 80-90W in games according to AnandTech

Nintendo have done nothing to prove they are sorcerers, but lots to prove that they are ruthless at keeping costs under control.

Nobody said they are sorcerers, but IMO they did a good job with the Gamecube and the Wii regarding performance/watt.
 
That's what I wanted to say, though; still, this doesn't mean it wasn't planned from the beginning. Using VLIW5 (or whatever) parts in early devkits doesn't really tell us anything.

Well, if they're still using the old shaders in current dev kits (a year after the VLIW4 stuff hit the mass market) then I'd guess that's a sign they're using the old stuff in their new machine. That'd be my interpretation at any rate, though I don't know how long it takes to make revisions to a dev kit.

EDIT: XBox360 draws around 80-90W in games according to AnandTech

That's measured at the wall. On the other side of the power supply you're probably getting about 80% of that, and at least a few watts will go to the DVD drive, HDD, fan, wifi, wireless pads and other processors. With 60 watts of heat coming from the CPU and GPU you're at least in the same ballpark as the 360S, with its big copper-core cooler with a big fan plonked straight on top and an abundance of vents.
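To put rough numbers on that wall-to-silicon conversion (both the 80% PSU efficiency and the ~10W for drive, fan and wireless are assumptions for illustration, not measurements):

```python
# Convert wall-socket draw into a rough CPU+GPU power budget.
# The efficiency and overhead figures here are guesses, not measured values.
def silicon_budget(wall_watts, psu_efficiency=0.80, other_components_watts=10):
    return wall_watts * psu_efficiency - other_components_watts

print(silicon_budget(90))  # ~62 W left for CPU + GPU from a 90 W wall draw
```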
 
I think so too. Probably not the GCN architecture, but one from the performance segment, which is said to still be based on VLIW4 but on 28nm; indeed, Lombok Pro would be a good fit. It would also fit the time frame, because they will start in December AFAIK.

And NEC would be the one most likely making it so they wouldn't have to deal with TSMC or GF on possible supply issues.

That would be...no. Unless they rebooted the project.

Was there something you guys were told that you didn't publish? I'm not doubting you, just asking for confirmation. Part of my basis for this idea came from your article, when you guys said that it hadn't been taped out yet. From a heat perspective it seems like a logical choice.

Well, if they're still using the old shaders in current dev kits (a year after the VLIW4 stuff hit the mass market) then I'd guess that's a sign they're using the old stuff in their new machine. That'd be my interpretation at any rate, though I don't know how long it takes to make revisions to a dev kit.

The only thing available at the time was Cayman. Putting that in the dev kit would have been a huge misrepresentation of power. stifl said pretty much everything I would have said. Looking at Cayman's release date VLIW4 would have been developed concurrently when Nintendo started their plans for a GPU. AMD could have easily said, "This is what we are planning to do with our future GPUs. It will give the same amount of processing power while reducing some of the transistors. It will be readily available by the time you launch. Based on your target use [insert GPU used in the dev kit] for now."

As we've discussed before we saw the 360 go from a 9800, to an x800, then finally Xenos. I'm just not ready to rule out the idea till I have enough confirmation to write it off.
 
The only thing available at the time was Cayman. Putting that in the dev kit would have been a huge misrepresentation of power. stifl said pretty much everything I would have said. Looking at Cayman's release date VLIW4 would have been developed concurrently when Nintendo started their plans for a GPU. AMD could have easily said, "This is what we are planning to do with our future GPUs. It will give the same amount of processing power while reducing some of the transistors. It will be readily available by the time you launch. Based on your target use [insert GPU used in the dev kit] for now."

As we've discussed before we saw the 360 go from a 9800, to an x800, then finally Xenos. I'm just not ready to rule out the idea till I have enough confirmation to write it off.

There were also some kits that used SLI 6800 GTs to give a better representation of what the GPU could do. The issue with 360 kits would appear to be that there simply wasn't any closer hardware available from ATI, but that wouldn't seem to be the case with the WiiU.
 
There were also some kits that used SLI 6800 GTs to give a better representation of what the GPU could do. The issue with 360 kits would appear to be that there simply wasn't any closer hardware available from ATI, but that wouldn't seem to be the case with the WiiU.

But that's essentially what I just pointed out: there aren't any VLIW4 GPUs with a lower ALU count. At least not yet.
 
I'd have thought AMD could disable shaders and alter clocks enough to make something of roughly the capability as the GPU Nintendo intend to use?
 
But VLIW4 vs VLIW5 is mostly a shader compilation thing; they could use a Radeon 6570 and it would not be too far from the final hardware.
 
But VLIW4 vs VLIW5 is mostly a shader compilation thing; they could use a Radeon 6570 and it would not be too far from the final hardware.

I just assumed it'd affect how you wrote shaders in order to get optimal performance. Perhaps it doesn't though, I don't actually know.
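A toy way to see the compilation angle: if the compiler packs the same average number of independent ops per instruction word either way, the narrower machine simply wastes fewer slots. The ILP figure below is made up for illustration, not a measurement of real shaders:

```python
# Fraction of VLIW slots actually doing work for a given average packed ILP.
# avg_packed_ops is a hypothetical number, not measured shader data.
def slot_utilization(avg_packed_ops, vliw_width):
    return avg_packed_ops / vliw_width

print(slot_utilization(3.4, 5))  # VLIW5: ~0.68
print(slot_utilization(3.4, 4))  # VLIW4: ~0.85
```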
 
I read that VLIW4 was essentially a concession for AMD's so-far-worthless GPGPU initiatives and it isn't actually a benefit for games. Tunafish posted above that VLIW5 is a better fit for the 360's GPU, btw, and I do think it's clear that N wants access to other companies' game libraries. I think I'm expecting WiiU to be essentially a modernized 360.
 
I read that VLIW4 was essentially a concession for AMD's so-far-worthless GPGPU initiatives and it isn't actually a benefit for games. Tunafish posted above that VLIW5 is a better fit for the 360's GPU, btw, and I do think it's clear that N wants access to other companies' game libraries. I think I'm expecting WiiU to be essentially a modernized 360.

I'd missed Tunafish's post - he makes some interesting points there.

I guess I'm expecting something similar to you from the WiiU. Better than parity from the most cost effective hardware they can put together, and in a family living-room friendly box. A bit more than that would be nice though, if it comes.
 
I read that VLIW4 was essentially a concession for AMD's so-far-worthless GPGPU initiatives and it isn't actually a benefit for games. Tunafish posted above that VLIW5 is a better fit for the 360's GPU, btw, and I do think it's clear that N wants access to other companies' game libraries. I think I'm expecting WiiU to be essentially a modernized 360.

Maybe so, as I've read something very similar to that (nothing about gaming benefits, though; but wouldn't that help physics if used?). I also read that the switch to DX10 started causing poor utilization of the shaders in VLIW5, which in turn led to the transistor reduction in VLIW4, an attempt to improve utilization by "trimming the fat", so to speak. I've also read that AMD had more plans for VLIW4, but because the fab was still at 40nm they passed, to avoid an even bigger die than it already was.

I can definitely agree with your view though about it ending up as a modern 360 (though our views on that might differ). I'm not saying the other direction as fact, just one that I believe is very plausible.
 
So... you dismiss my speculation that the size of the box stays the same by speculating that the box will get way bigger?

I'd advise you to be more careful before putting false statements into other people's mouths.

I dismissed (and still do) your speculation that the case size stays the same because Nintendo itself has stated that the current form isn't final.
I never "speculated" that the case will be "way bigger". In fact, I've made no speculation whatsoever about the case's size.

None to non-existent.
Why?
 