I think the GPU will be a 12 or 14 cluster VLIW4 chip.
That would be...no. Unless they rebooted the project.
Sigh, you know you're inevitably going to be proven wrong. Ahh well, no harm I guess. But I saw so many go through this whole denial routine with the Wii (there was a huge firestorm directed at IGN when they revealed the Wii was essentially an overclocked GameCube).
What makes you think so? That would be 768 ALUs with VLIW4; according to rumours, such a chip (the HD 7570) has a TDP of 50W at a clock of 750 MHz with 1GB of GDDR5. With further optimisation and slightly lower clocks this could come down to around 30-40W, and with a good amount of eDRAM it would be really competitive (I really doubt the XboxNext/PS4 will be more than twice as powerful).
EDIT: 768 ALUs × 729 MHz (3× the Wii GPU clock) × 2 ops per cycle (FMA) ≈ 1120 GFLOPS, which matches the rumours of around 1 TFLOP.
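The back-of-envelope maths above can be checked in a few lines. Everything here is speculation from the thread, not a confirmed spec: 768 ALUs assumes 12 VLIW4 clusters of 64 stream processors, and 729 MHz assumes exactly 3× the Wii GPU's 243 MHz clock.

```python
# Rough theoretical-throughput sketch for the rumoured chip.
# Assumptions (speculative, from the thread): 12 VLIW4 clusters
# x 64 ALUs = 768 stream processors, clocked at 3x the Wii GPU.
alus = 12 * 64             # 768 stream processors
clock_hz = 3 * 243e6       # 729 MHz (3x Wii's 243 MHz "Hollywood" clock)
flops_per_alu = 2          # a fused multiply-add counts as 2 FLOPs per cycle

gflops = alus * clock_hz * flops_per_alu / 1e9
print(gflops)              # ~1119.7 GFLOPS, i.e. roughly the rumoured 1 TFLOP
```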
From what I understand the architecture is locked down a couple of years before release. It takes months to years to finalise a chip design then many months to turn that into a production chip.
I'm not sure that you can just assume that power consumption for a Nintendo chip can be drastically lower than a PC part, and 30 - 40 W for GPU is almost certainly way over what the current WiiU case/cooling will be happy with.
Best you can hope for at this point is, IMO, Nintendo pushing for higher clocks on what they've already got. Maybe they could partially slide the motherboard out from under the Blu-ray drive and go for a larger heatsink and a different fan arrangement. Probably nothing will change, though.
But so is the architecture for the PC parts; AMD definitely knew what they would bring in 2012 at least a year ago, and VLIW4 chips were released nearly a year ago. I don't know Nintendo's time frame, but they definitely know better than I do what will be available from AMD in the future, so I think they surely knew about the VLIW4 architecture by mid-2010 already.
Well, that's the TDP for the desktop part; mobile and/or embedded chips will probably use even less. And I don't think even 40W would be too much, because the IBM CPUs normally aren't very power hungry (I don't think it will be clocked higher than 2.2 GHz), so maybe another 20W of TDP there.
I think Nintendo proved that they know how to design power efficient packages.
At this point they can't change what they're planning to put into the machine, though. So far, no rumours point to any VLIW4 stuff.
60W just for the CPU and GPU would probably be above what the 360S dissipates from its processors; Nintendo have the disadvantage of less room for a heatsink and a much smaller fan. I don't think that it's going to happen without an outrageously noisy little fan on the back, which is to say that I don't think it's going to happen.
Nintendo have done nothing to prove they are sorcerers, but lots to prove that they are ruthless at keeping costs under control.
That's what I wanted to say though, still this doesn't mean that it wasn't planned from the beginning. Using VLIW5 (or whatever) parts in early devkits doesn't tell us anything really.
EDIT: XBox360 draws around 80-90W in games according to AnandTech
I think so too. Probably not the GCN architecture, but one from the performance segment, which is said to still be based on VLIW4 but also on 28nm; indeed, Lombok Pro would be a good fit. It would also fit the time frame, because those will start in December AFAIK.
That would be...no. Unless they rebooted the project.
Well, if they're still using the old shaders in current dev kits (a year after the VLIW4 stuff hit the mass market), then I'd guess that's a sign that they're using the old stuff in their new machine. That'd be my interpretation at any rate, though I don't know how long it takes to make revisions to a dev kit.
The only thing available at the time was Cayman. Putting that in the dev kit would have been a huge misrepresentation of power. stifl said pretty much everything I would have said. Looking at Cayman's release date VLIW4 would have been developed concurrently when Nintendo started their plans for a GPU. AMD could have easily said, "This is what we are planning to do with our future GPUs. It will give the same amount of processing power while reducing some of the transistors. It will be readily available by the time you launch. Based on your target use [insert GPU used in the dev kit] for now."
As we've discussed before we saw the 360 go from a 9800, to an x800, then finally Xenos. I'm just not ready to rule out the idea till I have enough confirmation to write it off.
There were also some kits that used SLI 6800 GTs to give a better representation of what the GPU could do. The issue with 360 kits would appear to be that there simply wasn't any closer hardware available from ATI, but that wouldn't seem to be the case with the WiiU.
But VLIW4 vs VLIW5 is mostly a shader compilation thing; they could use a Radeon 6570 and it would not be too far from the final hardware.
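To illustrate why a VLIW5 part can stand in for a VLIW4 one: peak throughput depends only on total ALU count and clock, while the VLIW width mainly changes how the shader compiler packs instructions into each unit. A sketch with hypothetical, equal-ALU configurations (the SIMD counts and 650 MHz clock here are made up for the comparison, not actual chip specs):

```python
# Peak-FLOPS comparison of hypothetical VLIW5 vs VLIW4 layouts.
# The numbers below are illustrative assumptions, not real chip specs.
def peak_gflops(simds, alus_per_simd, clock_mhz):
    # Each ALU does a fused multiply-add per cycle = 2 FLOPs.
    return simds * alus_per_simd * clock_mhz * 2 / 1000

# Same total ALU count (640), same clock, different VLIW widths:
vliw5 = peak_gflops(simds=8,  alus_per_simd=80, clock_mhz=650)  # 16 units x 5 lanes
vliw4 = peak_gflops(simds=10, alus_per_simd=64, clock_mhz=650)  # 16 units x 4 lanes

print(vliw5, vliw4)  # identical peak figures; the real-world difference
                     # is how well the compiler fills the 5th lane
```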
I read that VLIW4 was essentially a concession to AMD's so-far-worthless GPGPU initiatives and isn't actually a benefit for games. Tunafish posted above that VLIW5 is a better fit for the 360's GPU, btw, and I do think it's clear that N wants access to other companies' game libraries. I'm expecting the WiiU to be essentially a modernized 360.
So... you dismiss my speculation that the size of the box stays the same by speculating that the box will get way bigger?
Why?

None to non-existent.