The IGN article where they built a PC to supposedly emulate Wii U was a joke. I'm completely disregarding it.
Let me ask this: if it has an rv770 in it, how is it only equal to or slightly faster than ps360?
I've got some possible explanations for that here:
http://forum.beyond3d.com/showpost.php?p=1567722&postcount=1784
That's really thin. If Nintendo put that much horsepower in their box and are going to launch ahead of their competition, I expect they would want to showcase it and brag about how much more powerful it is than the current competition. An rv770 would roflstomp ps360 graphics with a bit of effort. Also, I would expect it to be much faster than an rv770 in a PC, so I think even Crysis would separate itself quite a bit with the advantages of a closed environment.
Does anyone know what the final process node is going to be? Are we talking 32nm, 28nm or 45nm? If the development kits represent silicon which is going to be shrunk in the future then I think we have an obvious reason why there were reports of how hot they ran.
CPU is confirmed by IBM to be 45nm. There's nothing about the GPU yet.
But does that mean that it won't be 32nm by the time they release?
How is it logical to fit a Radeon HD 4870 into the WiiU chassis? Even if you take the 40nm rendition, which is the 5770.
Since it's highly unlikely they'd ship a 55nm GPU in 2012, at least a shrink to 40nm should be expected, and at the same clocks it'd consume less.
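For what it's worth, the usual back-of-envelope rule is that dynamic power scales roughly with C·V²·f, so a shrink that cuts capacitance and lets the voltage drop saves power even at identical clocks. A minimal sketch with purely illustrative numbers (nothing here is a leaked spec):

```python
# Back-of-envelope dynamic power scaling: P_dyn ~ C * V^2 * f.
# All numbers below are illustrative assumptions, not Wii U specs.

def dynamic_power(power_ref, cap_scale, volt_scale, freq_scale):
    """Scale a reference dynamic power by relative capacitance, voltage and clock."""
    return power_ref * cap_scale * (volt_scale ** 2) * freq_scale

# Hypothetical 55nm part at some assumed reference power.
p_55nm = 60.0  # watts, assumed

# Same clocks after a 55nm -> 40nm shrink: assume ~0.75x capacitance and ~0.9x voltage.
p_40nm_same_clock = dynamic_power(p_55nm, cap_scale=0.75, volt_scale=0.9, freq_scale=1.0)
print(f"~{p_40nm_same_clock:.0f} W at the same clock after the shrink (vs {p_55nm:.0f} W)")
```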
I can't believe an RV730 would bring overheating issues, unless it was clocked really high (>850MHz?).
But it's unlikely it'll be an Evergreen-based GPU.
There's a reason why I mentioned 5770.
So you agree that, should the info about overheating consoles be true, an RV730 @ ~500MHz in there isn't the most probable option. That or at least the GPU wouldn't be responsible for the overheating.
RV730 is clearly much lower spec than rv770, so of course.
Yes, the console isn't that small. The new Mac mini is a lot smaller and it carries a 624M transistor Core i5 Sandy Bridge @ 2.5GHz (35W) and a 716M transistor Turks @ 500MHz (~15W), with GDDR5 and DDR3 system memory.
There are Phenom II X4s and 6850s inside laptops that are smaller and pack much more stuff.
I've got a 17" laptop with a vastly inferior 4670M and a Core i720 (8 thread) that get toasty and verily unstable in just an hour of full load and it's noisy as hell.
The point is that it is similarly spec'd on shaders, TMUs, ROPs, and is a 40nm rendition of what rv770 was.
But it's unlikely it'll be an Evergreen-based GPU.
Juniper is DX11, so of course it packs more.
And Juniper packs more transistors than RV770, so one could assume that a 40nm RV770 @ 500MHz could be at <22W, GPU alone.
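Just to show how sensitive that kind of estimate is, here's the sort of arithmetic that gets you into that neighbourhood. Every input (the reference chip power, the shrink factor, the voltage drop) is an assumption I'm making for illustration, not a known figure:

```python
# How a "40nm RV770 @ 500 MHz" power guess moves with the assumptions.
# Every input here is an assumption for illustration, not a known spec.

ref_chip_power = 90.0        # W, assumed share of an HD 4850's board power drawn by the GPU die itself
freq_scale = 500.0 / 625.0   # downclock from an assumed 625 MHz reference
cap_scale  = 0.75            # assumed capacitance cut from a 55nm -> 40nm shrink

for volt_scale in (0.90, 0.80, 0.70):   # range of plausible voltage reductions
    est = ref_chip_power * freq_scale * cap_scale * volt_scale ** 2
    print(f"voltage x{volt_scale:.2f}: ~{est:.0f} W")
```

Even with a fairly aggressive voltage assumption this lands in the mid-20s to mid-40s of watts, so a sub-22W RV770-class part implies either a bigger voltage drop or a lower reference power than assumed here.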
Power consumption is still power consumption...
That HD4670M is a 3 year-old GPU made on 55nm, with very high clocks for that time, consuming 35W, together with a 2 year-old CPU with a 45W TDP.
It's not really fair to compare old hardware using old processes with something that's coming out in Q2 2012. Even if the same nodes were used (55nm GPU + 45nm CPU), they'd be a lot more power efficient by now.
Overheating at how many hours of operation? Who says the GPU is the only contributor? There is a 45nm CPU packed away in there too. What sort of cooling is in there? Too many assumptions.
So you agree that, should the info about overheating consoles be true, an RV730 @ ~500MHz in there isn't the most probable option. That or at least the GPU wouldn't be responsible for the overheating.
It's the Radeon 4600 series, so that's 320 SPs, 8 ROPs, 32 TMUs IIRC. On 40nm, the equivalent desktop GPU would be akin to the 55xx/56xx series, which sports 400 SPs.
Anyways, you're all so keen on cutting down clocks of an rv770-class GPU. Clocks and power envelopes are a part of the equation, but those can always be tweaked. Do consider that rv770 is a much larger chip than rv730. That is going to be a factor in the BOM and will weigh to some extent on what Nintendo wants to achieve with the WiiU. The rv770 is a gigantic leap in shading and pixel throughput power relative to Xenos. It's not just "a little". It's not even "ballpark".
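To put rough numbers on that gap, using the commonly quoted unit counts and the desktop HD 4870 clock (a console part would almost certainly clock lower):

```python
# Rough peak-throughput comparison of Xenos (Xbox 360) vs a desktop RV770.
# Unit counts are the commonly cited ones; the RV770 clock is the desktop
# HD 4870's, not a console clock.

def gflops(alu_lanes, mhz):
    # 1 MAD (2 flops) per ALU lane per clock
    return alu_lanes * 2 * mhz / 1000.0

def gpix(rops, mhz):
    return rops * mhz / 1000.0

xenos_flops = gflops(240, 500)   # 48 units x 5-wide ALUs @ 500 MHz -> ~240 GFLOPS
rv770_flops = gflops(800, 750)   # 800 SPs @ 750 MHz -> ~1200 GFLOPS
xenos_fill  = gpix(8, 500)       # 8 ROPs -> 4 Gpixels/s
rv770_fill  = gpix(16, 750)      # 16 ROPs -> 12 Gpixels/s

print(f"Shader: {xenos_flops:.0f} vs {rv770_flops:.0f} GFLOPS ({rv770_flops / xenos_flops:.1f}x)")
print(f"Pixel fill: {xenos_fill:.0f} vs {rv770_fill:.0f} Gpix/s ({rv770_fill / xenos_fill:.1f}x)")
```

That's roughly 5x the shader throughput and 3x the pixel fill at desktop clocks, and that's before bandwidth enters the picture.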
At what point does lowering the clocks of an rv770 outweigh using a higher clocked low end part that is also a smaller chip (cheaper to manufacture)? You won't get more than double the clock, but what is good enough and how much do you expect Nintendo to spend? Will they have enough bandwidth to support rv770-like pixel throughput? How much will they spend on the memory? How many RAM chips fit inside the WiiU chassis? How many did they fit within the Wii? Was there much room left over? How inexpensive do you believe GDDR5 2Gbit @ highest speed is? Does GDDR5 even make sense if they go with another edram for framebuffer usage? How much edram makes sense for them? How big will that chip end up being and what is its cost?
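As a rough frame for the bandwidth side of those questions, here's what a handful of 2Gbit GDDR5 chips buys you; the chip counts and data rates below are hypothetical, not rumoured specs:

```python
# Bandwidth from a handful of GDDR5 chips: each chip exposes a 32-bit interface,
# so total bandwidth = (chips * 32 bits) * effective data rate / 8.
# Chip counts and per-pin rates are hypothetical, just to frame the question.

def gddr5_bandwidth_gbs(num_chips, gbps_per_pin):
    bus_width_bits = num_chips * 32
    return bus_width_bits * gbps_per_pin / 8.0

for chips, rate in [(4, 4.0), (6, 4.0), (8, 3.6)]:
    capacity_gb = chips * 2 / 8  # 2 Gbit per chip -> GB
    bw = gddr5_bandwidth_gbs(chips, rate)
    print(f"{chips} chips ({capacity_gb:.1f} GB): {chips * 32}-bit bus, ~{bw:.0f} GB/s")
```

The 8-chip row is roughly what the desktop HD 4870 shipped with (256-bit @ 3.6 Gbps, ~115 GB/s), which gives a sense of how much memory hardware full rv770-class bandwidth implies, and why edram for the framebuffer changes that math.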