How do we 'know' that it isn't 55nm? Do we have proof that it was created on X process? If it were 40nm then surely we would know about the extra GPU performance by now?
We don't know, but with how old 55nm and 40nm both are, I don't see how using a 55nm GPU could possibly be cheaper. Throw the issue of power consumption in and I really don't see any reason why they'd go with 55nm. Again, I want to see a reason for it, something that would benefit them. We can see reasons for what they did with the CPU and RAM, but the GPU being 55nm offers literally no benefit to the system, and even less if we go with the theory that a quarter of the GPU was cut out.
But it would actually cost more to do something like that than it would to shrink an RV730 to 40nm.
There's absolutely no reason for it to be 55nm.
I know how you plan to respond to this, so instead please give a reason.
How do we 'know' that it isn't 55nm? Do we have proof that it was created on X process? If it were 40nm then surely we would know about the extra GPU performance by now?
This is what I want to know. It feels like it *should* be a 40nm chip, but it sure as hell isn't performing like one, unless a huge amount of the die is taken up by an unexpectedly large amount of eDRAM and the BC hardware is taking up more space than we're expecting. And yeah, the main memory BW doesn't seem like the kind of pipe you'd connect up to a fast chip. Whether it's 40nm or 55nm, it's going to land around or below RV730.
So while it probably isn't 55nm, I wouldn't completely rule out the idea that, given Nintendo's very low performance target, 55nm might make more business sense. The die size and performance don't clearly rule it out IMO, and "bu bu bu 55nm is old, PC GPUs are on 28nm!?" doesn't do it for me either. Power consumption is a bigger issue IMO.
It would be great to know for sure how much eDRAM the Wii U has (and what its BW is too!).
Well, we did have that rumour which suggested that it is literally an RV7xx chip. What if the chip itself is just a little bit smaller than the RV730? That would leave room for 32MB of eDRAM, and it would explain the relative lack of performance. Why would they stick an RV740 with 640 SPs onto a 12.8GB/s shared memory bus? Perhaps we really need to start asking what kind of chip makes sense given the shown performance and the lack of memory bandwidth.
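The shared-bus figure being thrown around can be sanity-checked with quick arithmetic. This sketch assumes the widely rumoured (not officially confirmed) configuration of DDR3-1600 on a 64-bit bus:

```python
# Back-of-envelope check of the Wii U main-memory bandwidth figure.
# Assumptions: DDR3-1600 (1600 mega-transfers/second) on a 64-bit bus.
transfer_rate_mt_s = 1600
bus_width_bits = 64
bytes_per_transfer = bus_width_bits / 8          # 8 bytes per transfer
bandwidth_gb_s = transfer_rate_mt_s * bytes_per_transfer / 1000
print(bandwidth_gb_s)  # 12.8
```

That 12.8 GB/s is roughly a fifth of what desktop RV740 boards had with GDDR5, which is why pairing the two seems odd.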
Perhaps we can consider the eDRAM as much an energy saving feature as a performance feature. Off chip memory bandwidth is very energy expensive so if they can keep the data on the die they make it more energy efficient.
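The energy argument can be made concrete with order-of-magnitude numbers. The per-bit figures below are illustrative assumptions (real values vary a lot by process and design); the point is only that off-chip accesses cost roughly 10x more energy than on-die ones:

```python
# Illustrative per-bit access energies (assumed, order-of-magnitude only).
PJ_PER_BIT_OFFCHIP_DRAM = 20.0   # assumed off-chip DDR3 access cost
PJ_PER_BIT_ONDIE_EDRAM = 2.0     # assumed on-die eDRAM access cost

def watts_for_bandwidth(gb_per_s, pj_per_bit):
    """Power needed to sustain a given traffic rate at a given pJ/bit."""
    bits_per_s = gb_per_s * 8e9
    return bits_per_s * pj_per_bit * 1e-12  # picojoules -> joules

# Sustaining 10 GB/s of framebuffer traffic:
print(watts_for_bandwidth(10, PJ_PER_BIT_OFFCHIP_DRAM))  # 1.6 W
print(watts_for_bandwidth(10, PJ_PER_BIT_ONDIE_EDRAM))   # 0.16 W
```

In a console with a total power budget in the tens of watts, a watt or two saved on memory traffic is not trivial.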
Really? What would the comparative costs be? You're sure about this so hopefully you can share the same figures you're using.
Unless you can show otherwise, cost might be a reason.
Okay, you're right, I can't be sure. But who even makes 55nm anymore? Plus, we know the CPU is ridiculously weak, as is the RAM, which means the GPU has to be heavily bottlenecked in these rushed, low-budget launch games. Going off those and assuming the GPU is <= Xenos really doesn't make sense. By that logic, those games shouldn't be able to run without severe sacrifices, and slightly lower frame rates with more pop-in doesn't quite cut it. We even have at least one game that looks and runs better than the PS3/360 versions (Trine 2).
It's a little lower down, but I posted a link showing that TSMC are still doing an awful lot of business on 65/55nm.
If the Wii U is CPU bottlenecked and has a fast GPU then that's a good reason to bump the resolution up (if you have the memory, and the Wii U is supposed to). But nothing has. In fact it looks very much like the Wii U GPU is in the same ballpark as the 360 - likely a little better (certainly newer), but maybe with a couple of drawbacks too.
Again, I'd be thinking RV730 territory and definitely not RV740.
I also figure that RV730 is much likelier at this point. The rumours always suggested an HD 4600 variant, and performance seems to be in line with that. RV730 also has official support for DDR3 where RV740 does not. Even with the bandwidth deficiency, an HD 4700-series part should be performing better at 720p.
To put it in context, AMD's integrated HD6550D has 400 SPs running at 600MHz and manages quite a few 360 games at 1080p at 30+ fps running off the same 1600MHz DDR3 as the Wii U. Usually 60fps when running at 720p.
Also usually with a lot more RAM and a CPU 3+x as powerful as what's in the Wii U. That's not the most fair comparison.
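For what it's worth, the peak-shader-throughput side of this comparison is easy to sketch. The clocks below are the chips' stock desktop clocks, and 2 FLOPs per SP per clock (one multiply-add) is the usual simplification, so treat these as rough upper bounds, not measured performance:

```python
# Rough peak programmable-shader throughput for the parts in this thread.
def gflops(shader_units, clock_ghz, flops_per_unit=2):
    """Peak GFLOPS assuming one MAD (2 FLOPs) per SP per clock."""
    return shader_units * clock_ghz * flops_per_unit

print(gflops(400, 0.600))  # HD6550D: ~480 GFLOPS
print(gflops(320, 0.750))  # RV730 / HD 4670: ~480 GFLOPS
print(gflops(640, 0.750))  # RV740 / HD 4770: ~960 GFLOPS
```

Which is the point: on paper the HD6550D sits right in RV730 territory, so CPU and RAM differences, not shader grunt, explain most of the gap.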
So what do you guys think about the 1GB of RAM for the OS? Can Nintendo optimize the OS to use less RAM and free some of that up for games? For what we know the OS has to do, does it need a full GB to function properly? Keeping in mind we don't know what other features Nintendo has in mind for the console and OS in general. But let's say they can free up enough for the Wii U to have 1.5GB for devs to work with; how will that impact game development?
With the piddly main memory bandwidth available, large textures would probably run the game into a performance brick wall. More RAM for games means bigger and better textures can be used.
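To put that brick wall in numbers, here's the per-frame traffic budget, assuming the rumoured 12.8 GB/s main-memory figure and ignoring any relief from eDRAM. Every texture fetch, framebuffer access, and geometry read that misses on-die storage has to fit inside it:

```python
# Per-frame main-memory traffic budget at the rumoured bandwidth.
# Assumption: 12.8 GB/s total (DDR3-1600, 64-bit), shared by CPU and GPU.
bandwidth_gb_s = 12.8
for fps in (30, 60):
    mb_per_frame = bandwidth_gb_s * 1000 / fps  # decimal GB -> MB
    print(f"{fps} fps: ~{mb_per_frame:.0f} MB of traffic per frame")
```

A couple hundred MB per frame at 60fps sounds like a lot until you remember it covers every read *and* write, shared with the CPU, which is exactly why big eDRAM for the framebuffer would matter so much.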