Wii U hardware discussion and investigation

A few more comments from Criterion:

http://www.eurogamer.net/articles/digitalfoundry-need-for-speed-most-wanted-wii-u-behind-the-scenes

The suggestion seems to be that the Wii U CPU can be made as performant as its peers despite its low clock, if you make use of its strengths.

Bandwidth for textures also doesn't seem to be an issue, at least not in the Criterion engine, as they claim to literally have a 'use PC textures' switch in their engine, and enabling that seemed to be all that was necessary.

They also mention that Nintendo's development tools are as bad as ever. Bandwidth couldn't be discussed directly due to NDA, but it seemed not to be an issue - if the framerate of this version holds up, I think we can put that in the confirmed bin and assume that the combination of the 32MB EDRAM with the large amount of (slower) DDR3 is efficient enough.
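For a rough sense of the numbers involved, here's a quick back-of-the-envelope sketch in Python. The 64-bit DDR3-1600 interface is the commonly reported figure rather than anything official, and the EDRAM bandwidth remains unknown:

```python
# Peak main-memory bandwidth from bus width and transfer rate.
# Interface figures are the commonly reported ones, not official specs.

def bandwidth_gbs(bus_bits: int, transfers_mt_s: float) -> float:
    """Peak bandwidth in GB/s: bytes per transfer x transfers per second."""
    return (bus_bits / 8) * transfers_mt_s * 1e6 / 1e9

wii_u_ddr3 = bandwidth_gbs(64, 1600)   # 64-bit DDR3-1600 -> 12.8 GB/s
x360_gddr3 = bandwidth_gbs(128, 1400)  # 128-bit GDDR3 @ 700 MHz -> 22.4 GB/s

print(f"Wii U DDR3: {wii_u_ddr3:.1f} GB/s")
print(f"360 GDDR3:  {x360_gddr3:.1f} GB/s")
```

On main memory alone the Wii U sits well behind the 360, so for a 'use PC textures' switch to be as painless as Criterion describe, the 32MB EDRAM would have to be doing a lot of the heavy lifting.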
 
Thanks for providing some real-world comparable reference figures. Sounds like the 320 SP GPU is capable of far better than Wii U is showing at present. Is that because the devs aren't able to use those resources (odd drivers, contrary to the way every other GPU works by distributing work to available resources), or there aren't as many resources to use?
Pardon my interference, but aside from the ALU count, isn't the 4670 a 750MHz part with 32 TMUs, fed by 32GB/s of GDDR3?
 
Pardon my interference, but aside from the ALU count, isn't the 4670 a 750MHz part with 32 TMUs, fed by 32GB/s of GDDR3?
Okay, so you propose that Wii U's performance is limited by clock-speed and TMU count and RAM bandwidth? That would mean Nintendo took a whole load of shaders and reduced the BW and TMUs below what those shaders can use, no?
 
For me, the take-home point is he says, "punches above its weight," and at the 40 Watt lightweight class, that's impressive, which is what I've been saying. Wii U's engineering is all about efficient performance at a low power draw. The fact that it can match PS360, and exceed it in some areas, while using such a low power draw is impressive. People wanting even more performance from it, significantly above PS360, are asking the rules of thermodynamics to be bent completely out of shape, and that's just unfair on Nintendo!
 
For me, the take-home point is he says, "punches above its weight," and at the 40 Watt lightweight class, that's impressive, which is what I've been saying. Wii U's engineering is all about efficient performance at a low power draw. The fact that it can match PS360, and exceed it in some areas, while using such a low power draw is impressive. People wanting even more performance from it, significantly above PS360, are asking the rules of thermodynamics to be bent completely out of shape, and that's just unfair on Nintendo!

Problem is how many gamers care about power draw as long as it's not over 150W? :LOL:
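Still, to be fair to the 'lightweight class' point, a quick sketch puts it in numbers - using rough in-game wall-draw figures reported by reviews, not official specs:

```python
# Rough power-draw comparison. Wattages are approximate measured
# in-game wall draws from reviews, not official TDPs.
consoles = {
    "Wii U":           35,   # ~33-40 W measured in games
    "Xbox 360 (2005)": 170,  # launch hardware under load
    "Xbox 360 S":      90,
}

wii_u = consoles["Wii U"]
for name, watts in consoles.items():
    print(f"{name:16s} ~{watts:3d} W  ({watts / wii_u:.1f}x Wii U)")
```

Matching hardware that draws three to five times the power at the wall is the 'punching above its weight' argument in concrete terms - even if few gamers actually care.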
 
Okay, so you propose that Wii U's performance is limited by clock-speed and TMU count and RAM bandwidth? That would mean Nintendo took a whole load of shaders and reduced the BW and TMUs below what those shaders can use, no?
No, I'm just saying the comparison is flawed. As to what Nintendo did and why - the jury is still out.
 
Maybe NFS Most Wanted fits better with that kind of setup. Maybe Criterion knew how to optimize it well (not going to doubt that either, as Criterion are obviously a talented dev team).
Plausible

The Wii U's internals aren't that much of a mystery. On par with the current generation. A marginally more powerful GPU compared to Xenos and RSX, a slightly weaker CPU, twice the RAM amount for games, and 3 times the EDRAM of the 360. Not all that exotic.
Speculative
 
No, I'm just saying the comparison is flawed.
'Flawed' as in 'imperfect' and not 'unusable'. We can make extrapolations of performance differentials. At least real hardware results on real, comparable games are a huge leap better than staring at numbers and guessing. ;)
 
For me, the take-home point is he says, "punches above its weight," and at the 40 Watt lightweight class, that's impressive, which is what I've been saying. Wii U's engineering is all about efficient performance at a low power draw. The fact that it can match PS360, and exceed it in some areas, while using such a low power draw is impressive. People wanting even more performance from it, significantly above PS360, are asking the rules of thermodynamics to be bent completely out of shape, and that's just unfair on Nintendo!

Of course. But if devs can get the CPU to not be a bottleneck and have enough bandwidth to use the high-res textures you also see in PC, with better lighting, then that would mean the Wii U is potentially the best current gen machine, which so far has been in doubt (justifiably). However, those that believe it will keep up with the 'real' next gen consoles much better than the Wii did with the HD consoles are very likely to end up disappointed.
 
Inuhanyou, read my last post again: I make no performance claims about Wii U. I only state that a dev got final hardware in November, which is around launch, when the ports of other games were already gold. This could explain the performance of those ports, and gives us something more accurate to compare PC ports to, since it is using the same assets as the high-end versions of those games.

darkblu is right, btw - the 4670 makes little sense as a comparison unless both the core and memory clocks were lowered to match the theoretical performance of the Wii U's GPU. Not to mention that partnering it with the i7 CPU is probably a poor idea.
 
'Flawed' as in 'imperfect' and not 'unusable'. We can make extrapolations of performance differentials. At least real hardware results on real, comparable games are a huge leap better than staring at numbers and guessing. ;)
Sure. I didn't see any extrapolations of performance differentials in that post, though.
 
That seems like somewhat of a reach.

Different situations all around. Some games will be taxing on the bandwidth a console has, some will not be; some will be more GPU-centric, others more CPU-centric. It's not a case of "this game performs well, so the hardware should always perform well in every situation".

Maybe NFS Most Wanted fits better with that kind of setup. Maybe Criterion knew how to optimize it well (not going to doubt that either, as Criterion are obviously a talented dev team).

I'm going on the actual assertion from multiple devs that RAM has been the most limiting factor of this console generation on what they can do with the hardware. Could it not apply in this scenario we are debating? Sure, but I would not bet on that being the case.

The Wii U's internals aren't that much of a mystery. On par with the current generation. A marginally more powerful GPU compared to Xenos and RSX, a slightly weaker CPU, twice the RAM amount for games, and 3 times the EDRAM of the 360. Not all that exotic.

Did we ever find out the bandwidth for the EDRAM?

No, we did not. Although it does seem that you (and others) have a vested interest in highlighting theoretical bandwidth limitations, multiple system bottlenecks, etcetera. Your additional-360-RAM theory is an incorrect assumption - would it also account for the increase/stabilization in framerate?

In addition to the feature functionality Blu pointed out, the Wii U hardware supports tessellation as well as more advanced shaders, and the tablet is yet another performance penalty being absorbed. The Wii U is simply a more capable platform than either the PS3 or 360 overall, though not by any huge metric. I've always labeled it a "transitional console," not even within the same realm as Orbis & Durango. (I've said as much long before any of the leaks.) The chip not exotic? How many ALUs are there, definitively? Why the disproportionate cluster size (if it is indeed 160)? Is the function of the duplicate logic blocks readily apparent? Brazos is, imo, the only chip of comparably exotic, or rather unorthodox, design.

Thanks for providing some real-world comparable reference figures. Sounds like the 320 SP GPU is capable of far better than Wii U is showing at present. Is that because the devs aren't able to use those resources (odd drivers, contrary to the way every other GPU works by distributing work to available resources), or there aren't as many resources to use?

The resources are there afaik; the documentation & architectural familiarity, however, are not. Optimisations will initially only come from the proprietary engines behind 1st & 3rd party exclusive software. Expect Monolithsoft, Retro, & especially Platinum Games to impress at E3 (along with some unannounced collaborations & 3rd party software). You will see software unable to be emulated on the current HD twins. Regarding Criterion & NFS, read this from Digital Foundry: http://www.eurogamer.net/articles/digitalfoundry-need-for-speed-most-wanted-wii-u-behind-the-scenes Puts some aspects & misconceptions in perspective, doesn't it?
 
^ Arguing for what we may or may not see at E3 seems more like conjecture than anything we've been talking about.

No, we did not. Although it does seem that you (and others) have a vested interest in highlighting theoretical bandwidth limitations, multiple system bottlenecks, etcetera. Your additional-360-RAM theory is an incorrect assumption - would it also account for the increase/stabilization in framerate?

Never said it would; I was referring to the high-resolution textures. Although the GPU being more powerful to a degree would contribute to that, as I've always assumed a 300-350 GFLOPS GPU. But I'd like to stress that I have no "interest" in making the Wii U out to be anything more than the information we've gleaned so far.

I have a Wii U; I like playing games on it. But I've been stating for a while that there's no magic in this setup. Anything that Criterion has gotten out of it has been through careful optimization of the elements that we've always known are there.

We've always known that 1GB of RAM should allow higher resolutions in theory, and the GPU, being a theoretical 320 SP part, should perform much better than the old-ass Xenos and RSX setups. My questions have been down to why that has not been the case outside of one or two exceptions.
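For reference, the paper numbers behind those expectations - assuming the widely reported (but unconfirmed) ~550 MHz GPU clock and the usual one MADD (2 FLOPs) per ALU per cycle of the R700 family:

```python
# Theoretical shader throughput for the candidate ALU counts.
# The ~550 MHz Wii U GPU clock is widely reported but unconfirmed.

def gflops(alus: int, clock_ghz: float) -> float:
    """2 FLOPs (one MADD) per ALU per cycle."""
    return alus * 2 * clock_ghz

print(f"Wii U, 160 ALUs @ 550 MHz: {gflops(160, 0.550):.0f} GFLOPS")    # 176
print(f"Wii U, 320 ALUs @ 550 MHz: {gflops(320, 0.550):.0f} GFLOPS")    # 352
print(f"Xenos,  48 vec5 @ 500 MHz: {gflops(48 * 5, 0.500):.0f} GFLOPS") # 240
```

The 320 SP case is where the oft-quoted ~350 GFLOPS comes from; a 160 SP part at that clock would actually sit below Xenos on paper, which is exactly why the question keeps mattering.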


In addition to the feature functionality Blu pointed out, the Wii U hardware supports tessellation as well as more advanced shaders, and the tablet is yet another performance penalty being absorbed.
We should also take care to remember that mirroring images incurs no performance penalty. Only when the screens are different does the performance of the game start to be impacted, which is why the gamepad usually only shows light material. It's not a bullet point proving the Wii U is hiding anything spectacular.

Also, the 360 has a tessellation unit, so I'm not sure of your angle in regard to that. Do you mean the Wii U is in a more capable position to use its tessellation unit? That makes more sense.


The Wii U is simply a more capable platform than either the PS3 or 360 overall, though not by any huge metric. I've always labeled it a "transitional console," not even within the same realm as Orbis & Durango. (I've said as much long before any of the leaks.) The chip not exotic? How many ALUs are there, definitively? Why the disproportionate cluster size (if it is indeed 160)? Is the function of the duplicate logic blocks readily apparent? Brazos is, imo, the only chip of comparably exotic, or rather unorthodox, design.
I won't argue against how powerful the Wii U is in proportion to the 360/PS3; they all have their places against each other.


As for exotics: I call the Cell an exotic piece of hardware. We know that the Wii U's GPU is based on the R700 series, with custom arrays bolted onto it. That's not exotic to me.
 
Expect Monolithsoft, Retro, & especially Platinum Games to impress at E3 (along with some unannounced collaborations & 3rd party software). You will see software unable to be emulated on the current HD twins.
I can well believe it. I started a poll about a year ago and said it was a toss-up between the same or slightly better. The low power draw is going to be an overarching limitation. But the question isn't really a versus question, but what is the hardware? How it performs relative to the others is only of interest here in determining what the hardware is.

Regarding Criterion & NFS, read this from Digital Foundry: http://www.eurogamer.net/articles/digitalfoundry-need-for-speed-most-wanted-wii-u-behind-the-scenes Puts some aspects & misconceptions in perspective, doesn't it?
Not really. "Punches above its weight." Matching a 100 W console using just 40 W is definitely punching above its weight. Maybe one misconception being addressed is the role Nintendo play. Criterion are saying Nintendo were really supportive:
"The difference with Wii U was that when we first started out, getting the graphics and GPU to run at an acceptable frame-rate was a real struggle. The hardware was always there, it was always capable. Nintendo gave us a lot of support - support which helps people who are doing cross-platform development actually get the GPU running to the kind of rate we've got it at now. We benefited by not quite being there for launch - we got a lot of that support that wasn't there at day one... the tools, everything."
So, maybe the tools were a bottleneck for launch titles, but NFS:MW hasn't got that problem and the devs can fully use the hardware. Is NFS:MW comparatively more indicative of 320 shaders, or 160? (And are 320/160 shaders the only possible options?!)
 
Now, on my decrepit laptop with a Core 2 Duo (2008 tech) and a Mobility Radeon 4670 @ 843/882 core and memory clocks respectively, the game runs at Very High settings, no AA, but at a staggering 1920x1080 with a stable 30-35 FPS in-game with triple-buffered Vsync using D3DOverrider.

Since people seem to have missed it - "Radeon 4670 @ 843/882 core and memory clocks respectively".

It would be a better experiment to scale GPU clocks and memory speed down to Wii U level, and see what the system can sustain, bearing in mind that the C2D is a significant step up from the processor in the Wii U in terms of per-thread performance, so this PC should be less CPU-limited if we scale the resolution down to console levels.
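A sketch of the gap that experiment would have to close, assuming 882 MHz is the real (pre-DDR) memory clock on the Mobility 4670's 128-bit bus, and using the usual unconfirmed Wii U figures:

```python
# How far the overclocked Mobility 4670 (320 SPs) sits above assumed
# Wii U-like figures. All Wii U numbers are unconfirmed assumptions,
# used only to form the ratios.

pc_core_mhz   = 843
pc_mem_gbs    = (128 / 8) * 882e6 * 2 / 1e9  # 128-bit GDDR3, DDR -> ~28.2 GB/s
wiiu_core_mhz = 550                          # widely reported, unconfirmed
wiiu_mem_gbs  = 12.8                         # 64-bit DDR3-1600, EDRAM excluded

print(f"Core clock gap:     {pc_core_mhz / wiiu_core_mhz:.2f}x")  # ~1.53x
print(f"Main memory BW gap: {pc_mem_gbs / wiiu_mem_gbs:.2f}x")    # ~2.20x
```

So even with an identical shader count, the laptop part has roughly half again the per-clock throughput and over twice the main-memory bandwidth, before the C2D's per-thread advantage is even counted.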
 
So, maybe the tools were a bottleneck for launch titles, but NFS:MW hasn't got that problem and the devs can fully use the hardware. Is NFS:MW comparatively more indicative of 320 shaders, or 160? (And are 320/160 shaders the only possible options?!)

Emphasis mine. Not trying to quote you out of context, just highlighting that last part. Is it B&W as in 160 or 320? Or is this an unorthodox architecture with something like 256 ALUs (the shader blocks are larger assuming it is still 40nm)? I think 160 is too low based on the PC benchmarks.

IMO I think it's best to confirm the block size, then try and figure out or speculate as to why they are larger (and what impact that would have) rather than resort to the lowest common denominator and call it a day (I mean that in general, not accusing or singling out anybody for doing such a thing.. just wanted to clarify in case anybody thought that)
 
Emphasis mine. Not trying to quote you out of context, just highlighting that last part. Is it B&W as in 160 or 320? Or is this an unorthodox architecture with something like 256 ALUs (the shader blocks are larger assuming it is still 40nm)? I think 160 is too low based on the PC benchmarks.

IMO I think it's best to confirm the block size, then try and figure out or speculate as to why they are larger (and what impact that would have) rather than resort to the lowest common denominator and call it a day (I mean that in general, not accusing or singling out anybody for doing such a thing.. just wanted to clarify in case anybody thought that)

I don't think any of us can honestly stare into that abyss of logic on the die shot and tell you how many shaders there are. The SRAM blocks, however, seem to lean towards the two aforementioned options. I believe it highly unlikely that the SIMD cores have been altered that extensively. Nintendo barely know how to use shaders - they're going to improve on AMD's design? Ha! If anything, they would have been working on reducing size/heat. That's their clear MO these days.
 
Okay, so you propose that Wii U's performance is limited by clock-speed and TMU count and RAM bandwidth? That would mean Nintendo took a whole load of shaders and reduced the BW and TMUs below what those shaders can use, no?

I know this will get me into trouble...but what the hell?

Iwata's comments calling Latte a GPGPU should not be brushed off so quickly. We can say he was "in PR mode" or that it was a "bullet point", but that is just as much an assumption as saying that the comment indicates a modification of the shader cores. We simply don't know what he really meant by it.

Likewise, it's not important how R700 performs vs Southern Islands re: compute functionality. What matters is what Iwata thinks of as a GPGPU. If, for instance, they designed a system with more shader resources than texture resources and a large on-chip memory pool, would that not justify the title? We should consider this at least as possible a scenario as Iwata (laughably) calling a 160 ALU part a GPGPU.

Anyway, this is getting off into speculation territory, but you asked what the point would be of adding more ALUs to a bandwidth/TMU bound setup, and I see this as a valid response. :)
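To put some numbers on that scenario - taking the 16 TMUs that die-shot analyses tend to agree on (still an assumption), the two candidate ALU counts imply very different shader-to-texture balances:

```python
# ALU:TMU balance for the two candidate configurations, assuming the
# 16 TMUs that die-shot analyses tend to agree on (unconfirmed).
TMUS = 16
for alus in (160, 320):
    print(f"{alus} ALUs / {TMUS} TMUs -> {alus // TMUS}:1")

# 10:1 is the TMU-rich balance of RV730 (320 SPs / 32 TMUs);
# 20:1 is the shader-heavy balance of RV770 (800 SPs / 40 TMUs) -
# the kind of ratio that might more plausibly earn a 'GPGPU' label.
```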
 
If tools are a problem and Wii U versions need much greater effort to get up to scratch, day-and-date releases might not be that great even going forward.

Let's see if Criterion really are worth the PR and there are no downgrades.
 
Seems like we might not know anything at this point unless someone actually hacks it and runs a benchmark, or production documents leak.



So, anyone want to list why they think it's 320 shaders, based on anything technical (the die shot or documentation)? Seems like people will believe what they want. The area is way too small to fit 320 at 40nm. The layout looks like 20 shaders per block with matching cache blocks. So what's the 40-shaders-per-block theory's basis at this point, besides the blocks being 50% larger?
[Image: annotated die shot - two Wii U shader blocks alongside one Bobcat shader block]

This is two Wii U blocks compared to one Bobcat block. The size is larger, sure, but the notable features are the same...
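For completeness, the arithmetic of the competing per-block theories, assuming the eight SIMD blocks that most annotations of the die shot count (and the usual unconfirmed ~550 MHz clock):

```python
# Totals implied by the competing per-block shader counts, assuming
# eight SIMD blocks (per common die-shot annotations, unconfirmed).
BLOCKS = 8
CLOCK_GHZ = 0.550  # widely reported, unconfirmed

for per_block in (20, 32, 40):
    alus = BLOCKS * per_block
    print(f"{per_block}/block -> {alus} ALUs, "
          f"~{alus * 2 * CLOCK_GHZ:.0f} GFLOPS @ 550 MHz")

# 20/block matches the visible layout but lands below Xenos on paper;
# 40/block is the contested reading of the larger block area;
# 32/block (256 ALUs) is the unorthodox middle option floated earlier.
```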
 