Wii U hardware discussion and investigation *rename

^ The difference, however, is that from the ports we have seen, it's not just one or two games that render at these lower performance ratios; it's the vast majority of launch titles.

Most Wanted and Trine 2 are the only two games so far that have been superior to their HD-twin equivalents from a framerate, resolution, or texture perspective.

The other side of your argument would be: why would only these two games so far have a handle on the Wii U to that extent?

We also know that, especially in Criterion's case, they already had the game essentially made; they just didn't want to ship before knowing Nintendo's online plans.

So does that say more about Nintendo's documentation, the hardware itself, or the developers?

It could be a mix of all 3.

To be fair, Most Wanted isn't out yet. If the finished game does indeed turn out to be superior to the PS3/360 versions, as the devs (and the video preview) claim, it's worth pointing out that both Criterion and Frozenbyte (in the case of Trine 2) developed their Wii U ports themselves. It doesn't sound like the work was outsourced to a secondary team, as it was for many (if not most or all) of the others. It's also worth pointing out that in both cases they ported the PC version rather than using the 360 version as a base (unless Criterion ported the 360 version and then added the assets from the PC version).

Also to be fair, the "majority of launch titles" is only a handful of titles. I know it's not saying much, but they were close to parity with the HD twins, and a couple of titles surpassed the PS3 version in certain places (and in certain other places the PS3 version pulled ahead). I would even go so far as to say that if these versions were what had originally appeared on PS3, they would be within what is considered parity (which isn't saying much considering its age, but still).

To follow up on my last post: while I commend your efforts to figure this thing out, I don't think these Call of Duty numbers prove your point. You are trying to say that if we were looking at a 320-SPU part, we'd be seeing vastly better frame rates and resolutions, correct? In reaffirming the statement about the effect of resolution on GPU and CPU loads that I quoted before, I started to notice some peculiarities in those benchmarks.

Let's look at the HD 6450 for comparison. At 1024x768, we see comfortable frame rates. This makes sense, since the GPU is barely being taxed at that resolution. At the next bump up in resolution/IQ, look what happens to the frame rate: it takes a nosedive. Is it any coincidence that CPU bottlenecks are easier to discern at low resolutions? So when comparing the lowest numbers in this chart with the average figures for Black Ops II on Wii U, it makes sense that we would see a difference, since Espresso is no i7. And then there are other performance factors on Nintendo's console, like the locked vsync and the number of characters on screen (which seems to be a CPU thing). Meanwhile, the chart also shows the clear effect of memory bandwidth on performance.

It's pretty amazing that on the same card, the difference between 1280x1024 with 2xAA/8xAF and 1920x1200 with 4xAA/16xAF is only ~8 frames!

In short, while I agree that the jury is still out on whether it's a 160-SPU part, I don't think you can rule out a 320-SPU part simply by making the rightful observation that games thus far haven't automatically featured increased resolution and framerate. If getting the image quality to where they felt comfortable resulted in a merely acceptable framerate, together with everything else that affects performance, why would a developer then go ahead and increase the settings?
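To make that point about resolution and bottlenecks concrete, here is a toy frame-time model; the function and the millisecond figures are purely illustrative assumptions, not measurements from those benchmarks. The idea is simply that a frame can't finish faster than the slower of the CPU and GPU stages, so at low resolutions the GPU's share shrinks and the fixed CPU cost becomes the ceiling.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: the frame is gated by whichever stage takes longer."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 12.0  # hypothetical per-frame CPU cost, roughly independent of resolution
for res, gpu_ms in [("1024x768", 6.0), ("1680x1050", 14.0), ("1920x1200", 20.0)]:
    print(res, round(fps(cpu_ms, gpu_ms), 1), "fps")
# At the low resolution the CPU is the limit (83.3 fps); at the higher ones the
# GPU takes over (71.4 and 50.0 fps), which is why CPU differences are easiest
# to see in low-resolution benchmark runs.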

Good point about the CPU. The test setup used in the benchmark was an i7 920 overclocked to 3.8 GHz. Even at stock (2.67 GHz) I doubt even eight AMD Jaguar cores would surpass it. I could be wrong though.

http://www.notebookcheck.net/AMD-Radeon-HD-7450M.57211.0.html

This is a 7450M, a 160:8:4 mobile GPU based on Caicos. If you scroll down, you'll see average framerates listed for a few games, and if you hover the mouse over the settings it will tell you the resolution used (along with AA and AF settings). The 7450M came in DDR3 and GDDR5 flavors, both clocked at 700 MHz. One of the games tested is COD: Black Ops (the original). According to this, the DDR3 version averages 52.8 fps at 800x600 with no AA or AF on low settings. BO/BO2 use the same engine as the original Modern Warfare, but each subsequent game adds to it (meaning a PC would run the 5+ year-old MW faster than BO2 on the same hardware).

BO2 on 360/Wii U is close to the PC's high settings (without AO and the extra texture setting) at 880x720 with 2xMSAA and probably 4xAF. The above link claims that at 1024x768 on medium settings, the original BO averaged 36.5 fps. BO2 on Wii U does experience a drop in fps when there is a lot of action on screen, especially in a few realtime cutscenes (like the first one, as seen over at DF), and in the jungle it runs kind of mixed. For the most part, though, the single-player campaign stays closer to the target 60 fps, or at least above 50 fps. I'm not using fps-measuring equipment, just going by eye, which only notices dips below about 50 fps.

Keep in mind this was a DDR3 card, but compare it to the DDR3 version in the link function posted. The laptop CPU was an AMD A6-3420M @ 1.5 GHz. Even without the overhead that comes with Windows, I doubt 160 shader cores could pull it off. Latte is probably a modern, custom 7xx-based GPU, or else the leaked info wouldn't keep alluding to Shader Model 4+ (and + doesn't mean 5). At the very least I would consider 256 shader cores in an unorthodox custom design, but I'm leaning toward 320 due to the block size. If it really were 160, I doubt those outsourced devs would have gotten as close to parity with the 360 as they did, given the time they had to work on their ports. But I'm just a noob; I could be wrong.
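As a rough sanity check on those figures, here is the raw pixel throughput each data point implies. This is only back-of-the-envelope arithmetic: the Wii U fps value is my eyeballed estimate from above rather than a measurement, and the settings differ between the data points.

# Pixel throughput implied by the framerates quoted above.
datapoints = {
    "7450M, BO1, 800x600, low":     (800, 600, 52.8),
    "7450M, BO1, 1024x768, medium": (1024, 768, 36.5),
    "Wii U, BO2, 880x720 (est.)":   (880, 720, 55.0),  # assumed average, eyeballed
}

for name, (w, h, fps) in datapoints.items():
    print(f"{name}: {w * h * fps / 1e6:.1f} Mpixels/s")
# Roughly 25.3, 28.7 and 34.8 Mpixels/s respectively.

And that is before accounting for BO2's heavier per-pixel workload (near-high settings plus 2xMSAA, versus BO1 at low/medium).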
 
Hello everyone, I'm an avid lurker of these forums, and have been following your discussions with great interest.

I see there's still great uncertainty over the SPU count in the Wii U's Latte GPU, and over the various AMD GPUs being used to support or disprove a theory.

There was also mention by various posters in this thread of the RV730 "Mario" (4670) or the AMD 5550 being possible contenders, with their 320 shaders. It also seems that the new prevailing theory is that it contains only 160 shaders and should be compared with the 7450 or the 6450.

As the owner of a still perfectly functioning 2009-era laptop with ONE of the possible cards in play (a Mobility Radeon 4670 1 GB coupled with a Core 2 Duo T9550 @ 2.8 GHz), I can provide some brief impressions of how the card stacks up in some of the games mentioned in this thread.

A game that ported well to the Wii U and was mentioned earlier is Trine 2.
As a point of comparison, the Wii U renders it at 720p with a steady 30 FPS, but without the dynamic scaling present on the other consoles.

Now, on my decrepit laptop with that Core 2 Duo (2008 tech) and a Mobility Radeon 4670 at 843/882 MHz core and memory clocks respectively, the game runs at Very High settings, no AA, but at a staggering 1920x1080, with a stable 30-35 FPS in-game using triple-buffered vsync via D3DOverrider.

Another game often brought up in comparisons is Batman Arkham City.

Ironically, this is a much more demanding game than Trine 2 for my Radeon 4670. The Wii U again runs it at 720p @ 30 FPS.

Since the 4670 doesn't support DX11, I use the DX9 renderer and can run the game at 1680x945 on High settings (Very High includes things like tessellation) at a decent 35-40 FPS with triple-buffered vsync.

Just chipping in; I can test other games if there's interest in further framerate and port-performance comparisons.
 
^ I've always compared the WuuGPU more to an underclocked 4650 than a 4670.

A 4670 is still roughly twice as powerful as Xenos, even in its lowest configurations (480 GFLOPS), whereas a 4650 is in the 380-400 range, still stronger than our initial 352 GFLOPS estimate for the WuuGPU.
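For reference, those GFLOPS numbers fall straight out of the usual shaders x 2 FLOPs (MADD) x clock estimate. A minimal sketch, assuming the commonly speculated 550 MHz clock for Latte (an assumption, not a confirmed spec):

def gflops(shaders: int, clock_mhz: float) -> float:
    """Peak single-precision throughput: shaders * 2 FLOPs per cycle * clock."""
    return shaders * 2 * clock_mhz / 1000.0

print(gflops(320, 750))  # HD 4670 (320 SPs @ 750 MHz)         -> 480.0 GFLOPS
print(gflops(320, 600))  # HD 4650 (320 SPs @ 600 MHz)         -> 384.0 GFLOPS
print(gflops(320, 550))  # speculated Latte, 320 SPs @ 550 MHz -> 352.0 GFLOPS
print(gflops(160, 550))  # the 160-SP theory at the same clock -> 176.0 GFLOPS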
 
The NFS developers just stated that they didn't get final dev kit hardware until November. If this was common, launch ports were also completely built and finalized without access to final hardware. There is just so much unknown about these ports and the console that comparing performance to PC GPUs really does seem premature. Maybe NFS is the best option for that, as they said they are using the high-end assets from their PC build.
 
Whatever the Wii U has in its guts, the fact that they are using the PC assets has nothing to do with some hidden power the Wii U's got.

Memory has been cited time and time again as the main thing devs have lamented this gen, far above even GPU or CPU power.

If you took a 360 with no alterations whatsoever besides boosting the RAM to 1 GB instead of 512 MB, it would have seen a similar improvement in Most Wanted to the Wii U version.

The issue is that the Wii U has half the memory bandwidth of the 360. So developers need to manage the eDRAM to get the most out of the 1 GB, and it just so happens that Criterion had the time and the drive to do that.
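For context, the "half the bandwidth" figure lines up with the commonly cited main-RAM numbers: a 128-bit GDDR3 interface at 1400 MT/s on the 360 versus a 64-bit DDR3-1600 interface on the Wii U. A quick sketch of that arithmetic (the Wii U configuration here is the widely reported one, not an official spec):

def bandwidth_gb_s(bus_bits: int, mt_per_s: float) -> float:
    """Peak bandwidth in GB/s: bytes per transfer * millions of transfers per second / 1000."""
    return bus_bits / 8 * mt_per_s / 1000.0

print(bandwidth_gb_s(128, 1400))  # Xbox 360 main RAM: ~22.4 GB/s
print(bandwidth_gb_s(64, 1600))   # Wii U main RAM:    ~12.8 GB/s, roughly half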
 
Whatever the Wii U has in its guts, the fact that they are using the PC assets has nothing to do with some hidden power the Wii U's got.

Memory has been cited time and time again as the main thing devs have lamented this gen, far above even GPU or CPU power.

If you took a 360 with no alterations whatsoever besides boosting the RAM to 1 GB instead of 512 MB, it would have seen a similar improvement in Most Wanted to the Wii U version.

The issue is that the Wii U has half the memory bandwidth of the 360. So developers need to manage the eDRAM to get the most out of the 1 GB, and it just so happens that Criterion had the time and the drive to do that.

Bolded emphasis added by me. I may be a noob here, but this kind of post has nothing to do with the subject and is out of place. If you took a 360 with no alterations whatsoever besides replacing the GPU with a Radeon 7970, it would be beyond the next console generation. What would that have to do with this topic? And who said anything about "hidden power"? "The issue is that the Wii U has half the memory bandwidth of the 360"? Seriously? The issue of what? What is the issue?
 
Bolded emphasis added by me. I may be a noob here, but this kind of post has nothing to do with the subject and is out of place. If you took a 360 with no alterations whatsoever besides replacing the GPU with a Radeon 7970, it would be beyond the next console generation. What would that have to do with this topic? And who said anything about "hidden power"? "The issue is that the Wii U has half the memory bandwidth of the 360"? Seriously? The issue of what? What is the issue?

Well, if we're talking technically, if you equipped a 360 with a 7970, it would probably explode within a few seconds, not to mention be completely useless even if it did work, given the number of bottlenecks holding it back.

Which comes back to my point:

Having the Wii U's strengths displayed in a game doesn't make the console more than it is.

There are those who would point at Most Wanted or Trine and say conclusively that the Wii U's GPU has to be beyond what we've been speculating to get those results on screen. That's not necessarily the case.

With Most Wanted, they were able to use the system's strengths (primarily the larger amount of RAM it has over its console brethren) to bring in high-resolution assets. It doesn't mean the Wii U GPU is superior to anything we've been speculating here, or that there's some unknown mystery sauce that allows results like these to pan out.

This was in response to Syferz's query.
 
Bolded emphasis added by me. I may be a noob here, but this kind of post has nothing to do with the subject and is out of place. If you took a 360 with no alterations whatsoever besides replacing the GPU with a Radeon 7970, it would be beyond the next console generation. What would that have to do with this topic? And who said anything about "hidden power"? "The issue is that the Wii U has half the memory bandwidth of the 360"? Seriously? The issue of what? What is the issue?

Giving it a Radeon 7970 would be like putting a Formula 1 engine in an unmodified sedan with no other changes and expecting it to top the timesheets in a NASCAR race.
 
BO2 on 360/Wii U is close to the PC's high settings (without AO and the extra texture setting) at 880x720 with 2xMSAA and probably 4xAF.
BO2 definitely did not run at high PC settings. It looks like a mix of low and medium settings: the rendering of the plants looks like low, but the models for the guns and the characters look like medium.
 
Giving it a Radeon 7970 would be like putting a Formula 1 engine in an unmodified sedan with no other changes and expecting it to top the timesheets in a NASCAR race.

Not disagreeing with that; my only point was in response to Inuhanyou's hypothetical of adding RAM to a 360 in order to compare it with the Wii U. AFAIK this topic is about discussing and investigating the hardware that makes up the Wii U, not speculating about how the 360 could be amended to surpass it (referring to Inuhanyou).
 
Not disagreeing with that; my only point was in response to Inuhanyou's hypothetical of adding RAM to a 360 in order to compare it with the Wii U. AFAIK this topic is about discussing and investigating the hardware that makes up the Wii U, not speculating about how the 360 could be amended to surpass it (referring to Inuhanyou).

I clarified my point multiple times via an offhand comparison in regard to Syferz's query about Most Wanted's performance relative to the Wii U's GPU.

The additional ram in the Wii U enabled Criterion to take advantage of potential that the 360 and PS3 lacked. This is not a secret, nor is it a mystery.

Are you disagreeing with my assessment? If so, what is your alternative theory?
 
Well, if we're talking technically, if you equipped a 360 with a 7970, it would probably explode within a few seconds, not to mention be completely useless even if it did work, given the number of bottlenecks holding it back.

Which comes back to my point:

Having the Wii U's strengths displayed in a game doesn't make the console more than it is.

There are those who would point at Most Wanted or Trine and say conclusively that the Wii U's GPU has to be beyond what we've been speculating to get those results on screen. That's not necessarily the case.

With Most Wanted, they were able to use the system's strengths (primarily the larger amount of RAM it has over its console brethren) to bring in high-resolution assets. It doesn't mean the Wii U GPU is superior to anything we've been speculating here, or that there's some unknown mystery sauce that allows results like these to pan out.

This was in response to Syferz's query.
Your point is a bit puzzling.

Higher-res assets require:
  • higher BW for the higher-res textures
  • higher BW (arguably not much of an increase) and higher ALU for the higher-res meshes

Assuming that the only thing that stopped the 360 from utilizing those assets is the amount of memory is just that - an assumption. An assumption can make a hypothesis, but it proves nothing.
 
Your point is a bit puzzling.

Higher-res assets require:
  • higher BW for the higher-res textures
  • higher BW (arguably not much of an increase) and higher ALU for the higher-res meshes

Assuming that the only thing that stopped the 360 from utilizing those assets is the amount of memory is just that - an assumption. An assumption can make a hypothesis, but it proves nothing.

The 360 has the same type of bandwidth system as the Wii U: eDRAM to eDRAM. As you're well aware, blu, it's redundant for me to even say it. It may only have 10 MB, but it's apples to apples, which was my point in using the 360 specifically as an example.

The devs lean on the eDRAM pool to make full use of the 1 GB of RAM that they otherwise could not exploit to full effect with the main pool of slow RAM.

I'm not trying to set up a 360-versus-Wii U scenario here; that wasn't my intention.

My only point was against Syferz's query about the Wii U GPU having something to do with the Wii U's ability to render higher-resolution textures in Most Wanted. Criterion themselves said it was because of the memory, and that's all I was repeating.
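As a rough illustration of how differently the two eDRAM pools size up against a 720p framebuffer, here is some simple buffer math. It is only a sketch: real engines lay out their render targets differently, and the byte counts assume plain 32-bit color and depth with no compression.

def render_target_mb(width: int, height: int, bytes_per_pixel: int, samples: int = 1) -> float:
    """Size of one render target in MB, assuming no tiling or compression."""
    return width * height * bytes_per_pixel * samples / (1024 * 1024)

color = render_target_mb(1280, 720, 4)  # 32-bit color
depth = render_target_mb(1280, 720, 4)  # 32-bit depth/stencil
print("720p color+depth, no AA:  %.1f MB" % (color + depth))        # ~7.0 MB
print("720p color+depth, 2xMSAA: %.1f MB" % (2 * (color + depth)))  # ~14.1 MB
# The 360's 10 MB daughter-die eDRAM needs tiling for the 2xMSAA case, while
# the Wii U's 32 MB MEM1 fits it with plenty of room left for other data.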
 
Well, if we're talking technically, if you equipped a 360 with a 7970, it would probably explode within a few seconds, not to mention be completely useless even if it did work, given the number of bottlenecks holding it back.

Which comes back to my point:

Having the Wii U's strengths displayed in a game doesn't make the console more than it is.

There are those who would point at Most Wanted or Trine and say conclusively that the Wii U's GPU has to be beyond what we've been speculating to get those results on screen. That's not necessarily the case.

With Most Wanted, they were able to use the system's strengths (primarily the larger amount of RAM it has over its console brethren) to bring in high-resolution assets. It doesn't mean the Wii U GPU is superior to anything we've been speculating here, or that there's some unknown mystery sauce that allows results like these to pan out.

This was in response to Syferz's query.

No one is claiming the Wii U is more than it is. Nothing is more than it is.

It's also a bold assumption that more memory in the 360 would allow greater draw distance, better lighting, and the bandwidth for higher-res textures.

If anything, and I'm being generous here, NFS throws a spanner into the theory that the Wii U is BW-starved into impotence, as some have claimed.

It's an enigmatic beast.
 
No one is claiming the Wii U is more than it is. Nothing is more than it is.

It's also a bold assumption that more memory in the 360 would allow greater draw distance, better lighting, and the bandwidth for higher-res textures.

If anything, and I'm being generous here, NFS throws a spanner into the theory that the Wii U is BW-starved into impotence, as some have claimed.

It's an enigmatic beast.

That seems somewhat of a reach.

Different situations all around. Some games will tax a console's bandwidth, some won't; some will be more GPU-centric, others more CPU-centric. It's not a case of "this game performs well, so the hardware should perform well in every situation."

Maybe NFS Most Wanted fits better with that kind of setup. Maybe Criterion knew how to optimize it well (not going to doubt that either, as Criterion are obviously a talented dev team).

I'm going on the actual assertion from multiple devs that RAM has been the most limiting factor of this console generation in what they can do with the hardware. Could it not apply in the scenario we are debating? Sure, but I would not bet on that being the case.

The Wii U's internals aren't that much of a mystery: on par with the current generation, with a marginally more powerful GPU than Xenos and RSX, a slightly weaker CPU, twice the RAM available for games, and three times the eDRAM of the 360. Not all that exotic.

Did we ever find out the bandwidth of the eDRAM?
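For reference, eDRAM bandwidth is usually ballparked as bus width x clock. A sketch with purely hypothetical bus widths at the often-rumored 550 MHz GPU clock; none of these numbers are confirmed for the Wii U.

def edram_bandwidth_gb_s(bus_bits: int, clock_mhz: float) -> float:
    """Peak bandwidth in GB/s: bytes per cycle * clock, assuming one transfer per cycle."""
    return bus_bits / 8 * clock_mhz / 1000.0

for bus_bits in (512, 1024, 2048):
    print(f"{bus_bits}-bit bus @ 550 MHz: {edram_bandwidth_gb_s(bus_bits, 550):.1f} GB/s")
# 512-bit -> 35.2 GB/s, 1024-bit -> 70.4 GB/s, 2048-bit -> 140.8 GB/s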
 
Please keep your Nintendo fanboyism in check. As already stated, given the information we do know, it's evident that the Wii U has no magic that remains to be seen. Yet every single one of your posts tries to suggest otherwise.

This has nothing to do with "fanboyism". We STILL don't have much information about the Wii U hardware, only speculation. The Wii U GPU seems to have been altered from a standard GPU beyond recognition, and it seems it has many fixed functions (especially for lighting and shadows).

That said: of course the Wii U doesn't have the power of a high-end PC. But it obviously has more power than the PS3 or Xbox 360.

As mentioned: Criterion, the developer of Need for Speed, said that the Wii U version of Most Wanted will be the best-looking version and that they are using the PC assets and textures for it; on top of that, the Wii U version has better draw distance, better lighting, and a better framerate (the Xbox 360 version only runs at 30 FPS!).

Such developer statements seem to be much more informative than speculation about chip photos. Interestingly, Criterion also said that they got the final Wii U dev kit in November 2012...
 
This has nothing to do with "fanboyism". We STILL don't have much information about the Wii U hardware, only speculation. The Wii U GPU seems to have been altered from a standard GPU beyond recognition, and it seems it has many fixed functions (especially for lighting and shadows).
Wait, wait, wait... when and where did we establish that there are fixed-function units in the GPU?
 
Just chipping in; I can test other games if there's interest in further framerate and port-performance comparisons.
Thanks for providing some real-world comparable reference figures. It sounds like a 320-SP GPU is capable of far more than the Wii U is showing at present. Is that because the devs aren't able to use those resources (odd drivers, contrary to the way every other GPU works by distributing work across available resources), or because there aren't as many resources to use?
 
Thanks for providing some real-world comparable reference figures. It sounds like a 320-SP GPU is capable of far more than the Wii U is showing at present. Is that because the devs aren't able to use those resources (odd drivers, contrary to the way every other GPU works by distributing work across available resources), or because there aren't as many resources to use?

Well, the evidence is mounting that a 320-SP GPU is more capable than what's in the Wii U. Does anyone have theories to answer Richard's question?

What is your take on the shader blocks being far larger (66%) than Bobcat's? It's pretty much the only remaining argument I see against [function]'s 160-SP theory.
 