Wii U hardware discussion and investigation *rename

Pretty sure the DDR3 isn't clocked at 800MHz. It should be 729MHz. I was told the DSP would be running at 120MHz. Looking at Nintendo's MO, it's probably not really 120MHz, but 121.5MHz - same base clock as the Wii. Nintendo likes clean multipliers, so I would assume the RAM to be clocked at 729MHz (6 x 121.5). Same as the Wii CPU. Nintendo likes to keep RAM and CPU in sync, so the CPU should be running at 1458MHz (12 x 121.5). Accordingly, the GPU would be clocked at 486MHz (4 x 121.5), and the eDRAM at either 486 or 729MHz.
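For reference, here's that hypothetical clock tree spelled out as a quick sketch (the 121.5MHz base and all the multipliers are my guesses above, not confirmed figures):

#include <stdio.h>

int main(void)
{
    /* Hypothetical Wii U clock tree: a single fixed base clock with
       per-component multipliers. The 121.5 MHz base and the multipliers
       are guesses from the post above, not confirmed specifications. */
    const double base_mhz = 121.5;                            /* assumed Wii-derived base clock */

    printf("DSP:   %7.1f MHz (1 x base)\n",  1 * base_mhz);   /*  121.5 */
    printf("GPU:   %7.1f MHz (4 x base)\n",  4 * base_mhz);   /*  486.0 */
    printf("DDR3:  %7.1f MHz (6 x base)\n",  6 * base_mhz);   /*  729.0 */
    printf("CPU:   %7.1f MHz (12 x base)\n", 12 * base_mhz);  /* 1458.0 */
    return 0;
}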

I don't know why Nintendo always seems to do this. I guess using a single fixed base clock and only changing multipliers for various components is simpler. And it definitely gives more predictable results. I don't see Nintendo giving that up.

IIRC (I think it was in an Iwata Asks years ago) they do it to keep the various components synchronized, which works better with fixed multipliers. I'm no tech guy, so I can't verify this.
 
I have to say though that, given what we know today, it seems to punch above its weight even at this point in time. There are a number of multi-platform ports on the system, launch-day ports with all that implies, that perform roughly on par with the established competitors. And those games were developed with neither the greater programmability nor the memory organization of the WiiU in mind. So even without having its strengths fully exploited, it does a similar job at less than half the power draw of its competitors on similar lithographic processes! And it's backwards compatible. To what extent its greater programmability and substantial pool of eDRAM can be exploited to improve visuals further down the line will be interesting to follow.
How what we have seen so far can be construed as demonstrating hardware design incompetence on the part of Nintendo is an enigma to me.

Without really fully understanding the ins and outs of your very good post, I thought this last bit was an important point. To say Nintendo are incompetent or wasteful in their design is wholly inaccurate. They have, in my view, set out to produce a system which can a) receive ports from rival consoles, b) be backwards compatible with the Wii, c) support a modern feature set, d) incorporate their Gamepad tech, and e) do all that with low power draw. And they've come up with a seemingly efficient and elegant solution to that design brief.

Whether it's the right route to go for their next system is another debate. But it's certainly not a poorly designed system just because it's not pushing the performance envelope.
 
Developers have it wrong - the WiiU is powerful

OK, I just read through this article, and a bunch of the comments on top of that. I can't see why I should suffer alone so I dare some of you to read it as well. I warn you, you may feel your brain cells dying as you read through it.

I read the title about 12 hours ago and was thinking of linking it here for comments.
Got 50% of the way through and concluded it was a trash read and wasn't worth posting here, as it wouldn't get anywhere with all the biased optimism and the sheer amount of bad logic.

ME3 and Batman are having framerate issues because of the GPGPU. What?
The fact that they were already released on consoles that don't even have one shows that they work perfectly fine without it, or simply aren't utilizing it.
@davros: you'd probably incur framerate drops in anything if you enable it. Pegging GPGPU as the culprit for the bad fps is another thing entirely.
 
Well, I read it and I am now enlightened.
"but it’s unlikely the PS4 and 720 will feature GPU’s that are any beefier than the Wii U"
@strange: Batman Arkham City did suffer framerate drops when enabling PhysX. It's funny that the writer understands this and then later on forgets and takes the opposite stance.
 
Seriously, what did you expect from the ZeldaInformer website? If that doesn't spell Nintendo fanboys, I don't know what would.
 
Well, I read it and I am now enlightened.
"but it’s unlikely the PS4 and 720 will feature GPU’s that are any beefier than the Wii U"
@strange: Batman Arkham City did suffer framerate drops when enabling PhysX. It's funny that the writer understands this and then later on forgets and takes the opposite stance.

The Wii U is bandwidth limited, and if GPU general-purpose processing means pulling more data from main memory, then performance will fall.

As simple as that.
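As a rough sanity check on that, here's the back-of-the-envelope peak figure (purely illustrative - the 64-bit bus width is an assumption, and the 729MHz clock is just the guess from earlier in the thread):

#include <stdio.h>

int main(void)
{
    /* Peak DDR3 bandwidth = bus width in bytes * 2 transfers per clock * I/O clock.
       The 64-bit bus and 729 MHz clock are assumptions, not confirmed Wii U specs. */
    const double bus_bytes = 8.0;    /* 64-bit main memory bus, assumed      */
    const double clock_hz  = 729e6;  /* I/O clock guess from earlier posts   */

    double peak_gb_s = bus_bytes * 2.0 * clock_hz / 1e9;
    printf("Peak main memory bandwidth: %.1f GB/s\n", peak_gb_s);  /* ~11.7 GB/s */
    return 0;
}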
 
So, what are the chances that the CPU and GPU have a reasonably fast bus between them, and that the eDRAM on the GPU can be properly accessed by the CPU?
This would certainly allow some interesting possibilities.
 
So, what are the chances that the CPU and GPU have a reasonably fast bus between them, and that the eDRAM on the GPU can be properly accessed by the CPU?
This would certainly allow some interesting possibilities.

According to Nintendo, the CPU and GPU do have fast communication between them. I think that is probably true, both because they have no particular reason to bring it up if it isn't the case, and because it isn't really a challenge to have on-package signalling that is very fast relative to the fairly modest processing power.

While it hasn't been officially confirmed that the eDRAM on the GPU is accessible to the CPU, it seems likely. It would definitely help avoid any issues when emulating the Wii's access to its 24MB of 1T-SRAM, and the CPU has its memory requests handled through the GPU anyway, so barring any evidence to the contrary it seems like a reasonable assumption.
 
I've read some people on NeoGAF saying things like "when developers build 'GPU-centric' games, the Wii U will show its next-gen face"...

What does "GPU-centric" mean? Can the devs stop using the CPU and begin to use the GPU for general-purpose code?

All this sounds like Nintendo fan talk to me (IMO).
 
I've read some people on NeoGAF saying things like "when developers build 'GPU-centric' games, the Wii U will show its next-gen face"...

What does "GPU-centric" mean? Can the devs stop using the CPU and begin to use the GPU for general-purpose code?

All this sounds like Nintendo fan talk to me (IMO).

They basically mean that when developers start making games for the Wii U hardware, the graphics will improve. At the moment it's only really running ports, which were not designed around the Wii U's architecture or memory layout.

From what I've seen, some ports on Wii U have better graphics in places and others show worse graphics. But the most obvious thing seems to be reduced shadow resolution, as shown in COD and Mass Effect.

However, if reduced shadows are all that's really suffering in the ports, then I don't think the bandwidth problem is as bad as people think.
 
They basically mean that when developers start making games for the Wii U hardware, the graphics will improve. At the moment it's only really running ports, which were not designed around the Wii U's architecture or memory layout.

From what I've seen, some ports on Wii U have better graphics in places and others show worse graphics. But the most obvious thing seems to be reduced shadow resolution, as shown in COD and Mass Effect.

However, if reduced shadows are all that's really suffering in the ports, then I don't think the bandwidth problem is as bad as people think.

Not one port shows "better" graphics than the current-gen version; even BLOPS2 is sub-HD with the same visuals as the 360 version but worse performance.

And this is what I've read:

GPU-centric:
If developers were to design their games around the WiiU architecture:

- They could run those same games at higher resolutions and higher frame rates with no tearing (ex: 1080p 60fps).

- They could run AAAA next-gen games at lower resolutions (sub-HD or 720p, 30 to 60fps) with a steady frame rate. By the time we see games like this come out, it will probably be mid to late 2014, depending on which developers are still around.

But how many AAAA games will we have next gen? And how many will be multi-plat rather than first-party exclusives? Most likely, the majority of games will stay within the same budget ranges as this gen, with enhancements the WiiU is most likely designed to handle.


Conclusion:
The CPU and memory are good enough to handle current-gen games, which is what Nintendo wanted this first year.

Nintendo appears to have put their money into low latencies and advanced memory controllers to offset memory bandwidth restrictions, so I'm not seeing the memory as a bottleneck. The real bottleneck is the publishers, who decide the money, the budget, and the quality of the team working on ports and games in general.

Then again, publishers could also decide in the future to use the WiiU as a base and cheaply port games to Durango and PS4. If those consoles are that much more powerful than the WiiU, they should run games made on the WiiU with no problem, which means visual and performance differences between WiiU games and the future consoles would be minimal. But this will depend on the target audience and install base the WiiU has by E3.
 
What does "GPU-centric" mean? Can the devs stop using the CPU and begin to use the GPU for general-purpose code?

Well, sort of, in a limited sense. If you're aware of what's been going on in the PC space, we have a lot of games doing physics on the GPU instead of the CPU: maths-based stuff that is parallelizable, like particles (smoke etc.), object destruction, and waves.

Here's an example
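To make the "parallelizable maths" point a bit more concrete, here's a toy per-particle update in C (my own sketch, nothing to do with the example linked above). Every particle is updated independently of the others, which is exactly why this kind of work maps well onto a GPU - each particle can simply become its own thread:

#include <stddef.h>

typedef struct {
    float x, y, z;     /* position */
    float vx, vy, vz;  /* velocity */
} Particle;

/* Advance every particle by one timestep under simple gravity.
   Each iteration only reads and writes its own particle, so the iterations
   are independent of each other - on a GPU, each one would run as its own
   thread instead of looping on the CPU. */
void step_particles(Particle *p, size_t count, float dt)
{
    const float gravity = -9.81f;
    for (size_t i = 0; i < count; ++i) {
        p[i].vy += gravity * dt;   /* integrate velocity */
        p[i].x  += p[i].vx * dt;   /* integrate position */
        p[i].y  += p[i].vy * dt;
        p[i].z  += p[i].vz * dt;
    }
}

Object destruction and wave simulation have the same shape: lots of small, independent maths jobs with little branching, which is the sweet spot for GPGPU.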
 
Sounds like a lot of wishful thinking to me.
If it can't run games at higher resolution now, I doubt it ever will; about the easiest thing in the world to change is the resolution you're running at.
Resolution is also not impacted by CPU performance.

Could ports be better on WiiU?
Probably, but I don't know how much better.

Can WiiU compete with 720/PS4 if they constitute a significant performance improvement? That I very much doubt.

Will WiiU be the lead platform for next gen games? I just don't see it.
 
Well, sort of, in a limited sense. If you're aware of what's been going on in the PC space, we have a lot of games doing physics on the GPU instead of the CPU: maths-based stuff that is parallelizable, like particles (smoke etc.), object destruction, and waves.

Here's an example

But we are talking about using the Wii U GPU for CPU things. I know your example - dual GPUs on PC for physics and so on - but a good GPU for graphics + a second GPU for GPGPU + a good CPU (PC) is not the same as a poor CPU + an (unknown) GPU handling both graphics and GPGPU.

I don't know if I am explaining my point well.
 
But we are talking about using the Wii U GPU for CPU things. I know your example - dual GPUs on PC for physics and so on - but a good GPU for graphics + a second GPU for GPGPU + a good CPU (PC) is not the same as a poor CPU + an (unknown) GPU handling both graphics and GPGPU.

I don't know if I am explaining my point well.

PhysX can be used with a single GPU.
This demo is essentially demonstrating what GPGPU is best used for.

I believe many people have stated that trying to get GPGPU to take over other CPU tasks is a big no-no.
 
PhysX can be used with a single GPU.
This demo is essentially demonstrating what GPGPU is best used for.

What is the performance using one GPU compared to dual GPUs?

I believe many people have stated that trying to get GPGPU to take over other CPU tasks is a big no-no.

It's not that it's a big no-no, I mean it's just not "magic". If the Xbox 720 and/or PS4 use an APU (for physics and GPGPU things) plus a discrete GPU, along with a better CPU than the Wii U's, how can GPGPU do magic for the Wii U?

And I am not saying that the Wii U is less powerful than today's consoles.
 