Wii U hardware discussion and investigation *rename

Ah, there we go, much cleaner now that the last round of illogical noise has been removed. At least now nobody will be able to cite it elsewhere as though being posted here lent it any validation.
 
720p with FSAA is 14/28 MB for 2x/4x. I'm not sure what Megafenix is talking about, because the figures are the same everywhere, but you would have to tile with FSAA.
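For anyone wanting to sanity-check those figures, the back-of-the-envelope maths (assuming the usual 32-bit colour + 32-bit depth/stencil per sample) works out as:

1280 x 720 = 921,600 pixels x 8 bytes x 2 samples ≈ 14 MB (2x)
1280 x 720 = 921,600 pixels x 8 bytes x 4 samples ≈ 28 MB (4x)

which is also why anything beyond 1x at 720p has to be tiled within the 360's 10 MB of eDRAM.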

Do we know how much eDRAM is there?
Do you mean on the WiiU? I think it has 32 MB of eDRAM according to rumours, iirc, but perhaps I am mistaken.

Developers Shin'en have been praising the WiiU recently, and they are very happy with how it performs. They speak highly of the console and don't quite understand why other developers can't use it as effectively and do justice to the platform.

“The Wii U eDRAM has a similar function as the eDRAM in the XBOX360. You put your GPU buffers there for fast access. On Wii U it is just much more available than on XBOX360, which means you can render faster because all of your buffers can reside in this very fast RAM. On Wii U the eDRAM is available to the GPU and CPU. So you can also use it very efficiently to speed up your application.”

“The 1GB application RAM is used for all the games resources. Audio, textures, geometry, etc. Theoretical RAM bandwidth in a system doesn’t tell you too much because GPU caching will hide a lot of this latency. Bandwidth is mostly an issue for the GPU if you make scattered reads around the memory. This is never a good idea for good performance.”

“I can’t detail the Wii U GPU but remember it’s a GPGPU. So you are lifted from most limits you had on previous consoles. I think that if you have problems making a great looking game on Wii U then it’s not a problem of the hardware.”
http://mynintendonews.com/2013/09/2...if-devs-cant-create-good-looking-wii-u-games/
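To make the "scattered reads" point a bit more concrete, here's a trivial CPU-side sketch (nothing Wii U specific, just an illustration of linear vs random access - the kind of pattern any cache, GPU or CPU, either can or can't hide):

/* Summing the same array two ways: the sequential loop lets the cache and
   prefetcher do their job; the index-jumping loop produces the scattered
   reads Shin'en are warning about and ends up bound by memory latency. */
#include <stddef.h>

float sum_sequential(const float *data, size_t n) {
    float s = 0.0f;
    for (size_t i = 0; i < n; ++i)
        s += data[i];               /* contiguous: whole cache lines get used */
    return s;
}

float sum_scattered(const float *data, const size_t *order, size_t n) {
    float s = 0.0f;
    for (size_t i = 0; i < n; ++i)
        s += data[order[i]];        /* random jumps: mostly cache misses */
    return s;
}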
 
Wait till after the big guns are all out and see if that lifts the sales, namely

Mario, Zelda, Donkey Kong, Mario Kart, and the new Wii Fit.
 
Do you mean on the WiiU? I think it has 32 MB of eDRAM according to rumours, iirc, but perhaps I am mistaken.

Developers Shin'en have been praising the WiiU recently, and they are very happy with how it performs. They speak highly of the console and don't quite understand why other developers can't use it as effectively and do justice to the platform.

http://mynintendonews.com/2013/09/2...if-devs-cant-create-good-looking-wii-u-games/


It's worth remembering that Shin'en are effectively 2nd party Nintendo developers, although I don't personally think that changes the validity of what they are saying. They're a notoriously talented bunch of devs who (outside Shin'en) have a tonne of experience on all kinds of platforms. It's always great to hear their insights as they actually give out quite a bit of info, plus they have a reputation for squeezing the most out of whatever platform they're working on. As some hip gangsta forum member once put it, they are "old school devs" who are known for their "tight coding", yo. (I'm paraphrasing, but that was the gist.)

Most interesting part is:
Shin'en said:
On Wii U the eDRAM is available to the GPU and CPU. So you can also use it very efficiently to speed up your application.

That's good to know, right?
 
That's good to know, right?

I think we more or less knew this, fail0verflow (Marcan) revealed a long time ago that there's 32MB of eDRAM mapped in the CPU's address space. Although that doesn't really say just how fast the CPU's access to it is. I wouldn't count on a really high speed bus between the CPU and GPU, but who knows.

I remember on PSP the VRAM (also eDRAM) was directly accessible by the CPU as well, and lower latency than the main memory.. I tried using it for normal buffers once but it didn't really help me ;p
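For what it's worth, the standard way to answer "how fast is the CPU's access to it really" is a pointer-chasing loop run over the region in question versus main RAM - dependent loads expose latency rather than bandwidth. A rough sketch (how you actually get a pointer to the eDRAM window is SDK/homebrew specific, so the buffer here is just a stand-in):

#include <stddef.h>
#include <stdint.h>

/* region[] must be pre-filled with a shuffled cycle of indices so that each
   load tells you where to load next. Because every access depends on the
   previous one, (total time / steps) approximates the average load latency
   of whatever memory region[] lives in. */
uint32_t chase(const volatile uint32_t *region, size_t steps) {
    uint32_t idx = 0;
    for (size_t i = 0; i < steps; ++i)
        idx = region[idx];          /* dependent load: no overlapping requests */
    return idx;                     /* returned so the loop can't be optimised away */
}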
 
It's worth remembering that Shin'en are effectively 2nd party Nintendo developers, although I don't personally think that changes the validity of what they are saying. They're a notoriously talented bunch of devs who (outside Shin'en) have a tonne of experience on all kinds of platforms. It's always great to hear their insights as they actually give out quite a bit of info, plus they have a reputation for squeezing the most out of whatever platform they're working on. As some hip gangsta forum member once put it, they are "old school devs" who are known for their "tight coding", yo. (I'm paraphrasing, but that was the gist.)

I think it's worth remembering that there are "old school devs" capable of "tight coding" that are senior engineers in all big studios and all big engine makers.

Shinen are currently getting fellated by the Nintendo crowd because they need a hero to believe in, while other talented developers are shit on because they don't support the narrative that morons want to propel. The truth is that all developers are constrained by the realities of their situation. Vague but positive comments - when comparing the Wii U to a seven year older system ffs(!!!!) - don't really tell us much. They just help to promote that particular studio and their game to a userbase (understandable), but the flames of delusion can be fanned as collateral damage.
 
I also take a developer's Wii U praise with a little more salt when I know they develop exclusively for Nintendo consoles. Especially when they're talking about the console's strengths vs a previous console they never released anything for. I take this as only slightly removed from boasting made by Nintendo themselves.

Then again, I don't see anything that strikes me as technically wrong with their comments - yes, (from everything we know) Wii U's eDRAM is a lot more flexible than XBox 360's, accessible to the CPU, and the GPU supports GPGPU stuff. It's just a question of how pertinent this is. That last comment is the real sticking point... ignoring that "great looking games" can mean pretty much anything, the deeper implication is that anyone who criticizes anything about Wii U's hardware just sucks at game development.
 
I think it's worth remembering that there are "old school devs" capable of "tight coding" that are senior engineers in all big studios and all big engine makers.

Of course there are, and their comments wouldn't warrant being disregarded either. I'm just pointing out that they are 2nd party developers, but their experience and expertise mean it's always great to hear what they say. They usually go into a lot of detail. They also aren't saying anything too controversial, just things which we don't generally hear talked about aloud regarding WiiU.

Shinen are currently getting fellated by the Nintendo crowd because they need a hero to believe in, while other talented developers are shit on because they don't support the narrative that morons want to propel. The truth is that all developers are constrained by the realities of their situation. Vague but positive comments - when comparing the Wii U to a seven year older system ffs(!!!!) - don't really tell us much. They just help to promote that particular studio and their game to a userbase (understandable), but the flames of delusion can be fanned as collateral damage.

Wow, you read a lot into that! ;) Sure, some Nintendo fanboys are going to feverishly rally around anything positive because it supports their narrative, just as some Sony/MS fanboys are going to do their best to piss all over the same information because it doesn't support theirs. That's why I mentioned both that a) they are effectively 2nd party, so you can argue a conflict of interest; and b) 2nd party or no, they are well regarded developers and generally have interesting (and rare) insight into the hardware. I don't see any other developers (who have actually developed for WiiU and have come out with actual technical insight into their experience) being "shit on", as you put it?

As I said, I don't even think Shin'en said anything too controversial or shocking, tbh. I just thought the confirmation that the eDRAM was available to both CPU and GPU was worthy of note and wanted to chime in before the usual "lolz, they're in Nintendo's pockets so of course they'd say that" or "rofl, they haven't developed for any other platforms so what do they know!" rubbish which usually follows their comments (although as Exophase pointed out, the eDRAM stuff was already known, so I guess it wasn't that interesting after all :cry:)

PS: the "tight coding" bit was meant as a joke as I thought it was funny that someone genuinely used that term in the same sentence as "old school" :D Brap Brap!
 
I think we more or less knew this, fail0verflow (Marcan) revealed a long time ago that there's 32MB of eDRAM mapped in the CPU's address space. Although that doesn't really say just how fast the CPU's access to it is. I wouldn't count on a really high speed bus between the CPU and GPU, but who knows.

If I remember correctly, the 32MB figure was questioned at some point to support the 55 (and 45?) nm manufacturing node hypothesis. But yes, most of it was known. It would be nice if they threw out some info on the shaders they use and polycounts.

[EDIT] BTW is 32MB@45nm possible given the die area?
 
If I remember correctly, the 32MB figure was questioned at some point to support the 55 (and 45?) nm manufacturing node hypothesis. But yes, most of it was known. It would be nice if they threw out some info on the shaders they use and polycounts.

[EDIT] BTW is 32MB@45nm possible given the die area?


Wouldn't it just :) At this point I don't see what the point is in keeping those things secret. If the point was to hide WiiU's shortcomings in comparison to the other two next gen consoles - then it is now moot, as everyone has come to a conclusion anyway. If the point was to stop people worrying about graphics so as to keep the focus on games etc. - then that is also moot, as we've spent WiiU's entire existence analysing and discussing its hardware! If anything, keeping hardware specs under wraps just serves to take the focus off gameplay even more. :D
 
I think we more or less knew this, fail0verflow (Marcan) revealed a long time ago that there's 32MB of eDRAM mapped in the CPU's address space. Although that doesn't really say just how fast the CPU's access to it is. I wouldn't count on a really high speed bus between the CPU and GPU, but who knows.

I remember on PSP the VRAM (also eDRAM) was directly accessible by the CPU as well, and lower latency than the main memory.. I tried using it for normal buffers once but it didn't really help me ;p

The Wii U CPU has a total of 3MB of eDRAM as L2 cache, and the 32MB of eDRAM is embedded in the Wii U's GPU, so in theory the Wii U's CPU shouldn't have access to the GPU's eDRAM pool since they are on separate chips. Can anyone explain to me how the CPU can have direct access to the GPU's eDRAM?

http://en.wikipedia.org/wiki/POWER7

IBM mentioned Power7 out of nowhere in connection with the Wii U, and it was later "debunked", except that Power7 can use eDRAM as L3 cache and 32MB is supported, though it should be on the same chip, right? Also, the max eDRAM per core as L3 cache was 4MB, so it is possible that the Wii U has something from Power7 implemented, and we know the Wii U CPU is made by IBM on a 45nm process like Power7 chips. That leaves the possibility open, right?

Were the Xbox 360 and PlayStation 3 CPUs a mixture of the PowerPC and Power series?

Anyway, after the latest update some users are reporting louder/faster fans and that their Wii Us are warmer, even close to hot. Can anyone measure the power consumption of the Wii U right now? If it is above 30 watts then it would point to bumped clocks, and it would be confirmed if it goes above 40 watts.

The Wii U's GPU must be a Radeon HD 5550. I know some people will disagree, though Call of Duty: Black Ops 2's local co-op zombie mode proves that it has Eyefinity, and there are hints/evidence:

http://www.ign.com/boards/threads/o...ower-thread.452775697/page-203#post-481660123

So... I am new to this forum, so please don't be harsh.
 
Been out of the loop for a good long while with regard to what's been discussed about Wii U, and what has been discovered nearly a year after release.

I was wondering if the level of quality seen in the updated UE3 demo from 2010 would be possible on Wii U at 720p/30fps?

http://www.youtube.com/watch?v=kjxL_J_7j9M

I realize UE4 is far beyond the scope of its capability, but since UE3 is fine, why not the improved version?

It kind of reminds me of the Japanese Garden / Bird demo shown on Wii U a year later during the reveal, but I wanted to know what others thought. I haven't yet played with an actual Wii U as I kinda lost interest in it around this time last year.
 
The Wii U CPU has a total of 3MB of eDRAM as L2 cache, and the 32MB of eDRAM is embedded in the Wii U's GPU, so in theory the Wii U's CPU shouldn't have access to the GPU's eDRAM pool since they are on separate chips. Can anyone explain to me how the CPU can have direct access to the GPU's eDRAM?

http://en.wikipedia.org/wiki/POWER7

IBM mentioned Power7 out of nowhere in connection with the Wii U, and it was later "debunked", except that Power7 can use eDRAM as L3 cache and 32MB is supported, though it should be on the same chip, right? Also, the max eDRAM per core as L3 cache was 4MB, so it is possible that the Wii U has something from Power7 implemented, and we know the Wii U CPU is made by IBM on a 45nm process like Power7 chips. That leaves the possibility open, right?

Were the Xbox 360 and PlayStation 3 CPUs a mixture of the PowerPC and Power series?

Yes the Power 7 thing was PR guff, basically. At the very best, the only resemblance between Espresso and Power 7 is that they can both access eDRAM as a cache. The assertion that it was "based on Power 7" though, was nonsense.

Anyway, after the latest update some users are reporting louder/faster fans and that their Wii Us are warmer, even close to hot. Can anyone measure the power consumption of the Wii U right now? If it is above 30 watts then it would point to bumped clocks, and it would be confirmed if it goes above 40 watts.

The Wii U's GPU must be a Radeon HD 5550. I know some people will disagree, though Call of Duty: Black Ops 2's local co-op zombie mode proves that it has Eyefinity, and there are hints/evidence:

http://www.ign.com/boards/threads/o...ower-thread.452775697/page-203#post-481660123

So... I am new to this forum, so please don't be harsh.

The WiiU using more than 30 watts is fairly well known and documented, and says nothing about clocks being "bumped". The console's power draw will creep up from initial measurements simply because the initial batch of games wouldn't necessarily have taxed the system to its limits. Given Nintendo's own admission that the console will use ~40W in typical usage, a power draw above 30W wouldn't imply anything regarding clock speeds increasing. Anecdotal evidence of increased heat can't really be used as evidence for anything either.

Regarding the BLOPS2 comment: WiiU's GPU isn't a "Radeon HD" anything. It's a custom GPU based on the specification of the RV7xx* series GPUs (*according to apparently leaked specs).

Been out of the loop for a good long while with regard to what's been discussed about Wii U, and what has been discovered nearly a year after release.

I was wondering if the level of quality seen in the updated UE3 demo from 2010 would be possible on Wii U at 720p/30fps?

http://www.youtube.com/watch?v=kjxL_J_7j9M

I realize UE4 is far beyond the scope of its capability, but since UE3 is fine, why not the improved version?

UE4 (the engine) is not beyond WiiU's capability at all. The demo they used to promote UE4 is beyond WiiU from a processing perspective, because it was running on some very beefy PCs (even the slightly dumbed-down PS4 version is obviously beyond what WiiU can run) - but the engine behind it is not beyond WiiU's capability; it just lacks official support from Epic. This simply means that if a developer wants to use UE4 for their WiiU game they won't get any official support from Epic.

And re your query about the UE3 demo: I don't see any graphical effect/feature in it that would be inherently impossible on WiiU; however, I've no idea whether the demo itself would require more processing power than WiiU has. Whether that "level of quality" is possible at a particular resolution/framerate is impossible to answer, I'm afraid. "Level of quality" is subjective, for a start ;)
 
The Wii U CPU has a total of 3MB of eDRAM as L2 cache, and the 32MB of eDRAM is embedded in the Wii U's GPU, so in theory the Wii U's CPU shouldn't have access to the GPU's eDRAM pool since they are on separate chips. Can anyone explain to me how the CPU can have direct access to the GPU's eDRAM?
Direct access means that it is mapped into the CPU's address space, not that there is a direct data bus between the CPU and the eDRAM, if that's what you meant.
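In other words, "mapped into the address space" just means ordinary loads and stores work once you have an address for the window - the requests still travel over whatever link sits between the CPU and GPU dies. A sketch of what that looks like in code (the base address below is completely made up, not the real Wii U mapping):

#include <stddef.h>
#include <stdint.h>

/* Hypothetical virtual address of the eDRAM window - illustrative only. */
#define EDRAM_BASE ((volatile uint32_t *)0xD0000000u)

static inline void edram_write(size_t word, uint32_t value) {
    EDRAM_BASE[word] = value;    /* a plain store into the mapped window */
}

static inline uint32_t edram_read(size_t word) {
    return EDRAM_BASE[word];     /* a plain load; speed depends on the bus, not the mapping */
}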
 
Latte's eDRAM bandwidth is 563.2 GB/s thanks to the 8192-bit, 32MB eDRAM from Renesas clocked at 550 MHz, and it is embedded in the GPU.

Espresso is around 15 GFLOPS and its performance is comparable to a Pentium E5800 or an i5 480M.
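At least the arithmetic behind those headline numbers checks out if you take the rumoured specs at face value (which of course says nothing about whether the specs themselves are right): an 8192-bit interface is 1024 bytes per clock, and 1024 B x 550 MHz = 563.2 GB/s. The CPU figure similarly follows if you assume 3 cores at ~1.24 GHz doing 4 FLOPs per cycle each (2-wide paired singles with multiply-add): 3 x 1.24 x 4 ≈ 14.9 GFLOPS.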
 