Wii U hardware discussion and investigation *rename

Status
Not open for further replies.
Nope. Rumours have placed it at 32 MB consistently, but given the POWER7 confusion, that may be a figure from nowhere. It's what I believe is in there though. If significantly less (e.g. 10 or 16 MB), Wii U will have lost much of its main redeeming feature as regards system design.
 
Is the eDRAM shared with the CPU?

I expected the memory bandwidth to be slightly up from the 360's to help with the tablet burden. But this is really something else, and it will badly affect performance.

GPU is somewhat irrelevant now other than "features"
 
Wow, this entire launch seems like a disaster.

As pointed out on GAF, it's now entirely possible Wii U will be in a worse position relative to the PS4/720 than the Wii was to the 360/PS3 (considering it appears Wii U may be at or below current gen, whereas the Wii was presumably above the Xbox and certainly above the PS2).

I did say way back when I figured Nintendo would find a way to badly bottleneck the thing...

And there's a 5GB, one hour plus download out of the box for basic functionality. If Microsoft launched a console in this state the internet would have burned down from the rage.
 
I don't know what it's for, but it seems there might be another memory chip on the bottom side of the PCB? Close to the MCM, manufactured by Hynix.


Nope. Rumours have placed it at 32 MB consistently, but given the POWER7 confusion, that may be a figure from nowhere. It's what I believe is in there though. If significantly less (e.g. 10 or 16 MB), Wii U will have lost much of its main redeeming feature as regards system design.
The VGleaks specs are apparently spot-on and pretty much copied and pasted from the Nintendo SDSG website (warioworld.com). Assuming they are indeed correct, it's 32MB, and those 32MB are the "main RAM" - MEM1, as Nintendo calls it. The 2GB DDR3 are secondary memory (MEM2).
 
The half-res shadowmap in this Batman Wii U screenshot could be explained by the memory architecture.
On the X360 you rasterize the shadowmap in eDRAM and move it to main RAM for future use as a texture.
On the Wii U you rasterize the shadowmap in eDRAM too, but in some cases it's probably better to leave the shadowmap in eDRAM and sample it directly from there (because unlike the X360, the Wii U's eDRAM is probably a video memory space for the ROPs and TMUs, so textures/maps can live there to relieve main RAM). That frees up some main RAM bandwidth, but eDRAM is a limited space, which could explain the low-res choice.
It will be the same dilemma for MSAA; I think there will be a lot of 720p without MSAA.
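To make that eDRAM dilemma concrete, here is a rough space-budget sketch; the 32 MB pool size is the rumoured figure, and the buffer formats and shadowmap resolutions below are illustrative assumptions, not confirmed settings:

```python
# Rough eDRAM space-budget sketch (all sizes assumed, not confirmed hardware figures).
EDRAM_BYTES = 32 * 1024 * 1024  # rumoured Wii U MEM1 size

def mb(n):
    return n / (1024 * 1024)

w, h = 1280, 720
color = w * h * 4               # RGBA8 colour buffer, 4 bytes/pixel
depth = w * h * 4               # D24S8 depth/stencil, 4 bytes/pixel
shadow_full = 2048 * 2048 * 4   # hypothetical full-res 32-bit shadowmap
shadow_half = 1024 * 1024 * 4   # half resolution in each dimension

used = color + depth + shadow_full
print(f"720p colour+depth: {mb(color + depth):.1f} MB")
print(f"+ 2048^2 shadowmap: {mb(used):.1f} MB of {mb(EDRAM_BYTES):.0f} MB")
# 4x MSAA multiplies the colour/depth footprint roughly 4x:
msaa4 = (color + depth) * 4
print(f"720p 4xMSAA colour+depth alone: {mb(msaa4):.1f} MB")
```

Under these assumptions a full-res shadowmap plus 720p colour/depth nearly fills the pool, which makes dropping the shadowmap to half resolution a plausible trade-off.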

Nice picture, thanks. Looks like the Wii U uses some anisotropic filtering at least, because it looks a bit sharper.
 
I always suspected the main memory would be on a pretty narrow bus. 64bits would not surprise me.


Wii main RAM latency: roughly 6 ns, random access, with the texture + framebuffer RAM requiring 4 ns latency.

So based on this there has to be 32 MB of eDRAM in the GPU, at a minimum of 500 MHz, and with at least a 4096-bit wide memory interface.

Put simply, the main memory latency is not good enough for Wii hardware emulation (it is slower than the embedded 24 MB of eDRAM in the Wii).
 
Hey Alstrong became an internet celebrity for calculating the total bandwidth :)

Yeah the floor textures seem a bit sharper, but the lower res shadowmaps are a lot more noticeable, making the Wii U version look worse.
I don't know what resolution the X360 is outputting (sub-720p or not?) but it's certain that the Wii U isn't doing any better, which is a big let down by itself.
 
Wii main RAM latency: roughly 6 ns, random
I doubt it, DDR3-1600 or whatever it is has quite a bit higher latencies for random access, especially if you don't just look at the memory, but also include the memory controller. It is probably closer to 60ns than 6ns :rolleyes:. GPUs can usually only dream of 60ns memory latency, it's far higher (and for the Wii U GPU probably, too).

And stop making things up with no basis other than some almost irrelevant latency numbers!
 
Nope. Rumours have placed it at 32 MB consistently, but given the POWER7 confusion, that may be a figure from nowhere. It's what I believe is in there though. If significantly less (e.g. 10 or 16 MB), Wii U will have lost much of its main redeeming feature as regards system design.

Indeed.
I can understand if they have gone with a 64-bit interface to DDR3, but damn, that makes the achievable results extremely dependent on how well the MEM1 memory pool is utilized. That's bound to bite them, hard, on ports where the focus understandably will be on things other than ripping out and rewriting otherwise perfectly well functioning game code.
I wonder if it would be possible to get a thread going on eDRAM usage and consequences.

The WiiU design is extremely GPU-centric compared to any previous console design. It will be interesting to see how its future competitors are balanced.
 
I doubt it, DDR3-1600 or whatever it is has quite a bit higher latencies for random access, especially if you don't just look at the memory, but also include the memory controller. It is probably closer to 60ns than 6ns :rolleyes:. GPUs can usually only dream of 60ns memory latency, it's far higher (and for the Wii U GPU probably, too).

And stop making things up with no basis other than some almost irrelevant latency numbers!
The latency figures come from Nintendo:

Framebuffer:
approx. 2 MB, sustainable latency: 6.2 ns [1T-SRAM]

Texture memory:
approx. 1 MB, sustainable latency: 6.2 ns [1T-SRAM]

Texture read bandwidth:
10.4 GB/second [peak]

Memory bandwidth:
2.6 GB/second [peak]
http://www.nintendo.de/Kundenservic...Technische-Daten/Technische-Daten-619165.html
 
Yes [edit: you linked some GameCube specs, but close enough; the higher clock speed of the Wii reduced the latency to ~4 ns]. But he was answering a post about the Wii U's 64-bit DDR3 main memory (that's why I thought he was referring to the Wii U; I probably stopped reading his latency posts very thoroughly :oops:). He is trying to establish requirements for emulating the old Wii and claims the latencies are too high to get it done. I think this leads nowhere.

And btw, the Wii also had 64MB of GDDR3 as its largest memory pool, with very likely a much higher latency. Even the 24MB of 1T-SRAM probably had a higher latency than the much smaller texture and framebuffer pools (above a certain size, latency tends to be dominated by the wiring, not the memory cells themselves), especially since it is not on-die but has to be accessed over an external interface to a second die.
Left: Broadway; right: Hollywood with its components Vegas (GPU with 3 MB integrated 1T-SRAM) on top and Napa (1T-SRAM chip with 24 MB) on the bottom

 
Wow, this entire launch seems like a disaster.

As pointed out on GAF, it's now entirely possible Wii U will be in a worse position relative to the PS4/720 than the Wii was to the 360/PS3 (considering it appears Wii U may be at or below current gen, whereas the Wii was presumably above the Xbox and certainly above the PS2).

I did say way back when I figured Nintendo would find a way to badly bottleneck the thing...

And there's a 5GB, one hour plus download out of the box for basic functionality. If Microsoft launched a console in this state the internet would have burned down from the rage.


Yeah, the launch hasn't exactly been peachy, has it? That download and the slow OS aren't the best first impressions! Let's hope they sort the latter out.

Must admit I fail to see how the RAM speed changes the Wii U's position going into the next gen that drastically. Not saying it doesn't matter, but it's what a lot of people were claiming it would be, isn't it?


Edit: Plus I think it's confirmed now that it's a 1GB download. Still massive though.
 
I doubt it, DDR3-1600 or whatever it is has quite a bit higher latencies for random access, especially if you don't just look at the memory, but also include the memory controller. It is probably closer to 60ns than 6ns :rolleyes:. GPUs can usually only dream of 60ns memory latency, it's far higher (and for the Wii U GPU probably, too).

And stop making things up with no basis other than some almost irrelevant latency numbers!

Great, the Wii's 24 MB of embedded memory has less than 10 ns latency.

So it is not possible to use the main memory pool for Wii hardware emulation.

But it means the eDRAM has to be big enough to fit the Wii's 24 MB, so the minimum size is 32 MB (if we consider the framebuffer + texture buffer as well).

There are four possible scenarios after this:
a) They use the texture cache as the texture buffer for legacy Wii games; in that case the minimum bandwidth is defined by the framebuffer data width, 64 bits per 1 MB of eDRAM. So in that case it is 2048-bit, at 500 MHz (minimum).
b) They use the eDRAM for the texture buffer as well, and instead of 1 MB they use 4 MB (the low 256 KB of each megabyte) with 128 bits of width per 1 MB: a 4096-bit interface, 500 MHz minimum.
c) 2 MB texture + 2 MB frame + 24 MB main + 4 MB free -> 256 bits/MB: 8192-bit, 500 MHz minimum.
d) 1 MB texture + 2 MB frame + 24 MB main + 5 MB free -> 512 bits/MB: 16384-bit, 500 MHz minimum.

All of this assumes they didn't implement some fancy pre-compiler for Wii games, with per-game microcode management.
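The bus-width scenarios above all reduce to the same formula, peak bandwidth = bus width × clock; a minimal sketch (both the 500 MHz clock and the bus widths are the poster's assumptions, not confirmed specs):

```python
# Peak bandwidth of an on-die memory pool: bus width (bits) / 8 x clock (Hz).
# Bus widths and the 500 MHz clock are assumptions from the scenarios above.
def peak_gbs(bus_bits, clock_hz):
    return bus_bits / 8 * clock_hz / 1e9

for bus in (2048, 4096, 8192, 16384):
    print(f"{bus:>5}-bit @ 500 MHz: {peak_gbs(bus, 500e6):.0f} GB/s")
```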
 
PCPer did a teardown. Great stuff... they actually read off the RAM chip (SMRT!). There are only 4 chips.

Samsung K4W4G1646B -> DDR3 4Gbit, 1.5V, 800/933/1066 speed bins -> I'm pretty sure that means "DDR3-1600/1866/2133" data rate.

256Mx16 would imply 16-bit I/O per DRAM... So... 17GB/s at most for main memory bandwidth. They didn't mention the rest of the numbers on the DRAM, but they ought to have high res photos later.
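As a sanity check on that figure, peak DDR3 bandwidth is just bus width × data rate; a small sketch, where the 64-bit bus follows from the four x16 chips but the speed bin actually used by the console is an assumption:

```python
# Peak DDR3 bandwidth: bus width (bits) / 8 x data rate (MT/s).
# The 64-bit bus follows from 4 chips x 16-bit I/O; the bin used is assumed.
def ddr_peak_gbs(bus_bits, mt_per_s):
    return bus_bits / 8 * mt_per_s * 1e6 / 1e9

for rate in (1600, 1866, 2133):  # speed bins supported by the Samsung part
    print(f"DDR3-{rate} on a 64-bit bus: {ddr_peak_gbs(64, rate):.1f} GB/s")
```

The top bin gives the ~17 GB/s ceiling mentioned above.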

It doesn't make any sense.
The GC was designed to be able to read/write four times its memory every frame.
In this case the Wii U can read only a quarter of the main memory every frame.

Why did they include this much memory?
It is simply a waste of money; 512 MB should be enough.
Most of the memory will just serve as a cache for DVD reading.
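Taking the ~17 GB/s peak quoted above at face value, a quick per-frame budget sketch (the 60 fps target and 2 GB pool size are assumptions, and real-world efficiency would be lower than peak):

```python
# Per-frame main-memory traffic budget, assuming ~17 GB/s peak (DDR3-2133, 64-bit).
peak_gbs = 17.0        # assumed peak bandwidth, GB/s
fps = 60               # assumed frame rate
per_frame_mb = peak_gbs * 1024 / fps
total_mb = 2 * 1024    # 2 GB of DDR3

print(f"~{per_frame_mb:.0f} MB of traffic per frame; "
      f"touching all {total_mb} MB once takes ~{total_mb / per_frame_mb:.1f} frames")
```

By this arithmetic, touching the whole 2 GB once takes roughly seven frames, so the fraction readable per frame is closer to one-seventh than a quarter.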
 
Yeah the floor textures seem a bit sharper, but the lower res shadowmaps are a lot more noticeable, making the Wii U version look worse.
I don't know what resolution the X360 is outputting (sub-720p or not?) but it's certain that the Wii U isn't doing any better, which is a big let down by itself.

I'm not here to state whether the Wii U is more powerful or not, because I can't provide proof, but I'm sure these games are using ported engines, not engines built from scratch with the Wii U in mind.
The other consoles, where the majority of the games and engines will come from at the beginning, are all GPU-bound hardware, while the Wii U is a CPU-bound/limited console, so one would need an entirely different approach and different solutions to get the best out of it. (It would be nice to see Rage on the Wii U, for example; I wonder if it would be possible to use GPU transcoding to "help" the CPU, like on the PC.)

I'm sure that sooner or later devs will learn how to achieve the same or better visuals than you can see on the Xbox 360 or the PS3, because the GPU is probably more powerful (I assume).
 