Wii U hardware discussion and investigation *rename

Have you not read years, maybe decades, of horror stories about devs dealing with Nintendo, going back to the NES days (when they seemed at their worst)? Come now.

Nintendo haven't been the same since the N64.
They were painful to deal with in the NES/SNES/N64 days; post-N64 they've had a hard time getting third-party support, and they're now amongst the easiest to deal with.
As I understand it, Sony were really painful to deal with in the PS2 era, for much the same reason Nintendo were in the SNES era.
 
That's specifically a complaint about WiiWare.
As I understand it, publishing with Nintendo today is relatively painless compared to PS3/360. And you should understand that no one's submission process is entirely painless.

Trust me, back in the SNES days dealing with Nintendo was "special". I know of one game that was submitted over 30 times, with a required 100 hours of VHS tape with every submission. It was bounced because of non-specific problems on a limited-run version of the hardware that the publisher could not even obtain.
Mario Club was the worst implementation of a good idea ever. Nintendo would have three experts rank your game and send you their reports together with a score. Invariably the reviewers would disagree, and you would be docked points for the conflicting viewpoints: "Game is too hard", "Game is too easy".
The worst part of it was that the points mattered: if you exceeded the arbitrary rating, the game didn't count against the publisher's limited publishing slots.
 
Christ. Who actually sat down and WATCHED 100 hours of taped gameplay??? That's ludicrous. I start getting itchy after 15 minutes of speedrun play on YouTube, and that's typically recorded by creative and entertaining people...

Surely they made that demand just to screw with 3rd party devs.
 
It used to be required that any game submission was accompanied by videotape footage of every possible path through the game.
I have no idea if it was ever watched. I do know that for the game in question, after the first couple of submissions they just copied the tapes rather than having a tester actually record new footage.
 
What are the stats on glitches in games on Nintendo systems versus the other platforms?
As a consumer I like being able to simply put a game in and have it work properly the first time.
 
The last time I submitted a cross-platform title, which was a while ago, the submission process was much the same on all of the platforms.
Platform vendors only do basic QA now; the publishers do the bulk of the testing.
 
So am I imagining it, that games on Nintendo systems seem to be less glitchy?
In fact, I can't remember ever patching a Nintendo game, or even a third-party game on a Nintendo system.
 
We submit the same code on all three platforms for a cross-platform game; very occasionally you might patch an issue if something turned up in a platform submission.
But there is no magic in Nintendo's testing.
 

I never suggested magic; that would be unscientific.
Just curious about the actual facts. I can only go by my own anecdotal experience, though, and it seems to suggest that games on Nintendo systems are overall less glitchy.
 
Alleged Wii U specs have leaked; they look pretty legit, and a GAF mod says they are similar to what he had.

Also seems similar to some things bg has said.

Nothing much on the GPU though (shader counts, clocks, etc.). Maybe people more knowledgeable can glean more from the specs than I can.

Main Application Processor

PowerPC architecture.
Three cores (fully coherent).
3MB aggregate L2 Cache size.
core 0: 512 KB
core 1: 2048 KB
core 2: 512 KB
Write gatherer per core.
Locked (L1d) cache DMA per core.

Main Memory

Up to 3GB of main memory (CAT-DEVs only). Note: retail machine will have half the devkit memory.
Please note that the quantity of memory available from the Cafe SDK and Operating System may vary.

Graphics and Video

Modern unified shader architecture.
32MB high-bandwidth eDRAM, supports 720p 4x MSAA or 1080p rendering in a single pass (sanity-checked just after this list).
HDMI and component video outputs.
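
A quick back-of-envelope check on that single-pass claim (my own math, not part of the leak; the RGBA8 colour and 32-bit depth/stencil formats are assumptions):

```c
#include <stdio.h>

int main(void)
{
    const double MB = 1024.0 * 1024.0;

    /* Assumed formats: 4-byte RGBA8 colour + 4-byte depth/stencil,
       both multisampled at 4x for the 720p case. */
    double p720  = 1280.0 * 720.0;
    double p1080 = 1920.0 * 1080.0;

    double msaa4_720   = p720 * 4 * (4 + 4);  /* 4 samples, colour + Z */
    double single_1080 = p1080 * (4 + 4);     /* 1 sample,  colour + Z */

    printf("720p 4xMSAA:  %.1f MB\n", msaa4_720 / MB);    /* ~28.1 MB */
    printf("1080p no AA:  %.1f MB\n", single_1080 / MB);  /* ~15.8 MB */
    return 0;
}
```

Both targets fit comfortably in 32MB, so the claim is at least self-consistent.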

Features

Unified shader architecture executes vertex, geometry, and pixel shaders
Multi-sample anti-aliasing (2, 4, or 8 samples per pixel)
Read from multi-sample surfaces in the shader
128-bit floating point HDR texture filtering
High resolution texture support (up to 8192 x 8192)
Indexed cube map arrays

8 render targets
Independent blend modes per render target
Pixel coverage sample masking
Hierarchical Z/stencil buffer
Early Z test and Fast Z Clear
Lossless Z & stencil compression
2x/4x/8x/16x high quality adaptive anisotropic filtering modes
sRGB filtering (gamma/degamma)
Tessellation unit
Stream out support
Compute shader support

GX2 is a 3D graphics API for the Nintendo Wii U system (also known as Cafe). The API is designed to be as efficient as GX(1) from the Nintendo GameCube and Wii systems. Current features are modeled after OpenGL and the AMD r7xx series of graphics processors. Wii U’s graphics processor is referred to as GPU7.
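
GX2 itself is under NDA, but since the feature list above maps onto D3D10/GL4-class hardware, here's roughly what "8 render targets" with "independent blend modes per render target" looks like in plain desktop OpenGL. Illustrative only: these are standard GL calls, not GX2 ones, and the function and variable names are mine.

```c
/* Standard OpenGL 4.x sketch (not GX2): 8 colour attachments with
 * per-target blend state. Assumes a GL 4.x context is current and the
 * textures in tex[] have been generated with glGenTextures. */
#include <GL/glew.h>

void setup_mrt(GLuint fbo, GLuint tex[8], int w, int h)
{
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    GLenum bufs[8];
    for (int i = 0; i < 8; ++i) {
        glBindTexture(GL_TEXTURE_2D, tex[i]);
        glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA8, w, h);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0 + i,
                               GL_TEXTURE_2D, tex[i], 0);
        bufs[i] = GL_COLOR_ATTACHMENT0 + i;
    }
    glDrawBuffers(8, bufs);  /* fragment shader writes all 8 outputs */

    /* "Independent blend modes per render target": per-index blend
     * state has been core since GL 4.0. */
    glEnablei(GL_BLEND, 0);
    glBlendFunci(0, GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); /* RT0 blended */
    glDisablei(GL_BLEND, 1);                               /* RT1 opaque  */
}
```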
Sound and Audio

Dedicated 120MHz audio DSP.
Support for 6 channel discrete uncompressed audio (via HDMI).
2 channel audio for the Cafe DRC controller.
Monaural audio for the Cafe Remote controller.

Networking

802.11 b/g/n Wifi.

Peripherals

2 x USB 2.0 host controllers x 2 ports each.
SDCard Slot.

Built-in Storage

512MB SLC NAND for System.
8GB MLC NAND for Applications.

Host PC Bridge

Dedicated Cafe-to-host PC bridge hardware.
Allows File System emulation by host PC.
Provides interface for debugger and logging to host PC.

Also from the GAF mod:
My brief takeaway would be that this seems similar to rumors we had from quite a while back:

-3 core CPU
-1.5 GB of RAM in the retail version
-A DX10-compliant GPU: the shaders are unified (shader model 4.0), but it's not shader model 5.0 compute-shader class

1.5 GB of RAM is 0.5 GB more than I expected, if it pans out.

Also could be from an older dev kit.
 
"8 render targets" means 8 ROPs?
More 8 render targets :)

By the way, BGAssassin was right about the weird L2 cache.
I was more than sceptical about his credibility; I was wrong.
I don't get this:
Write gatherer per core.
Locked (L1d) cache DMA per core.

What does that mean?

We're indeed missing critical information:
How many SIMDs?
Clock speeds?
Out-of-order execution?
 
I was watching a 720p feed, and the IQ TT Games had managed to get in the Lego game was just unbelievably bad: tearing, pop-in, and framerate issues in a really simple game.
 
I don't know about the clock speeds (CPU/GPU), but I can't say I'm turned on by the specs.

The 32MB of eDRAM seems to have taken away too significant a part of the silicon budget; possibly the part of the budget which would have allowed the Wii U to be a more proper in-between design.

Going by the early IBM comment, one could think that the 32MB of eDRAM is on the CPU. That makes for quite a big chip.
The last Blue Gene chip is ~360 mm² at 45nm, and it has 32MB of cache. I would assume the Wii U's pool of eDRAM will be denser, as it's not a cache (simpler, no ECC, etc.), but still.
On 32nm things get better, but it's still a significant investment (rough numbers below).
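
For a rough sanity check, assuming IBM's published ~0.067 µm²/bit cell size for their 45nm SOI eDRAM (that figure, and the ~2x array overhead for sense amps, redundancy and wiring, are my assumptions):

```c
#include <stdio.h>

int main(void)
{
    /* Assumption: ~0.067 um^2 per bit (IBM 45nm SOI eDRAM);
     * the ~2x overhead factor is a guess, not a datasheet number. */
    const double cell_um2 = 0.067;
    const double bits = 32.0 * 8.0 * 1024.0 * 1024.0;  /* 32 MB */

    double raw_mm2 = bits * cell_um2 / 1e6;  /* um^2 -> mm^2 */
    printf("raw cell area:     ~%.0f mm^2\n", raw_mm2);      /* ~18 */
    printf("with ~2x overhead: ~%.0f mm^2\n", raw_mm2 * 2);  /* ~36 */
    return 0;
}
```

Even at ~36 mm², that's a meaningful slice of a console-sized die at 45nm, which is the point.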

I could see Nintendo, like MS, having three chips in the device, with the eDRAM on the CPU being just the 3MB of cache. That allows for great density and nice power characteristics.

I definitely don't like it. Especially for Nintendo and their intended targets, less bandwidth-intensive forms of AA would have been good enough and would have freed up quite a few transistors.
 