Wii U hardware discussion and investigation *rename

I don't tend to draw conclusions ever; I'm just leaning more in favour of the existing evidence given what's been said about it. I don't buy the 'AA's not enabled yet' idea. We've had 2xMSAA for donkey's years. 32 MB of eDRAM, 720p frames, and no AA?? You may be right, but the evidence is strongly in favour of the lower-spec'd machine: what we've seen at E3, Nintendo's known hardware preferences, and the various rumours. I'll be very shocked indeed if we end up with a console significantly more powerful than PS360.
Rumor has it that the eDRAM isn't just a framebuffer. Depending on what else you're using it for, you might simply not have enough space left for real AA (Batman and Trine use FXAA).
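For a rough sense of the budget, here's a back-of-the-envelope sketch, assuming 32-bit colour plus a 32-bit depth/stencil buffer per sample (the actual Wii U framebuffer formats aren't public, so treat the byte counts as assumptions):

```cpp
// 720p framebuffer footprint vs. a 32 MB eDRAM pool, assuming
// 4 bytes colour + 4 bytes depth/stencil per sample (hypothetical formats).
#include <cstdio>

int main() {
    const double pixels = 1280.0 * 720.0;
    const double bytes_per_sample = 4.0 + 4.0;   // colour + depth/stencil
    for (int samples = 1; samples <= 4; samples *= 2) {
        double mb = pixels * bytes_per_sample * samples / (1024.0 * 1024.0);
        std::printf("%dxMSAA: %.1f MB of 32 MB\n", samples, mb);
    }
    // Prints roughly 7.0 / 14.1 / 28.1 MB -- 4xMSAA alone nearly fills the
    // pool, leaving little room for anything else you park in eDRAM.
    return 0;
}
```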
 
Originally Posted by Espresso
It isn't POWER7. It isn't SPU or Cell. It isn't a 4xx. It is the same core as Wii's, with three of them and larger L2s, clocked a little bit faster.

This has so much the ring of truth, because it's just so Nintendo to shatter hardware expectations in the wrong direction.

I'd pretty much bet money it's the case.
 
This multi-platform issue will hurt Nintendo. They should have set out a GPU/CPU spec that is 50% better, with 2 GB of RAM, with the emphasis on future cross-platform compatibility.

That way they could have had top dog for a year to build a good install base, then make the Wii U an attractive, easy port later on when it isn't the lead platform.
 
It doesn't even seem like decent hardware is costly (same goes for Microsoft if they are actually considering Cape Verde over Pitcairn; I mean, Pitcairn is ~210 mm²).

Cape Verde pushes a teraflop-plus and is 123 mm². Too much? I'm sure they could have grabbed teraflop hardware from the R700 line even cheaper.
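For reference, the teraflop claim checks out on paper if you plug in the desktop card clocks (using the usual FMA-counts-as-two-FLOPs convention; a console part would likely clock lower, so treat this as an upper bound):

```cpp
// Peak theoretical throughput = ALUs x 2 FLOPs (one FMA per cycle) x clock.
// Clocks are the desktop cards' (HD 7770 / HD 4870), not any console's.
#include <cstdio>

double peak_gflops(int alus, double clock_ghz) {
    return alus * 2.0 * clock_ghz;
}

int main() {
    std::printf("Cape Verde XT: %.0f GFLOPS\n", peak_gflops(640, 1.0));   // ~1280
    std::printf("RV770 (R700):  %.0f GFLOPS\n", peak_gflops(800, 0.75));  // ~1200
    return 0;
}
```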

It's just that Nintendo is not interested, just like last time with the Wii.
 
Yep, just like Wii. Only this time they haven't even got BC as an excuse. We'll have to see what their launch price is like to see if they could have been more adventurous.
 
This has so much the ring of truth, because it's just so Nintendo to shatter hardware expectations in the wrong direction.

I'd pretty much bet money it's the case.
The problem is that Espresso also said the chip would be as fast as Xenon. I have no idea how you'd ever reach that level of performance with a low-clocked (closer to 729 MHz than 3.2 GHz, also according to Espresso), single-threaded (again: Espresso) three-core 750.
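Just to put numbers on that, using the clocks from the post itself (how much per-clock throughput a 750 actually gets over an in-order PPE core on real game code is exactly the unknown):

```cpp
// To match a 3.2 GHz core at ~729 MHz with the same core count, each 750
// core would need roughly 4.4x the per-clock throughput (IPC). Whether an
// out-of-order 750 gets anywhere near that vs. Xenon is the open question.
#include <cstdio>

int main() {
    const double xenon_ghz    = 3.2;    // per the post
    const double broadway_ghz = 0.729;  // per the post
    std::printf("required IPC advantage: %.1fx\n", xenon_ghz / broadway_ghz);
    return 0;
}
```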
 
Does it make any tiny bit of sense to just glue together three ten-year-old CPUs?

I'm pretty sure that general performance per transistor should be quite a bit higher on newer architectures.
Would IBM sell a PowerPC 750 architecture license for much less than a POWER6 or POWER7?

The Wii might have been an up-clocked GameCube with more memory (though some said they doubled the number of TEV units?), so they could afford to just reuse the same core, but the Wii U is a whole different animal. Would they really save that much money by skipping technological advances here?
 
The problem is that Espresso also said the chip would be as fast as Xenon. I have no idea how you'd ever reach that level of performance with a low-clocked (closer to 729 MHz than 3.2 GHz, also according to Espresso), single-threaded (again: Espresso) three-core 750.

From what I hear (and I trust my source on this as reliable), the CPU chip in the Wii U is almost certainly weaker than Xenon and Cell, which is why some devs have outright refused to port their existing (in-development) PS360 games to it. I believe Nintendo intended the GPU compute support to make up for their weak CPU choice, but the CPU performance is so bad that devs would be required to refactor their entire engines to get stuff working on the GPU instead, and for many devs that would be impossible on their current project schedules.
;)
 
From what I hear (and I trust my source on this as reliable), the CPU chip in the Wii U is almost certainly weaker than Xenon and Cell, which is why some devs have outright refused to port their existing (in-development) PS360 games to it. I believe Nintendo intended the GPU compute support to make up for their weak CPU choice, but the CPU performance is so bad that devs would be required to refactor their entire engines to get stuff working on the GPU instead, and for many devs that would be impossible on their current project schedules.
;)
Interesting. So the system isn't all that conventional after all, huh?
 
From what I hear (and I trust my source on this as reliable), the CPU chip in the Wii U is almost certainly weaker than Xenon and Cell, which is why some devs have outright refused to port their existing (in-development) PS360 games to it. I believe Nintendo intended the GPU compute support to make up for their weak CPU choice, but the CPU performance is so bad that devs would be required to refactor their entire engines to get stuff working on the GPU instead, and for many devs that would be impossible on their current project schedules.
;)

Well, that's... just sad (though things pointed in this direction). I wonder if that was the primary focus of the tweaking period the 5th kit went through.

EDIT: Also goes back to what I said about Nintendo and their pursuit of "balance" in their hardware. Looks like it bit them even worse than I originally thought.

EDIT 2: But I am glad to see the emphasis on a GPGPU. I didn't expect that from them.
 
From what I hear (and I trust my source on this as reliable), the CPU chip in the Wii U is almost certainly weaker than Xenon and Cell, which is why some devs have outright refused to port their existing (in-development) PS360 games to it. I believe Nintendo intended the GPU compute support to make up for their weak CPU choice, but the CPU performance is so bad that devs would be required to refactor their entire engines to get stuff working on the GPU instead, and for many devs that would be impossible on their current project schedules.
;)

If the CPU is literally Broadway × 3 at 45 nm plus more cache, this would be the case.

If it is, however, something in the 470 line, then this wouldn't really be the case. The IBM embedded stuff like the 470S would be deceptive in that respect. Your friend could see a clock rate of "1.6 GHz" or "1.8 GHz" or something and assume it's much worse than the 3.2 GHz Xenon, but in reality the out-of-order 470 line would thrash Xenon in most general-purpose code. However, just looking at it on paper (i.e. "holy crap, it's only 1.6 GHz!?") wouldn't give any non-programmer that idea, and they'd instantly assume it's a far weaker unit.
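To illustrate the kind of "general purpose code" meant here, consider a toy sketch like this, where raw clock tells you very little:

```cpp
// Latency-bound code: every iteration waits on the previous load, and the
// loop branch depends on freshly loaded data. An in-order core mostly
// stalls here regardless of clock; an out-of-order core can keep more work
// in flight, which is why a 1.6 GHz OoO part can beat a 3.2 GHz in-order
// one on this kind of workload.
struct Node {
    int value;
    const Node* next;
};

int sum_list(const Node* n) {
    int total = 0;
    while (n) {               // branch depends on the just-loaded pointer
        total += n->value;
        n = n->next;          // classic pointer chase
    }
    return total;
}
```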
 
Now the true believers will claim the GPU is so much better it doesn't matter... compute shaders, believe!

Wii U will crash and burn next spring if support is much better on cheaper machines. Nintendo can't afford this CPU.
 
Now the true believers will claim the GPU is so much better it doesn't matter... compute shaders, believe!

And Jaguar cores in the other consoles would make them right, wouldn't they?

Wii U will crash and burn next spring if support is much better on cheaper machines. Nintendo can't afford this CPU.

According to you, the Wii U has been crashing and burning for what... two years now?
 
Now the true believers will claim the GPU is so much better it doesn't matter... compute shaders, believe!

Wii U will crash and burn next spring if support is much better on cheaper machines. Nintendo can't afford this CPU.

You might want to fix that sig, unless you believe Tim Sweeney is also laughable. ;)
 
From what I hear (and I trust my source on this as reliable), the CPU chip in the Wii U is almost certainly weaker than Xenon and Cell, which is why some devs have outright refused to port their existing (in-development) PS360 games to it. I believe Nintendo intended the GPU compute support to make up for their weak CPU choice, but the CPU performance is so bad that devs would be required to refactor their entire engines to get stuff working on the GPU instead, and for many devs that would be impossible on their current project schedules.
;)
So much for ease of porting games. Has Nintendo finalized the CPU yet, or could we see a change before launch? If not, why would Nintendo do this?
 
why would Nintendo do this?
Maybe they're all masochists on the Nintendo board and love to have pain and suffering inflicted upon themselves?

I can't say that the prospect of the GameCube CPU making yet another appearance in a console appeals to me. Iwata's an electronics engineer, and I have always respected him for that, but has the man simply gone INSANE? The basic design of that CPU is like fifteen years old now, or close to it anyway. Surely there are ARM cores today with higher performance both per watt and per transistor than that old PPC nag.
 
I don't believe even for a second that Nintendo decided to bet on GPGPU, given its current state of development and adoption.

I do believe that the presence of a fairly advanced dedicated sound DSP (X-Fi-like?), along with a souped-up "Starlet 2", allowed for a simpler CPU.
Maybe the "SoC" that many people keep talking about does have a fairly powerful ARM CPU (multi-core?) in it, which could offload some tasks from the skimpy "main" CPU.

That would totally cock-block ports from the X360 and PS3, which would go against one of Nintendo's main selling points for the new console... again.
 
From what I hear (and I trust my source on this as reliable), the CPU chip in the Wii U is almost certainly weaker than Xenon and Cell, which is why some devs have outright refused to port their existing (in-development) PS360 games to it. I believe Nintendo intended the GPU compute support to make up for their weak CPU choice, but the CPU performance is so bad that devs would be required to refactor their entire engines to get stuff working on the GPU instead, and for many devs that would be impossible on their current project schedules.
;)
'Weak' CPU is relative. Xenon and Cell are pretty weak at conventional code but strong at floating point. It'd be possible for a less potent FP performer to still be stronger on useful code.

If devs are offloading CPU vector maths onto the GPU, that would account for results being well below what the hardware could do, as devs have to refactor their code once again. As if XB360 and PS3 ports weren't trouble enough. I can also see why they wouldn't bother with Wuu: the market for the same titles will be a tiny fraction of PS360's. Is it really worth the effort? And if, while considering developing for Wuu, you chat with dev mates at other studios and hear that they aren't going to bother, that only snowballs the effect.

Of course, GPGPU would go against Nintendo's claims that they want to make the platform dev-friendly. It'd also be a strange change of direction for them.
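As a sketch of why that refactor is so painful (a toy example, nothing Wii U-specific):

```cpp
// A typical CPU-side vector-maths job of the sort PS360 engines run on
// VMX/SPUs: a toy particle integrator. Moving this to GPU compute is not
// a recompile -- the data has to live in GPU buffers, the loop becomes a
// kernel dispatch, and every downstream system that read the results on
// the CPU needs to be reworked around the new synchronisation points.
#include <vector>

struct Particle { float px, py, pz, vx, vy, vz; };

void integrate(std::vector<Particle>& particles, float dt) {
    for (auto& p : particles) {
        p.px += p.vx * dt;
        p.py += p.vy * dt;
        p.pz += p.vz * dt;
    }
}
```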
 
Pulling this totally out of my a**, but what are the chances that the C1 core (the one with 2 MB of L2 cache) is in fact a module with four cores, or at least 4-thread SMT? What else would the huge L2 be good for?
 
From what I hear (and I trust my source on this as reliable), the CPU chip in the Wii U is almost certainly weaker than Xenon and Cell, which is why some devs have outright refused to port their existing (in-development) PS360 games to it. I believe Nintendo intended the GPU compute support to make up for their weak CPU choice, but the CPU performance is so bad that devs would be required to refactor their entire engines to get stuff working on the GPU instead, and for many devs that would be impossible on their current project schedules.
;)

This is just insane, if true.

In an ideal world, Nintendo would just give IBM and ATI a set price point and let them use their best technologies to come up with the hardware without interfering. Nintendo aren't really at the cutting edge of 3D tech, and their internal hardware team arguably don't always make the right choices.
 