Wii U hardware discussion and investigation *rename

Status
Not open for further replies.
What do they do? The DS, 3DS and Wii - that's what their extensive R&D investments have given us. Maybe they are creating custom chips, but the hardware they sell would be better if they used more potent standard components. ;)

To be fair, I'm rather inclined to agree with Shifty on this one. The proof is in the pudding, and if extensive expenditure on HW R&D has only given us relatively dated HW in the last three Ninty platforms, then perhaps they need to do some restructuring and simply pay AMD/IBM/Nvidia to design performant systems.
 
What do they do? The DS, 3DS and Wii - that's what their extensive R&D investments have given us. Maybe they are creating custom chips, but the hardware they sell would be better if they used more potent standard components. ;)

Shifty, Nintendo's handheld & console philosophies do in fact differ. The SNES-GC were all very capable machines, esp. the N64. The Wii was more of an aberration. Even internally at Nintendo, there was a heated debate amongst their engineers if such an incremental power/clock increase would be sufficient in the marketplace vs. their competitors.
 
What do they do? The DS, 3DS and Wii - that's what their extensive R&D investments have given us. Maybe they are creating custom chips, but the hardware they sell would be better if they used more potent standard components. ;)

Hundreds of millions in R&D came after the success of the DS and Wii, which is when Nintendo started making lots and lots of money, so you can't point to that spending to explain those systems.

True, the 3DS is underpowered, and it showed Nintendo they can't get too greedy on hardware specs. That should be a reason to expect more from the Wii U, not less.
 
Beating the handheld dead horse real quick, Nintendo handhelds have been behind since the Gameboy started getting competition. It's pretty obvious the handheld line has a different philosophy as it's been the same since the beginning.

Acert, my apologies for the late reply. I do indeed have "sources" working with development kits, though the information is not flowing as freely as you may assume. The SDK version can actually be verified online, if you know where to look. I told you one aspect of the GPU; I meant the "lost in translation" comment quite literally, as my Japanese is a tad rusty. Other than it being of a custom & more modern design, the GPU is as much of a mystery to me as to anyone else here (not due to a lack of trying to obtain the information, I assure you). Yes, I know that "custom" is a nebulous statement; it is truly a matter of degrees. And no console GPU appears out of a vacuum, so I'm trying to obtain information on its base. I had heard of the CPU target specifications some time ago online, & simply verified whether these were accurate or not. Due to the late arrival of the system, I was expecting something much more powerful than the Xenon. Regardless, some information was shared freely, other information was not.

The details you seek, specifically clock speeds, SPUs, ROPs, FLOPs, TUs, etc., I do not possess. I certainly would have leaked them if it would not compromise my sources' position. The V5 dev kit seemed to still be focusing on 3rd-party engine optimizations, efficiency, etcetera, perhaps including a 5-15% performance boost as well. A "transitional console" would be an apt description based upon what I've gathered thus far. No outrageous power claims, as Orbis & Durango will definitely outclass it (the Durango much more so, as MS have been imbibing the Mark Rein laced punch). The strength will, as I said previously, be in proprietary engine development. The Wii U can still perform visual effects that the PS3 & 360 cannot, which is only to be expected; why anyone here expected less is beyond my comprehension.

You definitely know more than I do, but from what I do know I can say this is accurate.
 
Shifty, Nintendo's handheld & console philosophies do in fact differ. The SNES-GC were all very capable machines, esp. the N64. The Wii was more of an aberration. Even internally at Nintendo, there was a heated debate amongst their engineers if such an incremental power/clock increase would be sufficient in the marketplace vs. their competitors.

Keep the hope up. Iwata's HW strategy doesn't give a dime about history from 10+ years ago.
 
Keep the hope up. Iwata's HW strategy doesn't give a dime about history from 10+ years ago.

Hardware doesn't just include the CPU and GPU. Hundreds of millions have been spent on other aspects of hardware. As an example, Sony has been stuck with the same controller design since 1999. It's about time they evolved there, and spent less money on silicon R&D.


Which is exactly what's happening.

The Wii was the aberration, the handhelds have followed a pretty predictable hardware path since their inception.
 
What can we expect from these fixed functions?

Li Mu Bai, you say that it will in some situations be much more. Do you mean when the games utilize the fixed functions, or when properly optimized, or both? Do you have any rough estimations as to how much more power the Wii U delivers?
And can anyone tell me if it would be hard to implement? You say it happens automagically, so it sounds like they call some API and it adds the things to the shaders for you; wouldn't that make it rather easy to implement?

The question now is whether 3rd parties are going all out again, which means higher development costs, or only making a smaller jump, which would put the Wii U in a sweet spot.
 
Shifty, Nintendo's handheld & console philosophies do in fact differ. The SNES-GC were all very capable machines, esp. the N64. The Wii was more of an aberration.

I see things a bit differently.

The SNES had a nice graphics processor (low resolution aside) but a weak as heck CPU; two years later than the Megadrive but with less than half the CPU power. The N64 was passed over by Sega some time in late 1993 / early 1994 - Nintendo released it across 1996 and 1997 after a process shrink. The GC was utterly outclassed by the (uberlo$$) Xbox and was in fact less powerful than the PS2 (certainly in terms of handling multiplatform games). Nintendo has never chased the high end, but has when necessary made carefully selected elements of its consoles highly competitive with enemy hardware.

The SNES, N64 and GC were, however, all capable machines and so was the Wii. The Wii was definitely capable of doing what Nintendo needed it to, just like the SNES, N64 and GC. And just like the Gameboy, Gameboy Advance, DS and 3DS.

Nintendo are amazing. They have never made a huge hardware misstep like Sony, Sega or Microsoft did (carts aside) and continue to succeed and innovate after 30 years. I don't expect Nintendo to spend more on hardware this time than they need to, and they don't need to far exceed the PS360 to make a go of the Wii U.
 
I see things a bit differently.

The GC was utterly outclassed by the (uberlo$$) Xbox and was in fact less powerful than the PS2 (certainly in terms of handling multiplatform games).

The Xbox had a wonderful GPU but a comparatively shitty CPU.
I do not, however, see in which way the GC was "less powerful than the PS2".

That's an assertion I'd like you to back up with real world evidence - not Sony's "theoretical peak" numbers.
 
You've already been told this actually (first link + quote):

http://forum.beyond3d.com/showpost.php?p=1634264&postcount=2

The Gamecube was a marvel of engineering in comparison to its competitors of that particular generation.
Hmm wouldn't have been my assessment.
It was by far the worst performing of the 3 platforms. You should see the performance penalty when god forbid the GPU has to clip a polygon, it was so bad I actually wrote code to traverse triangle lists and clip tris with the CPU.
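For context on what "clip tris with the CPU" involves, here is a minimal sketch of Sutherland-Hodgman style clipping of one triangle against a single plane, in C. This is purely illustrative (all names and the exact approach are my own assumptions, not ERP's actual code); it shows why doing this per-triangle on the CPU is expensive:

```c
typedef struct { float x, y, z; } Vec3;

/* Signed distance from point p to the plane n.p = d (n assumed unit length). */
static float plane_dist(Vec3 p, Vec3 n, float d) {
    return p.x * n.x + p.y * n.y + p.z * n.z - d;
}

static Vec3 lerp3(Vec3 a, Vec3 b, float t) {
    Vec3 r = { a.x + (b.x - a.x) * t,
               a.y + (b.y - a.y) * t,
               a.z + (b.z - a.z) * t };
    return r;
}

/* Clip a triangle against one plane; writes 0, 3 or 4 vertices to out.
 * Returns the vertex count of the resulting polygon. */
static int clip_tri(const Vec3 tri[3], Vec3 n, float d, Vec3 out[4]) {
    int count = 0;
    for (int i = 0; i < 3; i++) {
        Vec3 a = tri[i], b = tri[(i + 1) % 3];
        float da = plane_dist(a, n, d), db = plane_dist(b, n, d);
        if (da >= 0.0f)
            out[count++] = a;                      /* keep front-side verts   */
        if ((da >= 0.0f) != (db >= 0.0f))          /* edge crosses the plane  */
            out[count++] = lerp3(a, b, da / (da - db));
    }
    return count;
}
```

Running every triangle in a list through per-edge distance tests and interpolation like this, every frame, is exactly the kind of work the GPU was supposed to absorb.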

http://forum.beyond3d.com/showpost.php?p=1634357&postcount=4

perhaps but having actually written code for all 3 platforms in that generation that's my assessment.
You could get pretty good initial performance out of GC, but usually that was pretty much it, no amount of dicking around would get you more polygons or more pixels.
PS2 was a pain in the ass; poor implementations were really bad.
Xbox was basically underutilized because no one could be bothered.

You might get prettier pixels out of a Gamecube than a PS2, but usually only because you couldn't be bothered trying to figure out how to make the PS2 produce the better imagery.

Gamecube's "big win" was the processor's relatively large cache and the low latency main memory. I always felt it was designed that way as an overreaction to the horrible memory latency on N64 and all the developer bitching about it.
But the memory subsystem was over-engineered; the large cache for the most part removed the advantage of the low latency memory...

http://forum.beyond3d.com/showpost.php?p=1634576&postcount=12

FWIW I don't ever consider any developer who ships anything lazy.
When you're building a cross platform game, there is always an element of lowest common denominator, it's about costs (and I don't just mean financial).
PS2 was often the "lead SKU" at big publishers because of the installed base, Xbox was a version you had to do, in most cases you could write a simple version of your renderer and just drop the assets on Xbox and they would usually run faster. So you'd increase texture quality and call it done.
Usually when you dropped it on gamecube it would run slower and you'd have no memory left, so you downsample to make things fit, figure out how you could use ARAM without crippling performance and ship it.

If you wrote an Xbox exclusive with no intention of ever shipping on PC, and you actually spent time optimizing, there was a lot of performance to be had. Usually most titles were CPU limited because the polygon indices had to be copied into the GPU ring buffer (which wasn't actually a ring buffer). If your app was pushing a lot of geometry it could literally spend 60% of its time doing nothing but linear memory copies.
It was possible to place jumps into the ringbuffer, to effectively "call" static GPU buffers, but it was tricky to get right because of the pipeline and the fact you had to patch the return address as a jump into the buffer so you'd have to place fences between calls to the same static buffer.
If you did this however you could trivially saturate the GPU and produce something much better looking.
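The "jump into a static buffer and patch its return jump" trick can be modeled schematically. The toy C below is not the real Xbox push-buffer format (the actual packet encoding, fencing, and pipelining details differed and are not described in the post); it only illustrates the control flow ERP describes, with all opcodes and names invented for illustration:

```c
#include <stdint.h>

/* Toy command stream: OP_DRAW submits N triangles, OP_JUMP redirects the
 * front-end to another buffer, OP_END stops. "Calling" a pre-built static
 * buffer = jump into it, having first patched its trailing jump to return
 * past the call site, instead of memcpy'ing its contents into the ring. */
enum { OP_END, OP_DRAW, OP_JUMP };

typedef struct { int op; intptr_t arg; } Cmd;

/* Simulate the GPU front-end walking the stream; returns triangles drawn. */
static long run(const Cmd *pc) {
    long tris = 0;
    while (pc->op != OP_END) {
        if (pc->op == OP_DRAW) { tris += pc->arg; pc++; }
        else                   { pc = (const Cmd *)pc->arg; } /* OP_JUMP */
    }
    return tris;
}

/* Patch the static buffer's last slot to jump back to ret, then turn the
 * ring slot into a jump into the static buffer. */
static void call_static(Cmd *ring_slot, Cmd *static_buf, int static_len, Cmd *ret) {
    static_buf[static_len - 1] = (Cmd){ OP_JUMP, (intptr_t)ret };
    *ring_slot = (Cmd){ OP_JUMP, (intptr_t)static_buf };
}
```

Because the return address lives inside the static buffer itself, two in-flight "calls" to the same buffer would race on that patched slot, which is why the post notes you had to fence between calls to the same static buffer.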

On GameCube the biggest issue was it just had pathetic triangle throughput; the 10M polygons per second (I don't remember the real number) assumes you never clip or light anything.
GameCube was DX7 class hardware for the most part, albeit a more fully featured version than ever shipped in a PC. The GPU just wasn't very fast.
As I said, its real benefit was the memory architecture, and I still feel it was over-engineered.
On the whole it wasn't a bad machine, but I wouldn't have said it was "more powerful than PS2".

But think whatever makes you happy. It's not really relevant to this thread.

What is relevant is that the GC, like the Wii and I suspect the WiiU, is demonstrative of Nintendo's really tight and really, really smart control over hardware costs. It allows them to survive situations that would decimate and destroy less savvy companies (i.e. Sega + Saturn) and when they get it absolutely spot on (SNES, DS, Wii, 3DS) make an absolute killing with cheap and relatively weak (in some or all ways) hardware.
 
I've been out of the loop on everything related to Wii U for quite a while now; anything new on the GPU?
( I think I already asked this, in this thread or the next-gen prediction thread, I don't remember and am too tired to look).

Man if only Wii U's custom GPU was based on RV770 or at least RV740, I'd be really happy.
 
I'm starting to get the impression the Wii U is going to have a Pica Extreme GPU instead of something from ATI.
 
Man if only Wii U's custom GPU was based on RV770 or at least RV740, I'd be really happy.

Maybe you need to aim lower. It will go beyond Xenos in order to support the tablet, but the main screen will stay at 720p for most titles, like on current consoles. The French site that leaked the whole console reported it as a notch above the 360 a year ago, and there is not much reason to expect more.

http://www.computerandvideogames.co...-as-good-as-ps3-but-its-still-not-as-capable/

"Assumptions that Wii U games will look like 'up rezzed' current-gen titles with better textures aren't quite right. They'll look just as good, but not better," one developer told CVG. "You shouldn't expect anything special from a graphics point of view," they added.
 
Maybe you need to aim lower. It will go beyond Xenos in order to support the tablet, but the main screen will stay at 720p for most titles, like on current consoles. The French site that leaked the whole console reported it as a notch above the 360 a year ago, and there is not much reason to expect more.

Yes, and several devs, without trying to hide behind the curtain of anonymity, have said it's anywhere from "more powerful" to "far more powerful" than current gen.
 