Wii U hardware discussion and investigation *rename

Well, on the PS3 the HDD is standard, so it's a decent workaround for seek performance problems. As for RAM usage, virtual textures aren't that new. I believe the technique may not have been embraced earlier because it was a bit pointless within the limitations of a DVD-ROM.

Anyway, assuming the Wii U has more RAM, virtual texturing or not, the system should be able to support higher quality textures here and there, especially as those textures will be available for the PC version.
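The appeal of virtual texturing is easy to show with some back-of-the-envelope numbers. A minimal sketch, where every figure (texture count, page size, cache padding factor) is an illustrative assumption rather than a measurement:

```python
# Illustrative comparison: all textures resident vs. a virtual-texture
# page cache. Every figure here is a made-up assumption for illustration.

BYTES_PER_TEXEL = 0.5          # DXT1-style block compression: 4 bits/texel
PAGE = 128 * 128               # texels per virtual-texture page

def mib(num_bytes):
    return num_bytes / (1024 ** 2)

# Naive: keep every 2048x2048 texture of a level fully resident.
textures = 400
full_res = textures * 2048 * 2048 * BYTES_PER_TEXEL
print(f"all textures resident: {mib(full_res):.0f} MiB")   # 800 MiB

# Virtual texturing: only pages actually sampled stay resident.
# Assume a 1280x720 frame needs roughly one page per screen-sized tile,
# padded 4x for mip levels, borders and prefetch.
visible_pages = (1280 * 720) // PAGE * 4
cache = visible_pages * PAGE * BYTES_PER_TEXEL
print(f"page cache: {mib(cache):.2f} MiB")
```

The point isn't the exact numbers, just that residency scales with what's on screen rather than with the asset library, which is why the technique pairs well with more RAM for the cache plus fast streaming.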

Anyway, the guy's statements are clear, and basically I expect the marketing team to be upset... True or not, "we are doing nothing special" sounds bad. Not the best way to pimp a product, especially as the extra sales, even on the early and limited Wii U user base, might not hurt.

As for wiping, that sounds a bit extreme. I don't expect the Wii U to be super powerful (I actually expect next gen to fall short of enthusiast expectations), but I still expect more RAM than 512MB, and more RAM allows higher quality assets.

I can understand that the guy doesn't want to make a special effort for an unproven platform, especially as N does a pretty good job at making their lives difficult (dev kits and performance goals seem to be a moving target), but I'm still allowed to think that it sounds bad and that he would have done better to keep his mouth closed.
 
It's really disappointing hearing the Wii U is probably just basically the 360.

Even though I of all people expected it.

Quite a lot of my zest for this E3 has been lost now, I was really curious how Wii U games would look. I'm really ready for next gen graphics.
 
It's really disappointing hearing the Wii U is probably just basically the 360.

Even though I of all people expected it.

Quite a lot of my zest for this E3 has been lost now, I was really curious how Wii U games would look. I'm really ready for next gen graphics.

Don't jump to conclusions just yet, specialguy (as the rest of GAF has) - Darksiders isn't putting the extra memory and GPU muscle to use. That's not the fault of the hardware.
 
It's cringeworthy that for $10 more they could probably put in a GPU from 2009 that smokes RSX/Xenos and runs current-gen ports with hugely improved IQ at 60fps.

But they just really love ultra small and silent consoles. Iwata wants it to be invisible :rolleyes:
 
It's cringeworthy that for $10 more they could probably put in a GPU from 2009 that smokes RSX/Xenos and runs current-gen ports with hugely improved IQ at 60fps.

What data are you using to make this statement :?: And what do you mean by hugely improved image quality?
 
It just seems to me Nintendo designs the case before they order the components. Iwata wants it to be the size of two DVD cases or whatever it is this time.

I just can't imagine a reasonably powerful entry designed by AMD costing all that much more to manufacture when they are not pushing any boundaries at all. But when it needs to hit a certain TDP it becomes a problem, because Nintendo does not want to pay laptop-level component prices.

Better IQ should be self-explanatory when comparing a PC game running on a 4850, even with the DirectX overhead, vs. the 360.
 
What data are you using to make this statement :?: And what do you mean by hugely improved image quality?

I remember seeing a table somewhere in the Southern Islands thread (or was it Kepler's?) with component prices for each graphics card.
I can't find it right now but I think I remember noticing how the price difference between a Juniper and a Redwood was actually around $10.
Of course, that is without the pricier PCB, voltage regulation system and GDDR5 memory.
 
Better IQ should be self-explanatory when comparing a PC game running on a 4850, even with the DirectX overhead, vs. the 360.

Which could mean anything... I'm just trying to nail down what GPU you're thinking of in order to make any sense of your wishes.
 
6670 is 3X the flops of Xenos and one would assume dirt cheap.

You can get one at "retail" prices, so that includes a PCB, cooler, and 1 GB of GDDR5, for around ~$70 after rebate, and I've seen the DDR3 versions of this board for about $50. The GPU's cost must be in the $10-20 range by now.

And even though it's only about 3X the flops, I would guess that you can squeeze more performance out of it than Xenos, since it's a relatively more modern architecture.
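The "3X the flops" claim checks out on paper if you use the commonly quoted public specs (Xenos: 48 ALUs, each 4+1 lanes wide, at 500 MHz; the 6670's Turks chip: 480 shaders at 800 MHz), counting 2 FLOPs per lane for a MAD:

```python
# Rough peak-FLOPS comparison using commonly quoted public specs
# (treat the figures as assumptions, not gospel).

def gflops(alus, lanes_per_alu, ghz, flops_per_lane=2):
    # flops_per_lane=2 counts a fused multiply-add as two operations
    return alus * lanes_per_alu * flops_per_lane * ghz

xenos = gflops(48, 5, 0.5)    # 48 ALUs, vec4+scalar, 500 MHz -> 240 GFLOPS
turks = gflops(480, 1, 0.8)   # 480 scalar-counted shaders, 800 MHz -> 768 GFLOPS

print(f"Xenos: {xenos:.0f} GFLOPS, 6670: {turks:.0f} GFLOPS, "
      f"ratio: {turks / xenos:.1f}x")   # ratio: 3.2x
```

Peak-FLOPS ratios of course say nothing about real utilization, which is the poster's point about the more modern architecture.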
 
I think IGN unintentionally revealed it when they said the X720 with a 6670 is 20% more powerful.

Looking at Redwood level power
 
First off, this board is far too educated to presume that developing a console GPU is like shopping at Newegg, or even like pricing current components. If it were an off-the-shelf AMD/ATI part, Genyo Takeda (IR&D) would not have spent 2+ years in the development process on the Wii U's graphics processor. I realize we are attempting to establish a power as well as an architectural baseline, but this will be an amalgamation of processor capabilities that will yield a very custom, proprietary chip, somewhat defying the current DX metric. (to a degree, of course)

What I mean when I say that is this: we cannot assume that because it's based off of, or similar to, GPU architecture "X," it is incapable of "Y," with Y equaling effects such as tessellation, IBL, real-time GI, deferred rendering, etc. There are certain visual aspects, such as lighting, that are very important to Nintendo. I have heard that, much like with the Flipper, Nintendo has carried at least a portion of the same design philosophies into the Wii U chipset: features that "automagically" appear during shader code implementation. A post from my early days on B3D regarding the GC's architecture:

"However, as mentioned above, a couple of features where added in automagically already, like self-shadowing and tinting for example."

"Per-object self-shadowing can be realized quite nicely on the Nintendo Gamecube. The benefit of doing self-shadowing on a per object basis is that one does not need to be concerned so much with precision."

"One should note that during the shader build many features are activated dynamically. For instance, if an object should get tinted a color multiplication is added to the final output color whatever shader was setup before."

"The results of global lighting can be computed in three different ways: per vertex, per pixel using emboss mapping, and per pixel using bump mapping. All three of these methods come in two variants one with self-shadowing and one without."--Florian Sauer & Sigmund Vik http://www.gamasutra.com/features/20.../sauer_pfv.htm

Also, 8 light values came at a very negligible performance cost, because Flipper computed light values in parallel to UV generation. It's these types of "hardwired"-like effects that I believe Nintendo has carried over, pairing modern shader effects with a subset of fixed-function functionality. I'm simply providing examples, as I do not know to what extent overall it is, or can be, incorporated. (esp. with the GPU being of a modern design) I was told that lighting behaved in this manner, and that lighting was a point of emphasis. As always with a secondhand source, you must be cautious not to take it as gospel. (though I trust this source; Nintendo's NDAs are the most binding)
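The "many features are activated dynamically during the shader build" pattern from the Gamasutra quote is easy to sketch. A toy Python illustration, with entirely hypothetical names that mimic the described pattern (a tint is just another multiply appended to whatever stages were already set up), not any real Nintendo or GX API:

```python
# Toy shader-builder: enabling a feature (here, tinting) simply appends
# another color stage to whatever shader was configured before, echoing
# the Gamasutra quote. All names are hypothetical illustrations.

class ShaderBuilder:
    def __init__(self):
        self.stages = []          # each stage maps an RGB tuple to RGB

    def add_texture_stage(self, texel):
        # modulate the input color by a sampled texel
        self.stages.append(lambda c: tuple(a * b for a, b in zip(c, texel)))

    def add_tint_stage(self, tint):
        # "a color multiplication is added to the final output color
        #  whatever shader was setup before"
        self.stages.append(lambda c: tuple(a * b for a, b in zip(c, tint)))

    def run(self, color):
        for stage in self.stages:
            color = stage(color)
        return color

sb = ShaderBuilder()
sb.add_texture_stage((0.5, 0.5, 0.5))
sb.add_tint_stage((1.0, 0.0, 0.0))        # tint red, added dynamically
print(sb.run((1.0, 1.0, 1.0)))            # (0.5, 0.0, 0.0)
```

The hardware analogue would be the TEV chaining fixed-function stages; the point is just that features compose onto an existing shader without the developer rewriting it.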

Nintendo did make certain alterations to their GPU based upon various 3rd party input, a first. Usually, they tend to develop their GPUs and platforms with just ATI/Nintendo engineering, consultation, and guidance, designed around their evolving software strengths and "the natural flow of the industry." -Genyo Takeda. Yes, I am referring to all those benchmark tests Nintendo ran on 3rd party engines for optimization on Wii U hardware.

But make no mistake, Nintendo's footprint is definitely here. You will see a marked performance difference in their proprietary engines, as well as in close 3rd party and exclusive titles. (Ubisoft, Capcom, etc.) Also, ARM may be providing their DSP component solution. The nameless devs that are claiming inferiority to the current generation of consoles are either inept, or working with middleware that is as yet unoptimized for the differing Wii U architecture.
 
Yeah, the developers have to be inept, because Nintendo would never, ever cheap out on hardware (GB/DS line). Ok, ok, but those are their handhelds; Nintendo would neeeever cheap out on their home console! (Wiiiiii!) Ahem. Ok, ok, but every indication is the WiiU is a performance MONSTER. A big, hairy, performance-breathing MONSTER. Oh, wait, general reports don't indicate such? Ok, developers are just inept. Glad we settled that!

Sarcasm aside, I don't think anything is off the table yet. Nintendo has been fairly secretive, and they are still on dev kits with final hardware at retail at least 6 months away. But I don't think rose-tinted glasses should take off the table the prospect that Nintendo, for whatever reasons, has gone with notably cheaper hardware. Not knowing their form factor and power requirements, we should be careful. If they aim for something akin to the Wii there will indeed be some strict power limitations.
 
I'm by no means defending Nintendo's past handheld technical offerings Acert, or even the Wii. But from what I was able to glean from my sources, I just cannot fathom it. Also Acert, the GC was a nice piece of kit. :)
 
Nice certainly (I like the way GC games look), but possibly the weakest of the generation performance wise (bar the 1998 Dreamcast). And it went on to power the hugely successful Wii, which was outclassed by even the Xbox 1. If the WiiU did end up being around PS360 performance - or even weaker in some ways - it wouldn't be a surprise and it wouldn't necessarily be a mistake.

I've got a bit of a thing about the WiiU being a CPU+GPU+EDRAM SoC, and the performance level now being talked about might fit in a ~200 mm^2 ish chip which seems like a reasonable size for a mass produced chip judging by Llano, Trinity and the chip in the 360S.

IBM and AMD now have experience of combining PowerPC and ATI graphics on IBM's manufacturing tech, and IBM are the masters of on-chip EDRAM. I think Nintendo still use IBM to fab the Wii CPU, don't they? I think it could be a really good fit. Looking at Llano and *Trinity, 200~320 shaders is roughly what you might expect to get on a 45nm SoC, and so even that could fit current developer comments.

*Edit: actually, scratch that about Trinity; looks like the GPU cores are bigger:

http://www.fudzilla.com/home/item/26635-trinity-has-fewer-radeon-cores-but-more-efficient
 
It's weird how the expectations have ballooned over the last year. Nintendo was pretty careful to downplay hardware capabilities, to the point where at E3 it wasn't even immediately clear there was new hardware. And a bunch of non-committal, diplomatic, PR comments from devs haven't really given any stronger impression than, "Oh, yeah. It's pretty good, I guess. We think the controller might be cool". But its legend has grown to the point where the faithful all thought it was secretly really powerful and that it contains mysterious components that auto-magically triple its capabilities, and just wait until they activate the I/O processor!

My guess, it's a single chip. 200-300 DX10.1 generation shaders. 3 OoOE PPC cores at ~2.5 Ghz. Better at certain things than either the PS3 or 360 processors, but with lower maximum vector capabilities causing issues. No more than 16 MBs EDRAM as cache for the CPU and 1GB of fairly mundane RAM: DDR3 or maybe GDDR5 on a narrow bus. Manufacture on 45nm, lean on considerable talent of internal studios and call it a day.
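On the "DDR3 or maybe GDDR5 on a narrow bus" part of that guess, the standard bus-width × data-rate arithmetic shows why the memory choice matters as much as the capacity. The Wii U clocks below are hypothetical examples, not leaked specs; only the 360's 128-bit 700 MHz GDDR3 figure is a known quantity:

```python
# Peak bandwidth = (bus width in bits / 8) * data rate in GT/s -> GB/s.
# The Wii U configurations are hypothetical examples, not leaked specs.

def peak_gb_s(bus_bits, gt_per_s):
    return bus_bits / 8 * gt_per_s

x360_gddr3 = peak_gb_s(128, 1.4)   # 360: 128-bit, 700 MHz GDDR3 -> 22.4 GB/s
wuu_ddr3   = peak_gb_s(64, 1.6)    # e.g. 64-bit DDR3-1600      -> 12.8 GB/s
wuu_gddr5  = peak_gb_s(64, 4.0)    # e.g. 64-bit 4 Gbps GDDR5   -> 32.0 GB/s

print(f"360 GDDR3:           {x360_gddr3:.1f} GB/s")
print(f"64-bit DDR3-1600:    {wuu_ddr3:.1f} GB/s")
print(f"64-bit GDDR5 @4Gbps: {wuu_gddr5:.1f} GB/s")
```

A narrow DDR3 bus lands well below the 360's main memory bandwidth, which is exactly the scenario where a decent chunk of EDRAM would be doing the heavy lifting.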
 
I'm pissing myself reading all these more recent 'leaks'. We were talking about 800 shaders around January... it just gets worse and worse.
 
I'm by no means defending Nintendo's past handheld technical offerings Acert, or even the Wii. But from what I was able to glean from my sources, I just cannot fathom it.
Well, everything we're hearing places Wuu either on a par with PS360, or below. Which, if taken at face value, means Nintendo have already done something amazing to come up with such a feeble GPU. We had the same thing with Wii. There was no way Nintendo weren't going to have something around R300 level. It would have been dirt cheap and easy to use and... oh. They had 2 GCs duct-taped together. Now that could be excused, by the very tolerant, as necessary for BC, but in reality their choices were real headscratchers. All it really proves is that our sort of logic, which picks an economically viable technology of decent, contemporary quality, can be ignored by engineers who go with choices we never ever would, and the rumours are all pointing that way. It would be hard for Nintendo to come up with a design that wasn't a good bit stronger than PS360, yet they may well do it. ;)
 
Well, everything we're hearing places Wuu either on a par with PS360, or below. (...)

Only if by "everything" you mean gamesindustry's latest article and a single GAF post..
Most rumours are pointing to better, not "below" or "on par".
How much better should be the main topic of discussion, actually.
 