Wii U hardware discussion and investigation *rename

Fixed function GI :?:

I don't know what you guys (Li Mu Bai etc) are on about, but what relevance is there supposed to be now? We've moved away from fixed function/"hardwired shaders"/HW T&L for good reason.
IF some common task like GI could be run on optimal fixed-function shaders at considerable performance advantage, it'd make sense. Pretty much every game style can benefit from good GI approximation. Otherwise, if it can't be improved upon versus the performance of programmable shaders, it'd just be a pointless reduction of versatility on the GPU.
 
Fixed function GI :?:

I don't know what you guys (Li Mu Bai, etc) are on about, but what relevance is there supposed to be now? We've moved away from fixed function/"hardwired shaders"/HW T&L for good reason.
Yeah. One of the main reasons is to do stuff on the GPU that has nothing to do with games. I doubt Nintendo cares. Nor should they. I don't think it would be "fixed function" the way Flipper was fixed function, though.
 
Fixed function GI :?:

I don't know what you guys (Li Mu Bai, bgassassin, etc) are on about, but what relevance is there supposed to be now? We've moved away from fixed function/"hardwired shaders"/HW T&L for good reason.

It's taking into consideration that Nintendo doesn't think in "normal" GPU methodology. But it is supposedly true that Nintendo has some of their own "touches" in the GPU design. I heard it back in December, and now Li is saying something similar, though what I heard was more vague, leaving me to guess what these touches might be. I'm not up to date enough to properly propose what those features might be with a benefit over modern shaders, but my idea of one of those things being lighting was an attempt to think in a Nintendo-like manner. And I don't put it past them to do something like that based on things we've heard (though I think it's mostly due to the CPU and porting more than anything else).

Like I said in the GAF post, that's what I see if the GPU is not a traditional GPU with more shaders.

Shifty gave an example of what I'm thinking, but unable to explain.
 
IF some common task like GI could be run on optimal fixed-function shaders at considerable performance advantage, it'd make sense. Pretty much every game style can benefit from good GI approximation. Otherwise, if it can't be improved upon versus the performance of programmable shaders, it'd just be a pointless reduction of versatility on the GPU.

Well, I mean there are lots of components that are still carried out via fixed units, but it boils down to math and texture ops. :s The thing is that the "GI approximation" label covers a lot of things, e.g. SSAO, SSDO, HBAO or other non-screen-space alternatives for AO that deal with the shadowing; then there are the LPVs/RSMs, cubemaps, SH lighting coefficients, wavelets, etc. that deal with the light bouncing and whatnot. So I'm not sure what would be particularly common to all of them except for the fundamentals used in constructing the shaders/textures/volumes.

If we look at shadow filtering, one might build off of, say, the hardware 2x2 bilinear PCF (per fetch) for shadow maps and just increase the sampling (brute force), or go in a completely different/better/more efficient direction for shadow filtering (being able to utilize the fixed MSAA hardware à la ESMs, for instance). i.e. there's no point implementing 32x32 PCF in hardware (expensive). :p
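To make that cost difference concrete, here's a minimal sketch of the two approaches (plain NumPy on the CPU, not shader code, and all numbers purely illustrative): one hardware-style 2x2 bilinear PCF fetch versus a brute-force NxN kernel done tap by tap.

[code]
import numpy as np

def pcf_2x2_bilinear(shadow_map, u, v, receiver_depth):
    """One hardware-style PCF fetch: 4 depth compares on the 2x2 footprint,
    then a bilinear blend of the binary results."""
    x0, y0 = int(u), int(v)
    fx, fy = u - x0, v - y0
    s = (shadow_map[y0:y0 + 2, x0:x0 + 2] > receiver_depth).astype(float)
    top = s[0, 0] * (1 - fx) + s[0, 1] * fx
    bot = s[1, 0] * (1 - fx) + s[1, 1] * fx
    return top * (1 - fy) + bot * fy

def pcf_nxn_bruteforce(shadow_map, u, v, receiver_depth, n=8):
    """Brute-force NxN kernel: n*n separate fetch-and-compare taps, averaged."""
    x0, y0 = int(u) - n // 2, int(v) - n // 2
    taps = (shadow_map[y0:y0 + n, x0:x0 + n] > receiver_depth).astype(float)
    return taps.mean()

rng = np.random.default_rng(0)
sm = rng.random((64, 64))                         # fake shadow-map depths
print(pcf_2x2_bilinear(sm, 31.4, 20.7, 0.5))      # equivalent GPU cost: 1 fetch, 4 compares
print(pcf_nxn_bruteforce(sm, 31, 21, 0.5, n=8))   # equivalent GPU cost: 64 fetches/compares
[/code]

The hardware gives you the first variant essentially for free per fetch, while the second scales linearly with kernel size, which is exactly why nobody wants a 32x32 PCF unit baked into silicon.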

I think if we are to look at "common GI" we may as well be considering ray tracing!
 
Well, I mean there are lots of components that are still carried out via fixed units, but it boils down to math and texture ops. :s The thing is that the "GI approximation" label covers a lot of things, e.g. SSAO, SSDO, HBAO or other non-screen-space alternatives for AO that deal with the shadowing; then there are the LPVs/RSMs, cubemaps, SH lighting coefficients, wavelets, etc. that deal with the light bouncing and whatnot. So I'm not sure what would be particularly common to all of them except for the fundamentals used in constructing the shaders/textures/volumes.
I wouldn't know either, but in Nintendo's case I expect it'd be 'their way' implemented in hardware for their first-party needs, and devs would just have that to work with.
 
BGASSASSIN: what was your original prediction from your 'sources' of Wii U hardware power? Certainly the sheepish comments coming from Ninty don't exactly fill its fans with confidence. I'm really trying to think how they could design a GPU, some 7 years later, that's... best case 'on par' with an Xbox 360. I mean, Ninty didn't even try to drum up the excitement. Seriously, if it was more powerful in any way he would have said so, knowing full well that if he did it would produce more sales. Knowing that, he didn't, which speaks for itself really.

Still, I can't even fathom how they could make it worse... even last year's Llano was more powerful. Logic tells me it's IMPOSSIBLE for it not to be better, it really isn't... the iPad 4 will smoke the Wii U, and that's embarrassing.

A quick look on AMD's website at the 4670: 320 ALUs, 1GB RAM... that alone would smoke a 360 in a console form factor. I mean, it HAS to be better; AMD doesn't make GPUs that CAN be worse. I bet it's all some kind of trick... baffling.

AMD HD 4670:
- 514M transistors
- DX10.1
- Shader Model 4.1
- 320 shaders/ALUs
- 8 ROPs
- 32 TMUs
- 750MHz
- 128-bit bus with 1GB DDR3 @ 32GB/s
- Retailed at ~£55, Sept '08

That compares with a comparatively anaemic Xenos, with half the TMUs, 2/3 the shaders, 3/4 the frequency, DX9.0c/Shader Model 3.0, and half the RAM with 3/4 the bandwidth, and that came out in 2005. Seriously, they CAN'T make a worse GPU in 2012 if they tried.
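For what it's worth, the back-of-envelope peak-ALU arithmetic behind that comparison looks like this (my own numbers using the commonly quoted specs; theoretical peaks only, which say nothing about real-world efficiency):

[code]
def gflops(alus, flops_per_alu_per_clock, mhz):
    return alus * flops_per_alu_per_clock * mhz / 1000.0

# Xenos: 48 vec4+scalar units, each good for 5 MADs (10 FLOPs) per clock at 500MHz
xenos = gflops(48, 10, 500)      # ~240 GFLOPS

# HD 4670 (RV730): 320 ALUs, 1 MAD (2 FLOPs) each per clock at 750MHz
hd4670 = gflops(320, 2, 750)     # ~480 GFLOPS

print(f"Xenos:   {xenos:.0f} GFLOPS")
print(f"HD 4670: {hd4670:.0f} GFLOPS ({hd4670 / xenos:.1f}x, on paper)")
[/code]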
 
Al, that is exactly what I was going to bring up (just with more words, less clarity, and 110% less dream wrecking). Just browsing through a developer's SIGGRAPH material on how they did their "GI" shows a host of techniques needed. I have no clue how you would even design a fixed-function addition for all of those techniques, as many are quite divergent in not only their task but in what they are doing. It is the rise of programmable shaders that even allowed these "hacks" to come about. And the other problem you note is that even if you could accelerate a set standard of GI hacks, you are effectively stuck with it, and all the drawbacks of such. Kind of like the ol' DX7 and DX8 days where GPUs saw a rise in nice bells and whistles, but those bells and whistles were pretty much the same in every game.

Of course, if they mean a real "unified GI" approximation, seeing as such an approach hasn't been effectively programmed on GPUs (or CPUs) at anything remotely acceptable speeds (we are talking this stuff is slooooow even on NV 580 SLI setups with the most simple scenes), there is no way it could be a throwaway add-on to a GPU.

Ok, now I will cease crushing Wii U fans' dreams. See Al, you don't need to go godZILLA on everyone's hopes. Maybe you should just change your name to, urhmmm, I dunno, AlZilla? :p
 
Still, I can't even fathom how they could make it worse... even last year's Llano was more powerful. Logic tells me it's IMPOSSIBLE for it not to be better, it really isn't... the iPad 4 will smoke the Wii U, and that's embarrassing.

How much bigger do you think the iPad chips are going to get? They are not going to get magically faster anytime soon in terms of frequency (power sucking), and new processes are further apart and not giving the huge frequency bumps at the same power we used to see. Seeing as the current iPad is shy of 30GFLOPs and trying to power a high resolution screen, I see no way the iPad4 will be even half the current consoles. Even if you consider a transition to 28nm (+100%), a strong shift of the new area budget toward the GPU, where the GPU swallows more of the new real estate like Intel and AMD APUs do (+50%), and a 20% jump on base clocks, generously assuming that they can keep the same die size and the same power while increasing clocks (most likely not going to happen), you end up at less than 4x the iPad 3, which is about half as fast as a current console in raw shader computation power. There is no way an iPad 4 is going to smoke a Wii U.
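The compounding in that estimate, spelled out (the individual factors are just the generous assumptions named above, nothing more):

[code]
ipad3_gflops   = 30      # "shy of 30 GFLOPS" per the estimate above
process_node   = 2.0     # 28nm transition: +100%
gpu_area_share = 1.5     # GPU swallowing more of the die: +50%
clock_bump     = 1.2     # base clock: +20%

scale = process_node * gpu_area_share * clock_bump
print(scale)                    # 3.6, i.e. "less than 4x"
print(ipad3_gflops * scale)     # ~108 GFLOPS, roughly half of a current console's ~240
[/code]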
 
you end up at less than 4x the iPad 3, which is about half as fast as a current console in raw shader computation power. There is no way an iPad 4 is going to smoke a Wii U.
Ha! You speak like that would be some kind of achievement... I was saying it in jest, but seriously, it won't be a million miles off...

CPU-wise it will most likely be equal. If the A6 follows the same design principles as the A5X, then we could well see four Rogue cores slotted in instead of the SGX543s; that alone would be a quantum leap in performance. 2GB of LPDDR3 RAM with quad-channel memory, making 25.6GB/s of bandwidth (not forgetting it's a TBDR). Maybe iOS 6 activates OpenCL / OpenGL 4.x. Honestly, that is certainly feasible, and if that was in a console, I would bet it beats all current (Wii U) consoles hands down.
You disagree with that??

Anyway, it's a moot point, because I don't think it's within the rules of physics for the Wii U GPU to be anything less than 50% more powerful... impossible.
Seeing as the current iPad is shy of 30GFLOPs
Well, it's 32 GFLOPS...
http://www.anandtech.com/show/5688/apple-ipad-2012-review/12
Rogue should be many, many times that... Xenos only produces 240 GFLOPS...
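Working through a couple of the numbers in that post (note the LPDDR3-1600, 32-bit-channel spec is my assumption to make the quoted bandwidth come out; the post only gives the total):

[code]
# Quad-channel LPDDR3 bandwidth, assuming four 32-bit channels at 1600 MT/s
channels, bus_bits, mt_per_s = 4, 32, 1600
bandwidth_gb_s = channels * bus_bits * mt_per_s / 8 / 1000
print(bandwidth_gb_s)                    # 25.6 GB/s

# Raw ALU gap between the two figures cited above
xenos_gflops, sgx543mp4_gflops = 240, 32
print(xenos_gflops / sgx543mp4_gflops)   # 7.5x -- the gap "many, many times" has to cover
[/code]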
 
frenchtoast:

Here's the thing: what concerns me most about predicting the hardware power here is the ultimate TDP and cooling solution, and until IBM comes out and says otherwise, I'm strongly in the 45nm camp for launch (32nm fab capacity doesn't appear to be in abundance either, so that's not a good sign). Also, until TSMC shows an ungodly ability to expand their 28nm operations, 40nm is what the GPU will be for the near future (besides, it's simply going to be a lot cheaper for Nintendo).

As an aside, if IBM is having a hand in the GPU fabrication, then 45nm SOI it is (ala 360S design). Either way, we're not looking at 28nm/32nm, so comparing to Llano isn't going to be relevant.

We know that the small case is going to limit the cooling ability no matter what, plus I would be quite disappointed if Nintendo opted for an over 9000RPM turbine just to satiate any supposed beast underneath...

They're probably not looking at nearly the same TDP as the 360S SoP, which is probably in the ballpark of 70W (after subtracting the drives and RAM chips from the load power consumption of the console unit). Even if the Wii U could dissipate 70W of heat (it'd be quite noisy), you'd have to divvy that up between the CPU and GPU. So then you'd probably be looking at a 40nm GPU from AMD that is in the range of 40-50W TDP.*

You can see that the 5570 series kind of fits (with some headroom for clock tweaks around 650MHz) within that TDP range. That's pretty much where the 400ALU idea came from. Mind you, it's already in the 600M+ transistor area.

*I think a lot of folks misunderstood the GI/DF article when it compared case size as a measure of computing/graphical power. It just gives a rough idea of the heat dissipation requirements (whilst keeping noise in mind), and then you can begin to look at what's currently available on 40nm that could fit somewhat closely. Ultimately, it is just a rough guess since there are customizations to the architecture that will give different perf/watt, plus there's the supposed eDRAM.... I mean, you'd probably have different ALU/ROP/TMU counts and clocks there.
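Spelling that budget split out (only the ~70W figure comes from the estimate above; the CPU share is an assumed placeholder purely for illustration):

[code]
soc_budget_w = 70    # CPU + GPU dissipation after removing drives/RAM, per the estimate above
cpu_share_w  = 25    # assumed CPU slice of the budget (placeholder)
gpu_budget_w = soc_budget_w - cpu_share_w
print(gpu_budget_w)  # ~45W, i.e. the 40-50W class of 40nm AMD parts such as the HD 5570
[/code]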


Ok, now I will cease crushing Wii U fans' dreams. See Al, you don't need to go godZILLA on everyone's hopes. Maybe you should just change your name to, urhmmm, I dunno, AlZilla? :p

Well, I have my STRONG concerns. I apologize for sounding [strike]harsh[/strike]STRONG. :p rwar
 
Al: impossible... I really can't get my head around it. 7 years, a custom-designed chip. Hell, even with the SAME shaders, ROPs/TMUs and clock speed, just an increase in RAM and the efficiency from going with newer parts (Shader Model 5.0 vs 3.0, DX11-class vs DX9.0c)... I tell you it's impossible. We are getting done up like a kipper! ;)
 
Well, you'd be discounting the advances since 90nm that allowed for such a small case to be used. :p

I mean, if you look at Gamecube to Wii, you got... 1.5x the clocks for both chips AND a smaller form factor (just over half the size), and that's after two full node reductions (180nm -> 130nm -> 90nm).
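Very roughly, the idealized scaling math behind that (textbook ~0.7x linear shrink per full node, not actual die measurements):

[code]
linear_shrink_per_node = 0.7       # classic full-node linear scaling (idealized)
nodes = 2                          # 180nm -> 130nm -> 90nm

area_factor = (linear_shrink_per_node ** 2) ** nodes
print(f"~{area_factor:.2f}x the area for the same design")   # ~0.24x, roughly a quarter
# ...which is the kind of headroom that let the Wii run ~1.5x GameCube clocks
# in a much smaller, quieter box.
[/code]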
 
Well, you'd be discounting the advances since 90nm that allowed for such a small case to be used. :p

I mean, if you look at Gamecube to Wii, you got... 1.5x the clocks for both chips AND a smaller form factor (just over half the size), and that's after two full node reductions (180nm -> 130nm -> 90nm).
No, you misread me. I agree; what I mean is it's impossible for the Wii U not to be more powerful than a 360. Just take that 5570; it's on another playing field altogether... ;)

What's worrying are the sounds coming from Ninty...
 
No, you misread me. I agree; what I mean is it's impossible for the Wii U not to be more powerful than a 360. Just take that 5570; it's on another playing field altogether... ;)

What's worrying are the sounds coming from Ninty...

Some things to consider though:

- New architecture: Unlike the GameCube -> Wii transition, the Wii U has a brand new architecture, and from several sources it appears to be a bit different from the 360's. This is pre-first-gen for the system, so there will be issues with unoptimized middleware and dev kits that are still going through revisions. While we don't know exactly how powerful the system is, it is not likely significantly more powerful than the current-gen consoles due to several factors. Game engines that are still getting optimized for the Wii U will likely perform below expectations.

- The Wii U controller screen: Depending on what the developers are doing with the other screen(s), it can potentially require a significant amount of resources.
 
Hey, you guys don't understand: Nintendo only wants to top the new iPad and the upcoming new iPad 2... :LOL:

I would still be amazed if they manage to undershoot the PS360; that would be a technological "tour de force" imho.
 
Hey, you guys don't understand: Nintendo only wants to top the new iPad and the upcoming new iPad 2... :LOL:

I would still be amazed if they manage to undershoot the PS360; that would be a technological "tour de force" imho.

Ha ha, you could grab the world's top super brains, give them a budget of, say, £500 million and 7 years, with a challenge to bring out a WORSE console than the Xbox 360 in 2012... and they would FAIL :LOL:

Ninty, on the other hand, are dab hands at pulling poor components out of the bag! :p Check out the 3DS's processing power... lol. Genius how they manage to sell that crap.
 
Surely a shader library or software API distributed in the SDK is cheaper in every way and makes much more sense than putting an additional fixed-function pipeline into your hardware in 2012.
 