Wii U hardware discussion and investigation *rename

Not if they were trying to hit a certain TDP.

Why is TDP relevant for a Dev Kit? Development hardware has never needed to be the same size as the final hardware and, historically, has usually been significantly larger. All that matters is recreating the target performance characteristics as closely as possible. Size, heat output, and looks don't matter. When the final silicon is ready, that's when you put it in a cute little box. Until then it might as well be exposed circuit boards sprawled across a desk.
 
Why is TDP relevant for a Dev Kit? Development hardware has never needed to be the same size as the final hardware and, historically, has usually been significantly larger. All that matters is recreating the target performance characteristics as closely as possible. Size, heat output, and looks don't matter. When the final silicon is ready, that's when you put it in a cute little box. Until then it might as well be exposed circuit boards sprawled across a desk.

Because, as lherre said, the kit was freezing every time they tried to push it and the GPU was to blame. And the early dev kit casing wasn't much bigger than what they showed publicly.
 
Because, as lherre said, the kit was freezing every time they tried to push it and the GPU was to blame. And the early dev kit casing wasn't much bigger than what they showed publicly.

So we just have evidence of Nintendo's incompetence? Like was said before, there's no conceivable level of GPU performance the WiiU is targeting that could not have been closely approximated by off the shelf components. Failure to adequately cool that hardware in development systems is just Nintendo being stupid.
 
So we just have evidence of Nintendo's incompetence? Like was said before, there's no conceivable level of GPU performance the WiiU is targeting that could not have been closely approximated by off the shelf components. Failure to adequately cool that hardware in development systems is just Nintendo being stupid.

It sounds like you were looking for a reason to bash them. I can't defend why they did it. I was just saying what happened.

After all, I guess MS is incompetent for going with the 10MB of eDRAM that, according to some, was as much or more of a detriment than a benefit, and for the RROD. And of course Sony for building an $800 console. Seems like all the console makers are good for doing something stupid.
 
He actually claimed the GPU itself is weaker than the 360's GPU; if he meant it seemed weaker because of the controller's display, then why not just say that? He's also made claims about the dev kit having 1GB of RAM. Lherre, who was also verified and seemed very knowledgeable, said the development kits had a range of memory which started at 2GB and went upwards.

I don't understand why the RAM thing confuses people. For the millionth time, lherre's RAM info was in the context of Wii U dev kits having 2X the RAM of retail (and he said as much in the post, since people were questioning him about the RAM). Therefore lherre's "2GB+" info does not contradict 1GB of RAM in the Wii U. The only thing it offers is that the "+" means Nintendo could conceivably end up at 1.5GB or something like that. Basically, a "2GB+" dev kit means "1GB+" in the final.

Now, with Arkam specifically stating 1GB, that's my guess. That's already a sufficient step up over the competitors if the rest of the system is in the PS360 ballpark.

Good luck on that. I sat around for 6 months waiting to get approved, only to be told I didn't meet the membership requirements. I even used my work email.

I don't know what membership requirements they could possibly have and apply individually to thousands of applicants. My assumption would be a fairly binary "this isn't a free webmail address, approve" type of thing. The 6 months is normal though, and to be expected going in.
 
It sounds like you were looking for a reason to bash them. I can't defend why they did it. I was just saying what happened.

No, you were trying to use TDP to justify your pet theories about the WiiU hardware, and I just told you why that has zero meaning in a dev kit.
 
So we just have evidence of Nintendo's incompetence? Like was said before, there's no conceivable level of GPU performance the WiiU is targeting that could not have been closely approximated by off the shelf components. Failure to adequately cool that hardware in development systems is just Nintendo being stupid.

That's better than the level of incompetence needed to launch a machine to consumers that was not adequately cooled.

Just sayin'
 
Ok. If they were targeting a GPU on par with an RV730 or lower for the final, then why not just use that?

Difference in features maybe?

Have to say that I agree with Brad Grenz on this: trying to match final system power/heat in a dev kit doesn't seem like a reasonable thing to do. There's just no point.
 
Ok. If they were targeting a GPU on par with an RV730 or lower for the final, then why not just use that?

For the same reason that has already been suggested: a 256bit memory bus gets you closer to the memory bandwidth the embedded memory design is supposed to afford.

When they chose a part for the dev kit they had to consider multiple aspects: shader functionality, memory bandwidth and theoretical performance. You can't look at one and ignore the others. If the WiiU GPU has ~400 of AMD's DX10.1-level ALUs attached to a large amount of embedded memory, the 4830 at some arbitrary underclock is the best all-around approximation of that design possible with off-the-shelf components. An RV730 has too little memory bandwidth, and newer low-end DX11 GPUs have a different feature set. And since we have very good reason to believe in the R700-derived shaders and embedded memory parts of that equation, it would be foolish to draw too many conclusions from the 4830's ALU count.
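As a quick sanity check on the bandwidth side, here is a back-of-envelope sketch using the usual reference board clocks (an assumption; the actual dev kit card could be clocked differently):

Code:
# Peak memory bandwidth = bus width in bytes x effective transfer rate.
# Reference-design clocks are assumed here; the dev kit part could differ.
def bandwidth_gb_s(bus_width_bits, effective_mt_s):
    return bus_width_bits / 8.0 * effective_mt_s / 1000.0  # GB/s

print("HD 4830 (RV770 LE), 256-bit GDDR3 @ 1800 MT/s:", bandwidth_gb_s(256, 1800), "GB/s")  # ~57.6
print("HD 4670 (RV730),    128-bit GDDR3 @ 2000 MT/s:", bandwidth_gb_s(128, 2000), "GB/s")  # ~32.0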
 
Entropy, I would posit this to chew on: we have already seen multiplatform titles where "Console A" is locked at 30fps while "Console B" averages 28fps (with drops/tearing), along with reduced transparency resolution, reduced texture resolution, disabled graphical features, etc., and sometimes even a reduced render resolution -- and the normal consumer really cannot see the difference. And while I am not a developer, just a longtime PC gamer, I would say that in the above scenario "Console A" and "Console B" are separated by more than the ~7% performance difference the app shows: if Console B ran with all the same settings, its framerate would drop much further, and unlocking the 30Hz cap on the Console A version would open up extra headroom too.
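(Just the arithmetic behind that ~7% figure; nothing here is measured data:)

Code:
# Frame rate gap vs frame time for the hypothetical Console A / Console B case.
fps_a, fps_b = 30.0, 28.0
print("fps gap     : %.1f%%" % ((fps_a - fps_b) / fps_a * 100))  # ~6.7%
print("frame time A: %.1f ms" % (1000 / fps_a))                  # ~33.3 ms
print("frame time B: %.1f ms" % (1000 / fps_b))                  # ~35.7 ms
# And B is rendering with reduced settings on top of that, so the real gap
# between the two machines is larger than this ~7%.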

All that to say that if the WiiU is *very* close to the 360, that is not a big deal. And it sounds like the door is open for the final dev kits and the retail version to be slightly better than the current builds, so something that is the same to 2x as fast should not be a huge issue on the graphics side for 360 ports. Even if, big if, the retail version was only 95% as fast on the GPU, I am sure it would be easy to make that gap disappear, and the disabled features would not be noticed by many. I am clueless, but I would bet the WiiU ends up being slightly faster.

This generation has seen the PS3 and 360 get the same games with the relatively minor differences you point out. But then again, they were released with roughly the same transistor and power budgets, and were designed at roughly the same time, even if the PS3 was held up by blue laser availability/yields/price.

The Wii U, coming to the market 6-7 years later, can avoid those porting issues easily, if it is a priority at all. Looking at the processing density of AMD's later products, matching the 360 in terms of shading power is in the ballpark of 20 mm^2, or (ballpark again) $2 added to the bill of materials. Put another way, if you are at the 360's level of graphics calculations, you can double it for $2. So unless Nintendo is extremely cautious in terms of cost, I think it can be safely assumed that the GPU will have some headroom vs. the PS360s.
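For what it's worth, the kind of back-of-envelope arithmetic behind that $2 figure looks like this (the wafer cost, edge loss and yield numbers are my own round assumptions, nothing official):

Code:
import math

wafer_cost_usd = 5000.0  # assumed cost of a processed 300 mm wafer
wafer_diameter = 300.0   # mm
edge_loss      = 0.9     # assume ~10% of the area lost to edge/scribe
yield_factor   = 0.8     # assume 80% of candidate dies are good

usable_mm2 = math.pi * (wafer_diameter / 2) ** 2 * edge_loss
cost_per_good_mm2 = wafer_cost_usd / (usable_mm2 * yield_factor)

print("cost per good mm^2             : $%.3f" % cost_per_good_mm2)            # ~$0.10
print("20 mm^2 of extra GPU logic adds: $%.2f" % (20 * cost_per_good_mm2))     # ~$2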

Again, as an exercise, you could turn the argument around, and ask yourself just what it would mean if Nintendo did in fact produce a console that had a weaker GPU than the PS360. It would mean that
a: Nintendo isn't interested in ports from the previous generation, much less future multi platform titles.
This is conceivable - after all, this is exactly what they did with the Wii, and now with the 3DS.
b: Nintendo is only paying lip service to being an HD console.
The PS360 has issues even performing at 720p, a resolution which by now is basically gone from dealer shelves in other consumer products. Targeting anything but 1080p output seems bizarre for a new console, but if it has lower performance than the previous generation, the WiiU will have difficulty doing even the half-resolution format (the quick pixel arithmetic below spells that out). Of course they could simply scale SD output to HD resolution, but the TV sets already do this, so why commission IBM and AMD to design new CPUs and GPUs for the WiiU at all?
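(Pixel counts, just to spell out the "half-resolution" remark:)

Code:
# 720p vs 1080p pixel counts.
p720, p1080 = 1280 * 720, 1920 * 1080
print(p720, "vs", p1080, "pixels; 720p is %.0f%% of 1080p" % (100.0 * p720 / p1080))  # ~44%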


Nintendo certainly knows the importance of hitting the right pricing spot, and the 3DS definitely drove the point home. But even so, I can't see them coming in below the previous generation in GPU performance. It just doesn't make sense; cost cutting that deeply simply isn't worthwhile. And what people saw in the "Garden" demo doesn't really imply sub-par performance either. In the absence of actual specs/target specs being leaked, Occam's razor is probably just as useful as any vague opinion offered by insiders.

What does "arbitrary number" more powerful than the 360 mean anyway? In terms of what? Number of CPUs? CPU clocks? CPU marketing FLOPS? CPU internal benchmarking? L2 cache size? L2 cache bandwidth? GPU shader power? GPU shader FLOPS? GPU bandwidth? GPU EDRAM size? GPU internal benchmarking performance? Main memory size? People dishing out such an arbitrary number either don't know what they are talking about, or they don't want the recipients to know what they are talking about. It's completely useless.
 
For the same reason that has already been suggested: a 256bit memory bus gets you closer to the memory bandwidth the embedded memory design is supposed to afford.

When they chose a part for the dev kit they had to consider multiple aspects: shader functionality, memory bandwidth and theoretical performance. You can't look at one and ignore the others. If the WiiU GPU has ~400 of AMD's DX10.1-level ALUs attached to a large amount of embedded memory, the 4830 at some arbitrary underclock is the best all-around approximation of that design possible with off-the-shelf components. An RV730 has too little memory bandwidth, and newer low-end DX11 GPUs have a different feature set. And since we have very good reason to believe in the R700-derived shaders and embedded memory parts of that equation, it would be foolish to draw too many conclusions from the 4830's ALU count.


But that makes no sense either. The GDDR3's 57GB/s isn't going to simulate anything near the eDRAM. They would have been better off with something like the 4860 if that were the case. They most likely used that bus to increase the memory amount of the dev kit rather than to simulate the eDRAM. At the same time, if their intent was ~400 ALUs, then there would be no reason for anyone to let it be known it was underclocked. I also don't buy the different feature set point, since it's going to be customized anyway. Something newer most likely wasn't chosen due to the smaller buses as well. The only later card that was close to an RV770 in that respect is Barts LE, and that didn't come out until after they were already sending out dev kits.
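For context, the usual publicly quoted Xenos figures show how far off it is (a sketch using the familiar spec-sheet numbers, not measurements):

Code:
# Why 57.6 GB/s of GDDR3 doesn't stand in for the 360's eDRAM.
gddr3_4830       = 57.6   # GB/s, HD 4830 at reference clocks
xenos_edram_int  = 256.0  # GB/s, commonly quoted eDRAM <-> ROP internal bandwidth
xenos_edram_link = 32.0   # GB/s, GPU -> eDRAM daughter-die link
print("eDRAM internal bandwidth is %.1fx the 4830's GDDR3" % (xenos_edram_int / gddr3_4830))  # ~4.4x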
 
The Wii U, coming to the market 6-7 years later, can avoid those porting issues easily, if it is a priority at all. Looking at the processing density of AMD's later products, matching the 360 in terms of shading power is in the ballpark of 20 mm^2, or (ballpark again) $2 added to the bill of materials. Put another way, if you are at the 360's level of graphics calculations, you can double it for $2.

How do you arrive at the $2 figure, if you don't mind me asking? That seems extremely low.

Edit: looking at the 360S Hot Chips presentation it's pretty clear that the 360S GPU is taking up vastly more than 20mm^2, and would take up rather more even with a 32nm shrink.
 
The Wii U, coming to the market 6-7 years later, can avoid those porting issues easily, if it is a priority at all. Looking at the processing density of AMD's later products, matching the 360 in terms of shading power is in the ballpark of 20 mm^2, or (ballpark again) $2 added to the bill of materials. Put another way, if you are at the 360's level of graphics calculations, you can double it for $2. So unless Nintendo is extremely cautious in terms of cost, I think it can be safely assumed that the GPU will have some headroom vs. the PS360s.

That is not true. You can't just take Tahiti and divide its die size by 15.
Llano's GPU die area is around 100 mm^2 and it's just a bit faster than Xenos.
 
Llano's GPU die area is around 100 mm^2 and it's just a bit faster than Xenos.
A Llano-level GPU, aka Redwood, is a mere 104mm^2 at 40nm, so it would be quite a bit smaller than 100mm^2 at 32nm, and it is leaps and bounds faster than Xenos (on the theoretical front, over double the GFLOPS for example, and IIRC every single other theoretical value (Gvtx/s, Gpix/s, etc.) is higher, too).
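The GFLOPS side of that is simple arithmetic (reference clocks, counting a MAD as two flops; all figures are the usual spec-sheet values):

Code:
# Peak programmable-shader throughput, counting a MAD as 2 flops.
def gflops(alu_lanes, clock_mhz):
    return alu_lanes * 2 * clock_mhz / 1000.0

print("Xenos   (48 x vec4+scalar ALUs @ 500 MHz):", gflops(48 * 5, 500), "GFLOPS")  # 240
print("Redwood (400 SPs @ 650 MHz, HD 5570)     :", gflops(400, 650), "GFLOPS")     # 520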
 
A Llano-level GPU, aka Redwood, is a mere 104mm^2 at 40nm, so it would be quite a bit smaller than 100mm^2 at 32nm, and it is leaps and bounds faster than Xenos (on the theoretical front, over double the GFLOPS for example, and IIRC every single other theoretical value (Gvtx/s, Gpix/s, etc.) is higher, too).

But actually on a 32nm process, on Llano, it doesn't appear to be. Redwood as seen in the 5570 would probably need conservative clocks to be cool enough for the WiiU - if it dropped below 500 MHz it might lose out on fill rate and triangle setup. Going by Llano it might be too big for a 45nm SoC - Llano would likely be very big at 45nm even if you dropped a couple of the CPU cores.
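Putting rough numbers on the clock concern (both chips have 8 ROPs and a 1 triangle/clock setup rate, so it scales straight with clock; the 450 MHz figure is just an example, not a leaked spec):

Code:
# Fill rate and triangle setup scale directly with core clock.
def gpix_per_s(rops, clock_mhz): return rops * clock_mhz / 1000.0
def mtri_per_s(clock_mhz):       return clock_mhz  # 1 triangle per clock

print("Xenos   @ 500 MHz: %.1f Gpix/s, %d Mtri/s" % (gpix_per_s(8, 500), mtri_per_s(500)))
print("Redwood @ 450 MHz: %.1f Gpix/s, %d Mtri/s" % (gpix_per_s(8, 450), mtri_per_s(450)))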

There's nothing yet that confirms the final GPU is even at Llano level, although it would be nice if it was (or better).
 
But that makes no sense either. The GDDR3's 57GB/s isn't going to simulate anything near the eDRAM.

Obviously not, but it would come closer than a card with half that much memory bandwidth.

But like I said, you keep trying to mold the evidence to fit your preferred outcome, and that's why I find your predictions unreliable.
 
Obviously not, but it would come closer than a card with half that much memory bandwidth.

But like I said, you keep trying to mold the evidence to fit your preferred outcome, and that's why I find your predictions unreliable.

My preferred outcome? My preferred outcome would be more than what Nintendo seems to be going for. :LOL:

Since you assumed incorrectly, you apparently just lost your reason to say why my prediction is unreliable.
 
Hello everyone.

Seems like an interesting discussion going on here. I just registered and will be back once I finish reading all the long pages since the "final dev kits much better" post that actually got me here, but I did read some 4-5 pages today.

Anyway, I was researching for myself, mostly just commenting on other forums but not really engaging in this kind of discussion. What I can say now is pretty much in line with what bassasin and others are saying (the RV770 stuff). The dev guy who fled when asked for credentials is really no surprise, but the negative info is somehow weird and odd, and if it's true I would also be very disappointed in Nintendo for running a false campaign and then ending up with no leap over the 360. But that's just not what the info is saying, and the same goes for the Rangers guy. I'm on holidays and spent like 3-4 days surfing only Wii U info, and I swear I never saw any credible RV730 claim, maybe in comments in passing.

Also, there's a lot more to it and I might have more info, but it's night now so I'll be back later. Sorry for the grammar, I'm typing from a mobile device for once.

And no, I'm not a dev, before we go into any of this. I'm just an enthusiast into tech, I like to be as accurate as possible, and I don't speculate much; I'm more into finding and researching the actual stuff.


There is also a possibility of intentional misinformation... but the frightening mystery is still the dev kit story of it being less powerful than "announced".

Also, the biggest concern is memory. 1GB is, I think, not enough of a leap over the PS360; 1.5GB seems reasonable and I hope for nothing less.

My observation is also that Nintendo is staying in the game for the long term, and the memory amount is very crucial. Look at what's happening with Rage on consoles: the new game from id Software running on id Tech 5, which makes trade-offs that require fast memory streaming and lots of space since it uses unique textures without repeating tiles, struggles on PS3, where the PS3 OS takes a bigger chunk of memory than on the 360. Overall it was a real push case for Carmack, as he always complained about the lack of memory and poor I/O speeds and buffers.

Anyone who wants to know more, just ask... of course, more details on the GPU stuff when I get back.
Later...
 