Wii U hardware discussion and investigation *rename

BGassassin's leaked developer info also supported the 160 SPU count, and the mods at NeoGAF backed it up. It doesn't matter; the 176 GFLOPS GPU powering the Wii U outperforms the 252 GFLOPS GPU powering the 360, and developers are backing that up. How can that be? Smarter people than me would have to do the explaining.
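For reference, here's a minimal sketch (Python) of where figures like these come from, assuming the usual peak-rate rule of thumb of 2 FLOPs (one multiply-add) per shader ALU per clock; the 160 SP / 550 MHz case is the estimate being argued in this thread, and the 360 row uses the commonly quoted 240 ALUs at 500 MHz, which is why numbers between 240 and 252 GFLOPS get thrown around for Xenos:

# Peak GFLOPS = shader ALUs * clock (GHz) * 2 (one multiply-add per ALU per clock)
def peak_gflops(shader_alus, clock_mhz):
    return shader_alus * (clock_mhz / 1000.0) * 2.0

print(peak_gflops(160, 550))  # 176.0 -> the "176 GFLOPS" Wii U figure (160 SPs @ 550 MHz)
print(peak_gflops(320, 550))  # 352.0 -> what 320 SPs at the same clock would give
print(peak_gflops(240, 500))  # 240.0 -> Xenos ballpark (48 shaders x 5 ALUs @ 500 MHz)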

Somebody on B3D said the 360 GPU was 200 Gflops...

The rest could be explained by having 2008 GPU tech rather than 2005?

32MB of eDRAM vs 10MB might help a lot too. I think 32MB was arguably overkill, really!
 
But that's just your feels.

You can actually look at the chip and count the register banks in each SIMD block. Plus 320 doesn't fit in the space on the die, even on TSMC 40 nm. And it wasn't made at TSMC, it was made at Renesas on a less dense process. And the chip was laid out without AMD's TSMC optimised high density layout tools and shit.

Even on the PC, a bandwidth-crippled 320-shader part can bring all kinds of hurt down on the 360, whereas the Wii U ... doesn't.

(WiiU only appears to have 8 TMUs too - which once again fits with 160 shaders and not a unit more).

In shader-limited stuff the Xbone will be very roughly ten times faster than the Wii U, and far more than that in terms of compute. The PS4 is about 50% faster than the Xbone. Taken across all three platforms, the 4Bone are close together while the Wii U is trailing so far behind that the PS360 are holding its hands.
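As a rough sanity check on those ratios, here's a quick sketch using the commonly quoted ALU counts and clocks for the newer consoles and the 160 SP reading of Latte from this thread (peak ALU throughput only, so it ignores bandwidth, ROPs and everything else):

# Peak GFLOPS = ALUs * clock (GHz) * 2 (multiply-add per ALU per clock)
wiiu  = 160  * 0.550 * 2   # ~176 GFLOPS (this thread's 160 SP estimate)
xbone = 768  * 0.853 * 2   # ~1310 GFLOPS (12 CUs x 64 ALUs @ 853 MHz)
ps4   = 1152 * 0.800 * 2   # ~1843 GFLOPS (18 CUs x 64 ALUs @ 800 MHz)

print(xbone / wiiu)   # ~7.4x -- "very roughly ten times" territory
print(ps4 / xbone)    # ~1.4x -- PS4 about 40-50% ahead in raw ALU throughput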

I looked at NeoGAF's Wii U GPU thread and the die shot, and I counted 32 SRAM banks per SIMD/SPU block. Also, it is 16 and not 8 TMUs, since for example the Radeon HD 4870 has 4 TMUs per CU, and AMD's Bobcat (with its Radeon HD 5000 based IGP) also has four TMUs per CU.

4 TMUs per CU is present in the Radeon HD 4000, 5000 and 6000 series GPUs, and also in the Bobcat and Trinity/Richland IGPs. That is the same thing we see in the Wii U's die shot, and you forget that the Wii U's GPU is highly customized according to Chipworks, since they can't match it to any known AMD GPU.

I used a Radeon HD 4870 die shot and compared it to the Wii U GPU (Latte) die shot; I spotted similarities and was able to work out the exact number of TMUs. For each CU block, or whatever it is, there is one block (or whatever it is) of 4 TMUs.

It is what it is... You can try to deny it, but the official specs from the 4000 to 6000 series, plus the Bobcat and Trinity/Richland GPUs, say otherwise. Oh yeah... Same for Llano IGPs.
 
I still have the 320SP Radeon 4670 around. It's in my TV computer and I occasionally game on it at 1360x768. It's not really a card you want to be stuck with anymore. It probably does beat up PS360 but still... Loading up PS360 gives a different impression than bumming around with old PC hardware though, because the games are already precisely tailored for the hardware.

My point is don't overestimate 320 SP VLIW5. ;) Faster but it doesn't really matter in the grand scheme. If WiiU is 160 SP however, that is pretty terrible.
 
I still have a Radeon 4670 around. I occasionally game on it at 1360x768. It's not really a card you want to be stuck with anymore. Probably does beat up PS360 but still it is barely adequate even for say Bioshock 2 or FEAR 2.

That GPU that you have is exactly as powerful as two Xbox 360 GPUs combined. Fun fact: it has 8 TMUs per CU, and if it were shrunk to a 40 nm node and downclocked to 550 MHz, which is the exact same clock as the Wii U's GPU, then it would have a TDP of 28.725 watts.
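For what it's worth, here's the kind of napkin math that lands you in that ballpark. The HD 4670 baseline (~59 W TDP at 750 MHz on 55 nm) and the crude scaling rules (dynamic power roughly linear in clock, capacitance roughly scaling with die area) are my assumptions for the sake of the sketch, not anything official, and real numbers also depend on voltage and leakage:

# Napkin-math TDP scaling for a hypothetical HD 4670 shrink.
base_tdp_w     = 59.0    # assumed HD 4670 TDP
base_clk_mhz   = 750.0   # HD 4670 core clock
target_clk_mhz = 550.0   # Wii U GPU clock

freq_scaled   = base_tdp_w * (target_clk_mhz / base_clk_mhz)   # ~43 W from the downclock alone
shrink_scaled = freq_scaled * (40.0 / 55.0) ** 2               # ~23 W after a 55 nm -> 40 nm area shrink

print(freq_scaled, shrink_scaled)
# Depending on how much voltage scaling you credit, you land somewhere in the
# 20-30 W range, i.e. the same ballpark as the 28.725 W figure quoted above.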

I wonder if VRAM power is also accounted for in the TDP, or if it's the TDP for the chip itself and not for the entire graphics card as a whole.
 
(WiiU only appears to have 8 TMUs too - which once again fits with 160 shaders and not a unit more).

Please link me to said article. Furthermore, the speculation has to be at least more recent than the Wii U itself. I already posted a "dated" article (one that was released MONTHS after the Wii U's launch) suggesting this 320 SPU theory.

I'd like to see your bargain. Otherwise I'm sticking with
320 SPUs 352 GFLOPS
16 TMUs 8.8 GT/s
8 ROPs 4.4 GP/s
Let's not forget IT'S AN AMD HD 4xxx/5xxx card. I don't believe the architectural analyses of the R700 lie, since they're consistent. My "emotional" specs are consistent with the AMD R700 architecture.
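For what it's worth, all three of those throughput numbers follow mechanically from a 550 MHz clock, assuming 2 FLOPs per SP, one texel per TMU and one pixel per ROP per clock; a quick sketch:

clock_ghz = 0.550              # Wii U GPU clock

print(320 * clock_ghz * 2)     # 352.0 GFLOPS (320 SPs x 2 FLOPs/clock)
print(16  * clock_ghz)         # 8.8 GT/s     (16 TMUs x 1 texel/clock)
print(8   * clock_ghz)         # 4.4 GP/s     (8 ROPs x 1 pixel/clock)

# With the 160 SP / 8 TMU reading of the die discussed above, the same arithmetic
# gives 176 GFLOPS and 4.4 GT/s, with the ROP figure unchanged.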

Mario Kart not being 1080p is certainly a debatable issue. I thought I remembered reading somewhere that it was going to be 1080p, but looking at it, the comments seem ambiguous (either way it's incredibly clean). At least from what I can make out there doesn't seem to be MSAA (or it's a very low sample count), so I'd think it's 1080p with no AA.
 
It is quite funny to think that if one linked an A4-5000 / Kabini to GDDR5 and tweaked the clocks a tad (lower the CPU clock further, up the GPU speed), one could outdo the PS3, the 360 and the Wii U with minimal R&D effort.
If you went for 1GB of GDDR5 and a reasonable memory budget for the OS (say 128MB, 4x what the 360 uses), you would have a system quite possibly cheaper to produce than any of the three and more efficient in every way :8o:

Yes! I've thought the same thing myself. There are a few systems you could make with a Jaguar and GDDR5 base (iirc Jaguar mem controller can handle GDDR5 ... or maybe that was just Kaveri?).

I know Nintendo didn't want to wait to launch, but early 2013 could have seen them with an almost passively cooled, Wii U-level system - say a 1 GHz quad core with a 550 MHz GPU - or they could have gone balls out and put in a 2.05 GHz CPU and a 1 GHz GPU, with 2GB of 44 GB/s GDDR5 + 1GB of DDR3 hanging off a little bus to use as a disk cache.
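As an aside on the 44 GB/s figure: GDDR5 bandwidth is just bus width times effective per-pin data rate. A minimal sketch, with the 64-bit-bus-at-5.5-Gbps configuration being my assumption for the example (a 128-bit bus at half the rate gives the same number):

def gddr5_bandwidth_gbs(bus_width_bits, data_rate_gbps_per_pin):
    # GB/s = (bus width in bytes) * effective per-pin data rate in Gbps
    return (bus_width_bits / 8.0) * data_rate_gbps_per_pin

print(gddr5_bandwidth_gbs(64, 5.5))    # 44.0 GB/s
print(gddr5_bandwidth_gbs(128, 2.75))  # 44.0 GB/s -- wider but slower bus, same bandwidth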

Obviously wouldn't have fit with the goals of BC and using familiar CPU cores, but would have kicked the socks off PS36U.
 
Please link me to said article. Furthermore, the speculation has to be at least more recent than the Wii U itself. I already posted a "dated" article (one that was released MONTHS after the Wii U's launch) suggesting this 320 SPU theory.

Richard Leadbetter is awesome, but the article you linked to is dated and analysis of the die has moved on since then.

I'm not aware of any publicly available articles that contain a fraction of the analysis that's actually gone on in this thread.

Try searching through and finding the parts where this was worked out.

I'd like to see your bargain. Otherwise I'm sticking with
320 SPUs 352 GFLOPS
16 TMUs 8.8 GT/s
8 ROPs 4.4 GP/s

Well you can if you want, but you're wrong, and what's more you're ignoring the actual die to stick with that. If you're happy with that then go right ahead!

If you're not, try having a scan at different VLIW5 die shots and counting things. And also do some measurements, and remember that this isn't even a TSMC chip laid out by AMD ...

Let's not forget IT'S AN AMD HD 4xxx/5xxx card. I don't believe the architectural analyses of the R700 lie, since they're consistent. My "emotional" specs are consistent with the AMD R700 architecture.

It's not a "card". And your emotions seem unable to distinguish between architecture and configuration.

Mario Kart not being 1080p is certainly a debatable issue. I thought I remembered reading somewhere that it was going to be 1080p, but looking at it, the comments seem ambiguous (either way it's incredibly clean). At least from what I can make out there doesn't seem to be MSAA (or it's a very low sample count), so I'd think it's 1080p with no AA.

Once the game is out you'll be able to calculate the resolution of the finished product (just as you can calculate the resolution of the media released so far) by examining the screen. Once again, no feels necessary. ;)
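For anyone who hasn't seen it done, the usual pixel-counting trick is to find a long, slightly slanted polygon edge in a screenshot and count the stair-steps: on an upscaled image each step corresponds to roughly one native scanline. A minimal sketch of the arithmetic, with the counts below being purely hypothetical and assuming a clean edge that AA hasn't blended away:

def estimate_native_height(output_height, steps_counted, span_in_output_rows):
    # Steps per output row approximates the native/output vertical scale factor.
    return output_height * steps_counted / span_in_output_rows

print(estimate_native_height(1080, 60, 90))  # 720.0  -> looks like a 720p render upscaled to 1080p
print(estimate_native_height(1080, 90, 90))  # 1080.0 -> ~1 step per row means native 1080p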
 
Certainly you're a tough cookie, and apparently I have to be really careful with my wording so it doesn't get misinterpreted (C++ 101 all over again). At this point I'll just lay this thread to rest and wait for new findings. At the least I'll research the die size comparisons and see the truth or falsehood of your statement.

Until it's concretely proven otherwise (since I'm missing the physical aspect of the GPU chip, there is also a hole in my point), I'll stick with 320 SPUs.
 
I agree, the X1 has been lumped together with the PS4 for a long time now, and it's about time that changed. The X1 is struggling to really separate itself from the Wii U in the way the PS4 does. The PS4 can not only surpass the graphical fidelity of the Wii U by a large margin, but also do it at native 1080p. The X1, as it stands, seems like it would struggle to run a top-tier 720p 30fps Wii U game at 1080p 60fps, whereas the PS4 could easily do that. In my opinion, the PS4 is the only console that has made the full leap to next gen; the other two are making far too many compromises to be considered next gen, in terms of graphics anyway.

I disagree. In fact, the gap in power between the PS4 and Xbox One isn't any different than the gap between the PS2 and the original Xbox. As for the Xbox One struggling to separate itself from the Wii U, you have to be joking. Take one title that's on all three platforms mentioned above and look at the real differences. Let's say AC4: the X1 and PS4 share the exact same assets, textures, poly count, shadows and particle effects. In fact the PS4 and X1 versions are identical except in resolution, PS4 at 1080p and X1 at 900p. Then go look at the Wii U version: it shares all the assets of the PS360 versions, at the same resolution, but it can't keep up with the same frame rates.

In fact this goes for all multiplats on the X1 and PS4; they all share the same exact assets, textures, shaders, polys and so on, and most of them share the same resolution. When there are differences, it's resolution or frame rates. I understand the PS4 has 6 more compute units, but with both next-gen systems having the same CPU and amount of RAM, they are in the same gen or class.

I'm not sure if you have ever played any Xbox One games, but if you had, you would know that the Wii U cannot compare graphically in any way to what even X1 launch titles look like. I know that in this day and age the internet is really pushing game resolution, but it's not what makes a game next gen. The increased draw distance, texture quality, lighting, shading, poly count and characters on screen are what make a game next gen graphically.
 
I'll add this from bgassasin, who reportedly confirmed the 160 SPs through developer docs

http://www.neogaf.com/forum/showthread.php?t=710765

http://www.neogaf.com/forum/showpost.php?p=89465619&postcount=550




He also supposedly vetted his info with a mod before posting it.

That's very much what I'm looking for. It helps me see things from a different perspective. Though I've only glanced at it since it's night, it'll certainly be an interesting read.

calguy said:
The HD 6450 has 160 shaders.
I don't recall the HD 4xxx or 5xxx series being the same as 6xxx.
 
I disagree. In fact, the gap in power between the PS4 and Xbox One isn't any different than the gap between the PS2 and the original Xbox.
PS2->Xbox had a considerable (huge, really) features gap in addition to a significant performance gap. Xbone and PS4 are both matched evenly on features - except for when it comes to audio, which frankly is not important.

I don't recall the HD 4xxx or 5xxx series being the same as 6xxx.
You can trust what Cal_guy posts... He works for AMD. :D (IE: the product in question is probably a re-badged 4/5xxx series dealie.)
 
I don't recall the HD 4xxx or 5xxx series being the same as 6xxx.

I thought the Radeon 6000 series was nearly the same as the 5000 series, barring the 6970 and the Trinity APU. It's really almost the same; I forget what the changes are. A bigger buffer somewhere, stuff like that.
In fact, when you say "an HD 4xxx/5xxx GPU", there's so much difference between those two (even though they're closely related) that the 5000 and 6000 series can be considered the same.
 
I disagree. In fact, the gap in power between the PS4 and Xbox One isn't any different than the gap between the PS2 and the original Xbox. As for the Xbox One struggling to separate itself from the Wii U, you have to be joking. Take one title that's on all three platforms mentioned above and look at the real differences. Let's say AC4: the X1 and PS4 share the exact same assets, textures, poly count, shadows and particle effects. In fact the PS4 and X1 versions are identical except in resolution, PS4 at 1080p and X1 at 900p. Then go look at the Wii U version: it shares all the assets of the PS360 versions, at the same resolution, but it can't keep up with the same frame rates.

In fact this goes for all multiplats on the X1 and PS4; they all share the same exact assets, textures, shaders, polys and so on, and most of them share the same resolution. When there are differences, it's resolution or frame rates. I understand the PS4 has 6 more compute units, but with both next-gen systems having the same CPU and amount of RAM, they are in the same gen or class.

I'm not sure if you have ever played any Xbox One games, but if you had, you would know that the Wii U cannot compare graphically in any way to what even X1 launch titles look like. I know that in this day and age the internet is really pushing game resolution, but it's not what makes a game next gen. The increased draw distance, texture quality, lighting, shading, poly count and characters on screen are what make a game next gen graphically.

I get your point, but you can't judge the performance of Nintendo's Wii U by looking at simple ports of the Xbox 360/PlayStation 3 versions, and we all know how big the architectural differences from the Xbox 360/PlayStation 3 are; they are huge. To properly utilize the Wii U's hardware you would need to recode a substantial amount of the game, and publishers/developers simply don't want to make a decent port; they gimp their games, for god's sake, on Nintendo's platforms.

It is cheaper to port the Xbox 360/PlayStation 3 version of a game than the Xbox One/PlayStation 4 version because, to put it bluntly, the XO/PS4 architectures are alien compared to the Wii U, while the X360/PS3 architectures are its distant relatives, in a manner of speaking.

Anyway...

Compare White Knight Chronicles with Monolith Soft's X, Little Big Planet Kart with Mario Kart 8, PlayStation All-Stars Battle Royale with Super Smash Bros U... Do you see the leap that you are asking for? Don't deny it, you do.

Read this post, in which I point out the differences between White Knight Chronicles and Monolith Soft's X in terms of visual fidelity: Post
 
cheaping out on hardware is part of nintendo's whole design philosophy. it's called lateral thinking with withered technology, essentially using old tech in innovative ways. it shouldn't be a surprise that they would choose a low end graphics part for the wii u, but then again art style and tailoring specifically for the tech will ensure that games can still look good on nintendo's system. resident evil on the gamecube shows what having a great art direction can achieve.

p.s caps lock genuinely broken.
 
To properly utilize the Wii U's hardware you would need to recode a substantial amount of the game, and publishers/developers simply don't want to make a decent port; they gimp their games, for god's sake, on Nintendo's platforms.
If the wuu were beyond last gen's consoles to any significant degree at all, it wouldn't matter that ports were running gimpy code; they'd run faster regardless (it's six to seven years more recent - ON PAPER, which according to Moore should mean ~400% greater hardware power, all other things being equal). However, the wuu is simply a crippled piece of hardware whichever way you look at it, and that's why ports stutter on it.

I don't understand why we're still discussing it, it's obvious. Everything's been said already, and yet some people feel the need to ride to the console's rescue, making excuses for it.

Compare White Knight Chronicles with Monolith Soft's X, Little Big Planet Kart with Mario Kart 8, PlayStation All-Stars Battle Royale with Super Smash Bros U... Do you see the leap that you are asking for? Don't deny it, you do.
Would you stop making these fannish comparisons of games with wildly different art and settings to try and prove one console's superiority over another? Thank you. You see what you want to see - we get it. Subjective preference and all; having opinions is OK. However, technical thread; not subjective thread.
 
cheaping out on hardware is part of nintendo's whole design philosophy. it's called lateral thinking with withered technology, essentially using old tech in innovative ways.

This is true. "Withered technology" is actually a term Nintendo uses, and it's part of their philosophy. Sad but true. The basic idea is to take old, ubiquitous, "well understood" tech and use it in some unique new form or fashion. In other words the actual tech matters little to none; only what's done with it matters (i.e. the Wuu GamePad, the Wiimote, etc.). The purpose of the underlying tech is to be cheap, not get in the way, and be "well understood" (aka aged). Cutting edge would be a terrible thing, as it would not be "well understood" for programmers.

So nobody should be surprised at Nintendo hardware really. And viewed in that way we see why Wii remnants are strewn through Wii U. After all, they're familiar with it.
 
nintendo's philosophy is essentially 'it's not about how powerful or cutting edge the tech is, it's what you do with it that counts.'
 