Wii U hardware discussion and investigation *rename

Question: Has anyone with the resources considered benchmarking games on Wii U and PC, starting with R600 cards and moving up to later cards of similar specs, to see which cards perform closest to the Wii U? Or would RAM and CPU be too much of a factor to get any accuracy? Just curious...
 


Well, considering we would have to match something like a Pentium M CPU with an AMD HD 6450 GPU and 2 GB of DDR3 memory, I think we would find the Wii U overachieving. A PC with those components would be hard-pressed to run a game like COD: Ghosts nearly as well as the Wii U does. Basically, the customizations that Nintendo and AMD made give far better results than off-the-shelf PC parts.
 
Mario Kart's only lasting appeal is as a multiplayer party game; racing games have virtually no lasting appeal for most people except racing gearheads, and those generally care about real sports cars, not funny-looking comic-book karts being raced by anthropomorphic animals and cartoon characters.

Also, there have been HOW many Mario Karts now? How big can demand for this thing really be? Meanwhile, where are the games with actual plot and depth...? Nintendo should have been working on an action/adventure Kid Icarus from the day the wuu was dreamt up. ...And a Metroid game too. Bah!

What's with all this talk of games and business-end stuff? Has the hard info dried up over here too? Anyways, if you are trying to make me cry, it's working.

I'd give anything for any new content more engaging than going left to right....


ANYWAYS. What are some thoughts on what the Espresso workload looks like now that it's not paired with a fixed-function combiner?

The 750 used to handle the entirety of geometry, and in games like Rogue Leader and RE4 (and I'm sure many others) it calculated point lights as well.

The GPU is now a lot more capable on that end, so I'd assume there would be a rather significant change in how to think about organizing the workload.

Any interesting thoughts from anyone familiar with the Cube or Wii on how they would change things up?
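
An aside for anyone who never touched the fixed-function era: below is a minimal, purely illustrative C sketch of the kind of per-vertex point-light accumulation a Gekko-class CPU could have been doing once a scene needed more lights than Flipper's hardware offered. It is not actual GameCube/Wii code; the struct, names and the linear falloff are assumptions made up for illustration.

[code]
#include <math.h>

typedef struct { float x, y, z; } Vec3;

/* Hypothetical per-vertex point light evaluated on the CPU.
 * Fixed-function hardware offered a limited number of hardware lights;
 * anything beyond that had to be baked into per-vertex colors by the
 * CPU before the vertices were handed to the GPU. */
static float point_light_intensity(Vec3 vtx_pos, Vec3 vtx_normal,
                                   Vec3 light_pos, float radius)
{
    Vec3 d = { light_pos.x - vtx_pos.x,
               light_pos.y - vtx_pos.y,
               light_pos.z - vtx_pos.z };
    float dist = sqrtf(d.x * d.x + d.y * d.y + d.z * d.z);
    if (dist >= radius || dist == 0.0f)
        return 0.0f;

    /* N . L diffuse term (assumes a unit-length normal) */
    float ndotl = (vtx_normal.x * d.x + vtx_normal.y * d.y +
                   vtx_normal.z * d.z) / dist;
    if (ndotl < 0.0f)
        ndotl = 0.0f;

    /* simple linear falloff toward the light's radius */
    return ndotl * (1.0f - dist / radius);
}
[/code]

The only point of the sketch is that every extra light multiplies the per-vertex CPU work, which is why moving lighting onto a shader-capable GPU changes how you would budget CPU time.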
 
Well, considering we would have to match something like a Pentium M CPU with an AMD HD 6450 GPU and 2 GB of DDR3 memory, I think we would find the Wii U overachieving. A PC with those components would be hard-pressed to run a game like COD: Ghosts nearly as well as the Wii U does. Basically, the customizations that Nintendo and AMD made give far better results than off-the-shelf PC parts.

Well surely, as the Pentium M is single core and the Wii U triple core.
 

My bad; I just remembered seeing a benchmark that put the PPC750 pretty close, clock for clock, to a Pentium M processor. I suppose a Core 2 would probably be closer in terms of performance.

@Creaks

This is why I am more interested in hearing from developers about the nitty-gritty details of development on the hardware. When you look at games like Rogue Squadron 2 and 3 on Gamecube, it's tough to understand why there are so many issues with the Wii U's tri-core design. NDAs suck, because they keep fans from getting real answers from developers, and you have to piece things together. The thing I have come to accept is that the videogame industry is a business first and foremost, and Nintendo's hardware tends to be the odd man out.
 
The 750 used to handle the entirety of geometry, and in games like Rogue Leader and RE4 (and I'm sure many others) it calculated point lights as well.
CPUs in Wii and Gamecube (tossed both of mine out today btw, since I don't have any Gamecube games anymore except for Metroid Prime 1/2, and wuu runs Wii games (including MP Trilogy) flawlessly and in 480p) most likely only did geometry for skinned characters; not sure if it actually did any lighting at all. The GPU had pretty solid fixed-function T&L stuff, except for the aforementioned lack of skinning/geometry shader support.

The GPU is now a lot more capable on that end, so I'd assume there would be a rather significant change in how to think about organizing the workload.
Oh, I'd expect rendering to be completely different in every single way for a wuu title compared to previous nintendo games. Feature-wise, the wuu is about four GPU generations ahead of wii, and the one thing they both have in common is they both rasterize textured polygons, and that's it... :p

Of course, how wonky wuugpu may be under the hood depends on how much of the wii backwards compatibility is 'visible' when the thing is running in wuu mode. It may be that functionally it's pretty much a standard radeon 4k series with eDRAM tacked on when running in wuu mode (and devs probably don't bang the hardware directly anyway but rather access some OGL variant.)
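
To make "geometry for skinned characters" concrete, here is a minimal sketch of CPU-side linear-blend skinning, the sort of per-vertex work that had to stay on the CPU because Flipper/Hollywood had no programmable vertex stage. Again purely illustrative: the two-bone limit, struct layout and names are assumptions, not how any actual engine did it.

[code]
typedef struct { float m[3][4]; } Mat3x4;   /* rotation + translation */
typedef struct { float x, y, z; } Vec3;

typedef struct {
    Vec3  pos;          /* bind-pose position            */
    int   bone[2];      /* indices of influencing bones  */
    float weight[2];    /* weights, assumed to sum to 1  */
} SkinnedVertex;

static Vec3 transform_point(const Mat3x4 *m, Vec3 p)
{
    Vec3 r;
    r.x = m->m[0][0]*p.x + m->m[0][1]*p.y + m->m[0][2]*p.z + m->m[0][3];
    r.y = m->m[1][0]*p.x + m->m[1][1]*p.y + m->m[1][2]*p.z + m->m[1][3];
    r.z = m->m[2][0]*p.x + m->m[2][1]*p.y + m->m[2][2]*p.z + m->m[2][3];
    return r;
}

/* CPU skinning: every animated vertex is transformed by its bone
 * matrices and blended, every frame, before being sent to the GPU. */
static void skin_vertices(const SkinnedVertex *in, Vec3 *out, int count,
                          const Mat3x4 *bones)
{
    for (int i = 0; i < count; i++) {
        Vec3 a = transform_point(&bones[in[i].bone[0]], in[i].pos);
        Vec3 b = transform_point(&bones[in[i].bone[1]], in[i].pos);
        out[i].x = a.x * in[i].weight[0] + b.x * in[i].weight[1];
        out[i].y = a.y * in[i].weight[0] + b.y * in[i].weight[1];
        out[i].z = a.z * in[i].weight[0] + b.z * in[i].weight[1];
    }
}
[/code]

On a shader-capable GPU like Latte the same blend moves into the vertex shader, so the CPU only has to upload the bone matrices each frame.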
 

Sorry, I was poorly referencing, in passing, an interview with Factor 5 that said they used Gekko for lighting past Flipper's 8 hardware lights.

I would dig it up, but I'm lazy and it's rather irrelevant for wuu.

I guess what I'm really wondering is: is Latte handling the majority of geometry in the wuu games out right now?

And if so, how much could a tri-core Espresso add to the equation, and at what expense to other aspects?
 
Luigi's Mansion uses the CPU for lighting something. The flashlight maybe? I was at an IBM presentation long ago where they were talking GameCube, and I think they mentioned that.
 
Sorry, I was poorly referencing, in passing, an interview with Factor 5 that said they used Gekko for lighting past Flipper's 8 hardware lights.
F5 interviews were regurgitated ad nauseam on this and undoubtedly other forums at the time, and I must say I don't recall ever seeing that claim before. Maybe I did and just forgot; it's possible. I just can't recall it. Do you have an idea which game of theirs they were talking about? The later Rogue Squadron games pushed the system progressively harder, technically anyway if not so much in gameplay. :p

I guess what I'm really wondering is: is Latte handling the majority of geometry in the wuu games out right now?
I'd be surprised if any geometry stuff at all is done on CPU in wuu. The GPU should be vastly faster and more than capable enough feature-wise for that kind of work.

Doing geometry on the CPU tended to kill the CPU pretty hard in the past; before T&L GPUs arrived, it was often the bottleneck. And while wuucpu is much faster than the PC chips of that era, it would probably be a proportionally much heavier burden for wuu, seeing as today's games are vastly more geometry-heavy. Hell, you can have as many polys in one character model today as in an entire game level from the GLQuake generation... ;)
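
To put a rough number on that, here is a back-of-envelope estimate; every figure (cycles per vertex, vertex counts, and treating one ~1.24 GHz Espresso core as the budget) is an illustrative assumption, not a measurement.

[code]
#include <stdio.h>

int main(void)
{
    /* All numbers are illustrative assumptions, not measurements. */
    const double clock_hz          = 1.24e9;  /* ~Espresso core clock      */
    const double cycles_per_vertex = 60.0;    /* transform + light, a guess */

    const double verts_quake_era   = 10e3;    /* whole scene, late '90s    */
    const double verts_modern      = 500e3;   /* one detailed modern scene */

    double ms_old    = verts_quake_era * cycles_per_vertex / clock_hz * 1e3;
    double ms_modern = verts_modern    * cycles_per_vertex / clock_hz * 1e3;

    printf("CPU geometry, old-style scene:    %.2f ms/frame\n", ms_old);
    printf("CPU geometry, modern-style scene: %.2f ms/frame\n", ms_modern);
    return 0;
}
[/code]

Even with generous assumptions, the modern-sized scene blows well past a 16.7 ms frame on a single core, which is the "proportionally heavier burden" point above.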
 

Thanks for the insight. Sounds good to me. I didn't expect the CPU to take over geometry; I just feel a Wii game or two's worth of extra geometry, as icing on top of the GPU's cake, could make an interesting difference.
 
Of course, how wonky wuugpu may be under the hood depends on how much of the wii backwards compatibility is 'visible' when the thing is running in wuu mode. It may be that functionally it's pretty much a standard radeon 4k series with eDRAM tacked on when running in wuu mode (and devs probably don't bang the hardware directly anyway but rather access some OGL variant.)

I've thought the eDRAM is there especially for running in Wii compatibility mode. See, the Wii has 3 MB of embedded 1T-SRAM (not eDRAM), 24 MB of 1T-SRAM, which is something of a special memory, and then GDDR3 as the "slow" memory.

The eDRAM is perfect for housing the 1T-SRAM's contents; if it didn't, there would be latency issues, and that would mess up the old games or their timing.
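
For reference, these are the Wii-side pools a backwards-compatibility mode has to stand in for. The sizes are the documented Wii figures; how the Wii U actually backs each one is the speculation in this thread, not an established fact.

[code]
/* Wii memory pools a backwards-compatibility mode has to emulate.
 * Sizes are approximate documented Wii figures; the Wii U mapping
 * discussed in this thread is speculation. */
typedef struct {
    const char *name;
    unsigned    size_mb;
    const char *notes;
} MemPool;

static const MemPool wii_pools[] = {
    { "embedded framebuffer (EFB)", 2,  "on-die, lowest latency"        },
    { "embedded texture cache",     1,  "on-die"                        },
    { "MEM1 (1T-SRAM)",            24,  "external but very low latency" },
    { "MEM2 (GDDR3)",              64,  "the 'slow' pool"               },
};
[/code]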
 

Yeah, the Wii U has several pools of embedded RAM, actually directly above the eDRAM, each smaller and faster than the last. The smaller ones likely cover the Wii's small embedded 1T-SRAM pools, while the 32 MB of eDRAM acts as the 24 MB of 1T-SRAM, and the DDR3 stands in for the Wii's GDDR3.

Speaking of the 32 MB of eDRAM, I found a picture big enough to see the pins.

http://www.joesiv.net/fourthstorm/WiiU-GPU_enhanced_blocks.jpg

So, I see 16 pins per cell on that eDRAM (you only count one side, right, not both top and bottom pins?), 8 cells form one main block, and 8 blocks lead to a total of 1024 pins.

Yay? Nay?

Er... I guess that's about 8x less than the 8192 figure I just saw tagged at the bottom of the page. (Am I off by 8-fold? I really don't see how that could be practical. Was this already found out?)
 
Sorry XD. Thanks for the save, Shifty. So, eDRAM bandwidth looks to be around 70 GB/s?
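
Running the numbers behind that figure, under the thread's assumptions (a 1024-bit interface from the pin count above, and the commonly assumed ~550 MHz GPU clock; neither is a confirmed spec):

[code]
#include <stdio.h>

int main(void)
{
    /* Both inputs are thread assumptions, not confirmed specs. */
    const double bus_width_bits = 1024.0;   /* from the pin count above   */
    const double clock_hz       = 550e6;    /* commonly assumed GPU clock */

    double bytes_per_sec = bus_width_bits / 8.0 * clock_hz;
    printf("eDRAM bandwidth, 1024-bit case: %.1f GB/s\n", bytes_per_sec / 1e9);

    /* For comparison, an 8192-bit interface at the same clock: */
    printf("eDRAM bandwidth, 8192-bit case: %.1f GB/s\n",
           8192.0 / 8.0 * clock_hz / 1e9);
    return 0;
}
[/code]

So the 1024-bit reading lands on the ~70 GB/s mentioned above, while the 8192 figure would imply roughly 563 GB/s.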
 
Probably a stupid question, but there are 16 pins on each side of the eDRAM cell, so is that still just 16 pins? And 70 GB/s seems pretty reasonable for the requirements of the console. Even if it were 140 GB/s, would that even make a difference on a GPU that is so modest in performance?
 
You can't look at a photo of an acid-etched die and figure out the bandwidth of internal components of an integrated chip. Doesn't work like that.
 
Because the features of the chip that dictate its performance are simply way, way too small and complex for the human eye to see (for starters)...? :)
 