Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by TheAlSpark, Jul 29, 2011.

Thread Status:
Not open for further replies.
  1. yewyew

    Newcomer

    Joined:
    Aug 29, 2007
    Messages:
    70
    Likes Received:
    9
    Question: Has anyone with the resources considered doing benchmarks of games on Wii U and PC, using R600 cards to start and moving up to later cards of similar stats, to see which card performs closest to the Wii U? Or would RAM and CPU be too much of a factor to get any accuracy? Just curious...
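
    As an aside on methodology: the PC side of such a comparison usually boils down to timing a fixed, repeatable camera path on each candidate card. A minimal sketch of that kind of frame-time harness in standard C++ follows; render_one_frame() is a hypothetical placeholder, not a real API:

    [code]
    #include <chrono>
    #include <cstdio>
    #include <vector>

    // Time a fixed number of frames along a repeatable camera path and
    // report the average frame time; running the same path on each
    // candidate GPU gives a rough like-for-like number.
    int main() {
        using clock = std::chrono::steady_clock;
        std::vector<double> frame_ms;
        for (int frame = 0; frame < 1000; ++frame) {
            auto t0 = clock::now();
            // render_one_frame();  // placeholder for the actual game frame
            auto t1 = clock::now();
            frame_ms.push_back(
                std::chrono::duration<double, std::milli>(t1 - t0).count());
        }
        double sum = 0.0;
        for (double ms : frame_ms) sum += ms;
        std::printf("avg frame time: %.2f ms\n", sum / frame_ms.size());
    }
    [/code]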
     
  2. Goodtwin

    Veteran Newcomer Subscriber

    Joined:
    Dec 23, 2013
    Messages:
    1,145
    Likes Received:
    610

    Well, considering we would have to match something like a Pentium M CPU with an AMD HD 6450 GPU and 2 GB of DDR3 memory, I think we would find the Wii U overachieving. A PC with those components would be hard pressed to run a game like COD: Ghosts nearly as well as the Wii U does. Basically, the customizations that Nintendo and AMD made give far better results than off-the-shelf PC parts.
     
  3. creaks

    Newcomer

    Joined:
    Apr 9, 2013
    Messages:
    81
    Likes Received:
    0
    What's with all this talk of games and business-end stuff? Has hard info dried up over here too? Anyways, if you are trying to make me cry, it's working.

    I'd give anything for any new content more engaging than going left to right....

    ANYWAYS. What are some thoughts on what the Espresso workload looks like now that it's not paired with a fixed-function combiner?

    The 750 used to handle the entirety of geometry, and in games like Rogue Leader and RE4 (and I'm sure many others) it calculated point lights as well.

    The GPU is now a lot more capable on that end, so I'd assume there would be a rather prevalent change in how to think about organizing the workflow.

    Any interesting thoughts from anyone familiar with the Cube or Wii on how they would change things up?
     
  4. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    Well surely, as the Pentium M is single core and the Wii U triple core.
     
  5. Goodtwin

    Veteran Newcomer Subscriber

    Joined:
    Dec 23, 2013
    Messages:
    1,145
    Likes Received:
    610
    My bad, I just remembered seeing a benchmark that put the PPC750 clock for clock pretty similar to a Pentium M processor. I suppose a Core 2 would probably be closer in terms of performance.

    @Creaks

    This is why I am more interested in hearing from developers about the nitty-gritty details of development on the hardware. When you look at games like Rogue Squadron 2 and 3 on GameCube, it's tough to understand why there are so many issues with the Wii U's tri-core design. NDAs suck, because they keep fans from getting real answers from developers, and you have to piece things together. The thing I have come to accept is that the videogame industry is a business first and foremost, and Nintendo's hardware tends to be the odd man out.
     
    #5805 Goodtwin, Feb 28, 2014
    Last edited by a moderator: Feb 28, 2014
  6. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,174
    Location:
    La-la land
    The CPUs in the Wii and Gamecube (tossed both of mine out today btw, since I don't have any Gamecube games anymore except for Metroid Prime 1/2, and the wuu runs Wii games (including MP Trilogy) flawlessly and in 480p) most likely only did geometry for skinned characters; I'm not sure they actually did any lighting at all. The GPU had pretty solid fixed-function T&L stuff, except for the aforementioned lack of skinning/vertex shader support.

    Oh, I'd expect rendering to be completely different in every single way for a wuu title compared to previous Nintendo games. Feature-wise, the wuu is about four GPU generations ahead of the Wii; the one thing they have in common is that they both rasterize textured polygons, and that's it... :razz:

    Of course, how wonky the wuu GPU may be under the hood depends on how much of the Wii backwards compatibility is 'visible' when the thing is running in wuu mode. It may be that, functionally, it's pretty much a standard Radeon 4000 series with eDRAM tacked on when running in wuu mode (and devs probably don't bang the hardware directly anyway, but rather access some OGL variant).
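
    For a concrete picture of the CPU-side skinning work being described, here is a minimal sketch of two-bone linear-blend skinning, the sort of job a fixed-function T&L GPU cannot do itself. The names and data layout are illustrative only, not the actual GameCube SDK:

    [code]
    #include <cstddef>

    struct Vec3  { float x, y, z; };
    struct Mat34 { float m[3][4]; };  // 3x4 bone transform: rotation + translation

    // Apply a 3x4 affine transform to a position (implicit w = 1).
    static Vec3 transform(const Mat34& m, const Vec3& v) {
        return { m.m[0][0]*v.x + m.m[0][1]*v.y + m.m[0][2]*v.z + m.m[0][3],
                 m.m[1][0]*v.x + m.m[1][1]*v.y + m.m[1][2]*v.z + m.m[1][3],
                 m.m[2][0]*v.x + m.m[2][1]*v.y + m.m[2][2]*v.z + m.m[2][3] };
    }

    // Linear-blend ("matrix palette") skinning on the CPU: each vertex is
    // transformed by its two bone matrices and the results blended by
    // weight, after which the skinned positions go to the GPU's T&L unit.
    void skin_vertices(const Vec3* bind_pose, Vec3* out, std::size_t count,
                       const Mat34* bones, const int (*bone_idx)[2],
                       const float (*weights)[2]) {
        for (std::size_t i = 0; i < count; ++i) {
            const Vec3 a = transform(bones[bone_idx[i][0]], bind_pose[i]);
            const Vec3 b = transform(bones[bone_idx[i][1]], bind_pose[i]);
            const float w = weights[i][0];  // the two weights sum to 1
            out[i] = { a.x*w + b.x*(1.0f - w),
                       a.y*w + b.y*(1.0f - w),
                       a.z*w + b.z*(1.0f - w) };
        }
    }
    [/code]

    A call site would pass the bind-pose vertex array, per-vertex bone indices and weights, and the current frame's bone matrices, once per skinned character per frame.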
     
  7. creaks

    Newcomer

    Joined:
    Apr 9, 2013
    Messages:
    81
    Likes Received:
    0
    Sorry, I was poorly referencing, in passing, an interview with Factor 5 that said they used Gekko for lighting beyond Flipper's 8 hardware lights.

    I would dig it up, but I'm lazy and it's rather irrelevant for Wuu.

    I guess what I'm really wondering is: is Latte handling the majority of geometry in the Wuu games out right now?

    And if so, how much could the tri-core Espresso add to the equation, and at what expense to other aspects?
     
    #5807 creaks, Feb 28, 2014
    Last edited by a moderator: Feb 28, 2014
  8. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,567
    Likes Received:
    652
    Location:
    WI, USA
    Luigi's Mansion uses the CPU for lighting something. The flashlight, maybe? I was at an IBM presentation long ago where they were talking GameCube, and I think they mentioned that.
     
  9. Goodtwin

    Veteran Newcomer Subscriber

    Joined:
    Dec 23, 2013
    Messages:
    1,145
    Likes Received:
    610
  10. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,174
    Location:
    La-la land
    F5 interviews were regurgitated ad nauseam on this and undoubtedly other forums at the time, but I must say I don't recall ever seeing that claim before. Maybe I did and just forgot; it's possible. I just can't recall it. Do you have an idea which game of theirs they were talking about? The later Rogue Squadron games pushed the system progressively harder, technically anyway, if not so much in gameplay. :razz:

    I'd be surprised if any geometry stuff at all is done on the CPU in wuu. The GPU should be vastly faster, and more than capable enough feature-wise, for that kind of work.

    Doing geometry on the CPU tended to kill the CPU pretty hard in the past; before T&L GPUs arrived, games were often bottlenecked there. And while the wuu CPU is much faster than the PC chips of that era, geometry would probably be a much heavier burden for the wuu, proportionally speaking, seeing as today's games are vastly more geometry-heavy. Hell, you can have as many polys in one character model today as in an entire game level from the GLQuake generation... ;)
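
    To put a rough number on that proportional burden, a back-of-envelope calculation; every figure below is an illustrative guess, not a measured value, with 1.24 GHz used as the commonly cited Espresso clock:

    [code]
    #include <cstdio>

    // Back-of-envelope: what fraction of a 60 fps frame budget would CPU
    // vertex transforms eat? All numbers are illustrative guesses, not
    // measured figures for any real game.
    int main() {
        const double verts_per_frame = 500000.0;  // a modern-ish scene
        const double cycles_per_vert = 60.0;      // rough cost of a transform
        const double cpu_hz          = 1.24e9;    // cited Espresso clock, one core
        const double frame_budget_s  = 1.0 / 60.0;

        const double seconds = verts_per_frame * cycles_per_vert / cpu_hz;
        std::printf("transform time: %.1f ms (%.0f%% of a 60 fps frame)\n",
                    seconds * 1e3, 100.0 * seconds / frame_budget_s);
    }
    [/code]

    Even with generous assumptions this comes out around 24 ms, more than a full 60 fps frame budget on one core spent just transforming vertices, which is the point: the GPU should keep that work.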
     
  11. creaks

    Newcomer

    Joined:
    Apr 9, 2013
    Messages:
    81
    Likes Received:
    0
    Thanks for the insight. Sounds good to me. I didn't expect the CPU to take over geometry; I just feel a Wii game or two's worth of extra geometry, as icing on top of the GPU's cake, could make an interesting difference.
     
  12. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    I've thought the eDRAM is especially there for running in Wii compatibility mode. See, the Wii has 3 MB of [strike]eDRAM[/strike] embedded 1T-SRAM, and 24 MB of 1T-SRAM, which is something of a special memory, and then GDDR3 as the "slow" memory.

    The eDRAM is perfect for housing the 1T-SRAM's contents; if it didn't, there would be latency issues, and those would mess up the old games or their timing.
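
    Summarizing the pools in question, with the speculated backing expressed as a table in code form; the Wii-side sizes are the documented ones, but the right-hand mapping is this thread's conjecture, not a confirmed fact:

    [code]
    #include <cstdio>

    // Wii memory pools and one speculative guess at their Wii U backing
    // in compatibility mode. Sizes are the documented Wii figures; the
    // backing column is conjecture from this discussion.
    struct PoolMapping {
        const char* wii_pool;     // memory as Wii software sees it
        int         size_mb;
        const char* wuu_backing;  // speculated Wii U resource behind it
    };

    static const PoolMapping kWiiPools[] = {
        { "embedded 1T-SRAM (framebuffer + texture cache)", 3,
          "small embedded pools on Latte" },
        { "1T-SRAM main memory (MEM1)", 24, "32 MB eDRAM on Latte" },
        { "GDDR3 'slow' memory (MEM2)", 64, "DDR3 main memory" },
    };

    int main() {
        for (const PoolMapping& p : kWiiPools)
            std::printf("%-48s %3d MB -> %s\n",
                        p.wii_pool, p.size_mb, p.wuu_backing);
    }
    [/code]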
     
  13. creaks

    Newcomer

    Joined:
    Apr 9, 2013
    Messages:
    81
    Likes Received:
    0
    Yeah, the Wii U has several pools of embedded RAM, actually directly above the eDRAM, each one smaller and faster than the last. The smaller ones likely stand in for the small eDRAM and embedded 1T-SRAM, while the 32 MB "L2" acts as the 24 MB of 1T-SRAM, and the DDR3 covers the GDDR3 in the Wii.

    Speaking of the 32 MB of eDRAM, I found a picture big enough to see the pins.

    http://www.joesiv.net/fourthstorm/WiiU-GPU_enhanced_blocks.jpg

    So, I see 16 pins per cell on that eDRAM (you only count one side, right, not both top and bottom pins?), 8 cells form 1 main block, and 8 blocks lead to a total of 1024 pins.

    Yay? Nay?

    Er... I guess that's about 8x less than the 8192 figure I just saw tagged at the bottom of the page. (Am I off by 8-fold? I really don't see how that could be practical. Was this already found out?)
     
    #5813 creaks, Mar 1, 2014
    Last edited by a moderator: Mar 1, 2014
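
    For reference, the arithmetic that pin-count estimate implies, assuming one transfer per clock (no DDR) at Latte's commonly cited ~550 MHz; the clock is an outside assumption, since nothing in the die photo gives it:

    [code]
    #include <cstdio>

    // Sanity check of the pin-count estimate: 16 pins per cell x 8 cells
    // per block x 8 blocks = a 1024-bit interface. At ~550 MHz with one
    // transfer per clock, that works out to roughly 70 GB/s.
    int main() {
        const int    bus_bits = 16 * 8 * 8;  // = 1024
        const double clock_hz = 549.9e6;     // commonly cited Latte clock
        const double bytes_per_sec = (bus_bits / 8.0) * clock_hz;
        std::printf("%d-bit bus at %.1f MHz -> %.1f GB/s\n",
                    bus_bits, clock_hz / 1e6, bytes_per_sec / 1e9);
    }
    [/code]
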
  14. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    I don't know.
    This image is freaking huge and I'm watching it load slowly :lol:
    I would prefer you put it as a link.
     
  15. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    42,985
    Likes Received:
    15,120
    Location:
    Under my bridge
    Yeah. A 16 MB, 27 megapixel image is a pretty crazy embed. I'll refactor to a simple link.
     
  16. creaks

    Newcomer

    Joined:
    Apr 9, 2013
    Messages:
    81
    Likes Received:
    0
    Sorry XD. Thanks for the save, Shifty. So, eDRAM bandwidth looks to be around 70 GB/s?
     
    #5816 creaks, Mar 1, 2014
    Last edited by a moderator: Mar 1, 2014
  17. Goodtwin

    Veteran Newcomer Subscriber

    Joined:
    Dec 23, 2013
    Messages:
    1,145
    Likes Received:
    610
    Probably a stupid question, but there are 16 pins on each side of the eDRAM cell, so is that still just 16 pins? And 70 GB/s seems pretty reasonable for the requirements of the console. Even if it were 140 GB/s, would that even make a difference on a GPU so modest in performance?
     
  18. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,174
    Location:
    La-la land
    You can't look at a photo of an acid-etched die and figure out the bandwidth of internal components of an integrated chip. Doesn't work like that.
     
  19. Goodtwin

    Veteran Newcomer Subscriber

    Joined:
    Dec 23, 2013
    Messages:
    1,145
    Likes Received:
    610
    Can you explain why that doesn't work?
     
  20. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,174
    Location:
    La-la land
    Because the features of the chip that dictate its performance are simply way, way too small and complex for the human eye to see (for starters)...? :)
     