Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by AlNets, Jul 29, 2011.

Thread Status:
Not open for further replies.
  1. AlNets

    AlNets ¯\_(ツ)_/¯
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    17,787
    Location:
    Polaris
    Hey folks, we're back again. This time we've racked our collective brains and extensive knowledge of GPU hardware to bring you a speculative piece on the GPU powering Nintendo's WiiU, based on some of the rumours and hints that have arisen before and after E3. We present several scenarios whilst trying to bring some modest reasoning to the forefront. So have a look here!

    http://www.beyond3d.com/content/articles/118/

    Big round of thanks to Farid, AlexV, WillardJuice and a pixel counter for their input and assistance for the timely release of yet another B3D article!
     
  2. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    15,594
    Location:
    Winfield, IN USA
    Good read, very timely! :D
     
  3. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran Subscriber

    Joined:
    Jan 11, 2008
    Messages:
    3,219
    Location:
    New Zealand
    Are you that pixel counter Mr Strong?
     
  4. kagemaru

    Veteran

    Joined:
    Aug 23, 2010
    Messages:
    1,350
    Location:
    Ohio
    Awesome, can't wait until I can read this. Thanks for the heads up Al.
     
  5. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran Subscriber

    Joined:
    Jan 11, 2008
    Messages:
    3,219
    Location:
    New Zealand
    On the ED-RAM question: What if the Wii U can use the CPU ED-RAM as the memory target to resolve the frame buffer to? Maybe they simply decided they didn't want to create two separate pools of that memory, and the slowdowns seen in games like Ghost Recon are because the developers still haven't decided whether or not to make use of that particular pool for rendering?
     
  6. AlNets

    AlNets ¯\_(ツ)_/¯
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    17,787
    Location:
    Polaris
    Maybe. :p

    I'm not sure how complicated such an interface would be. I mean, we've seen shader export/memexport, but for starters, I don't think the bandwidth is anything near what we'd want for framebuffer reading/writing/blending. So there'd need to be a pretty wide interface between the CPU & GPU RBEs. The problem there is that the two chips ought to be on the same package I would think.

    I suppose there's also cache locking on the CPU side to avoid resource contention or thrashing there.
     
  7. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran Subscriber

    Joined:
    Jan 11, 2008
    Messages:
    3,219
    Location:
    New Zealand
    You're such a bashful one... :oops:

    That's the 10 million dollar question, isn't it? The first fusion processor to market was an IBM POWER CPU and an AMD GPU, was it not? Do we have any reason to exclude the possibility that the chip will be yet another APU, but with even better integration than the Xbox 360 APU, which was merely a cost-saving move?
     
  8. Heinrich4

    Regular

    Joined:
    Aug 11, 2005
    Messages:
    576
    Excellent article: concise, accurate within the possibilities offered at the time, and unparalleled compared to what has been discussed in forums and on technical sites.
     
  9. ***CENSORED***

    Newcomer

    Joined:
    Jun 8, 2006
    Messages:
    1
    Decided to actually post here (I registered in 2006) to say how fascinating I found this article. Great read! :smile:
     
  10. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,295
    Location:
    /
    Great read.
     
  11. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    9,131
    Location:
    La-la land
    Nice article, but, *ahem*, maybe a bit speculative in nature... When it's so hard to find any "meat" on the WiiU, the whole article becomes little more than guesswork. Oh, it'll be an IBM CPU and AMD GPU... and then we extrapolate from there, based on the size of the enclosure shown at E3. :twisted:

    Also, two very minor nitpicks: the SNES had, afaik, a maximum of 256 on-screen colors in total, shared between backgrounds and sprites. To exceed that you needed to write new values into color registers on a per-scanline basis, but that's fakery and thus not exceeding the total of 256 unique colors.

    Also, the SuperFX was not a DSP coprocessor; it was, from what I understand, a custom RISC CPU (clocked at 12 or 21-ish MHz, with up to 512k RAM available to it, I believe). The SNES had a whole slew of in-cart coprocessors, a bunch of them third-party creations, like Capcom's C4 for example. Some were actual DSPs, like the chip powering Nintendo's Pilotwings.
     
  12. DieH@rd

    Legend Veteran

    Joined:
    Sep 20, 2006
    Messages:
    5,139
    So even in the worst case scenario, WiiU will still have more than enough power to run current console games with some better performance.

    To be honest, I'm more interested in the CPU and RAM. The GPU will be good in any case.
     
  13. AlNets

    AlNets ¯\_(ツ)_/¯
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    17,787
    Location:
    Polaris
    Yeah, it'd be virtually impossible to put in something that's worse. The question will be what they do about memory bandwidth. Even with GDDR5 on 128-bit, they'd be spending a fair bit with the fastest available chips & 2Gbit density. I'm not too sure what the availability is now, but the 6970 uses 5.6GHz. That'd be ~90GB/s on 128-bit. Most other cards use something slower. It'd be hard to expect the absolute fastest GDDR5 (in 2012) anyway when you consider supply constraints and the effect on cost.

    4 chips on 128-bit bus makes sense for GDDR5 as well...
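    [Editor's note: the ~90GB/s figure above comes from straightforward arithmetic — effective data rate per pin times bus width, divided by 8 bits per byte. A minimal Python sketch of that calculation, with the 5.6GHz (effective Gbit/s per pin) and 128-bit numbers from the post; the slower second data point is purely illustrative:]

    ```python
    def gddr5_bandwidth_gbs(effective_gbps_per_pin: float, bus_width_bits: int) -> float:
        """Peak memory bandwidth in GB/s.

        effective_gbps_per_pin: GDDR5 effective data rate per pin (Gbit/s),
        the "5.6GHz" marketing figure. bus_width_bits: total bus width.
        Divide by 8 to convert bits to bytes.
        """
        return effective_gbps_per_pin * bus_width_bits / 8

    # The 6970-class figure quoted above: 5.6 Gbit/s/pin on a 128-bit bus.
    print(gddr5_bandwidth_gbs(5.6, 128))  # 89.6 GB/s, i.e. the ~90GB/s above

    # A slower, cheaper bin for comparison (hypothetical example value).
    print(gddr5_bandwidth_gbs(4.0, 128))  # 64.0 GB/s
    ```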
     
  14. Arnage

    Newcomer

    Joined:
    Jun 27, 2007
    Messages:
    2
    Regarding this bit: Here's an interview with Reggie where he admits that all third-party footage was actually from the 360 or PS3 (around the 4:40 mark in the video). So you can't really draw any conclusions from that footage.
     
  15. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    9,131
    Location:
    La-la land
    I'm hoping for eDRAM again. It did wonders for the GC/Wii, and the 360 also of course.
     
  16. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,349
    I'm missing some discussion re: the lithographic process for a GPU that is not yet in a material state, and due to be launched Q2 2012. The article seems to assume the same 40nm process AMD first delivered the HD4770 on in spring 2009. That is not necessarily a given, and worthy of at least some mention.
     
  17. AlNets

    AlNets ¯\_(ツ)_/¯
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    17,787
    Location:
    Polaris
    um... a 360/PS3 using the WiiU controller? I was playing it on the WiiU show floor... along with every other demo they had. Reggie is referring to the demo reel during the conference itself.
     
  18. AlNets

    AlNets ¯\_(ツ)_/¯
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    17,787
    Location:
    Polaris
    Just keep in mind that launching on 28nm, a new and certainly immature process, isn't going to be cheap.

    For comparison's sake, TSMC had 65nm at the end of 2005, yet we didn't see its use in consoles or even other GPU markets until mid to late 2007. It's not like the complexity of the smaller process nodes is going to make the transition to 28nm any smoother for mass production - just take a look at how long 28nm has been delayed in the first place.
     
  19. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,349
    Fair enough - although 28nm being delayed apparently doesn't stop AMD from launching their new lineup on 28nm this year. That makes targeting 40nm for a GPU with the assumed launch date less likely, not more, since Nintendo would have had reason to assume the process would be well under control by then.
    Can't really comment on the die size vs. wafer cost tradeoff, other than that it should swing in favour of the smaller process with time.

    Of course, you may also have some undiscloseable inside info. :)
     
  20. AlNets

    AlNets ¯\_(ツ)_/¯
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    17,787
    Location:
    Polaris
    No, it won't stop AMD's PC parts, but those aren't quite as high volume as Nintendo's requirements, and the PC market has the advantage of binning and lol-prices at the high end, which Nintendo won't. It'll just be tight. Of course, with time things get better, but we're also talking about a target date of mid-2012 (or even early 2012) for manufacturing, not just the release date. Nintendo sure has cash to burn, so that's their prerogative if they really need 28nm at launch.

    I mean, it's not like 40nm is crappy at all right now, and shifting to a new process so early doesn't automatically mean good power characteristics (and thus has implications for clocks & usable chips).
     
