Wii U hardware discussion and investigation *rename

Discussion in 'Console Technology' started by TheAlSpark, Jul 29, 2011.

Thread Status:
Not open for further replies.
  1. bomlat

    Regular

    Joined:
    Nov 5, 2006
    Messages:
    327
    Likes Received:
    0
    The eDRAM can't be 729 MHz; that is too fast. 600 MHz at most would be reasonable.

    The aligned clock rates can decrease the latency.
     
  2. bomlat

    Regular

    Joined:
    Nov 5, 2006
    Messages:
    327
    Likes Received:
    0
    That CPU identifier is a Nintendo one; in the Wii it was an IBM one.
    The package in the Wii U contains three different dies, from different vendors.
     
  3. stiftl

    Newcomer

    Joined:
    Jun 24, 2006
    Messages:
    118
    Likes Received:
    11
    IIRC (I think it was in an Iwata Asks years ago) they do it for synchronization between components, which works better with fixed multipliers. I'm no tech guy, so I can't verify this.
     
  4. TheLump

    Regular

    Joined:
    Jul 13, 2012
    Messages:
    280
    Likes Received:
    9
    Without really fully understanding the ins and outs of your very good post, I thought this last bit was an important point. To say Nintendo are incompetent or wasteful in their design is wholly inaccurate. They have, in my view, set out to produce a system which can a) receive ports from rival consoles b) be backwards compatible with Wii c) support a modern feature set d) incorporate their Gamepad tech e) do all that with low power draw. And they've made a seemingly efficient and elegant solution to that design brief.

    Whether it's the right route to go for their next system is another debate. But it's certainly not a poorly designed system just because it's not pushing the performance envelope.
     
  5. Gerry

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    803
    Likes Received:
    170
    Developers have it wrong - the WiiU is powerful

    OK, I just read through this article, and a bunch of the comments on top of that. I can't see why I should suffer alone so I dare some of you to read it as well. I warn you, you may feel your brain cells dying as you read through it.
     
  6. Strange

    Veteran

    Joined:
    May 16, 2007
    Messages:
    1,698
    Likes Received:
    428
    Location:
    Somewhere out there
    I read the title about 12 hours ago and was thinking to link it here for comments.
    I got 50% of the way through and concluded it was a trash read, not worth posting here, as it wouldn't get anywhere with all the biased optimism and sheer amount of bad logic.

    ME3 and Batman are having framerate issues because of the GPGPU. What?
    The fact that these games were already released on consoles that don't even have one shows they work perfectly fine without it, or simply aren't utilizing it.
    @davros: you probably incur framerate drops for anything if you enable it. Pegging the GPGPU as the culprit for the bad fps is another thing.
     
    #3426 Strange, Nov 24, 2012
    Last edited by a moderator: Nov 25, 2012
  7. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    17,879
    Likes Received:
    5,331
    Well, I read it and I am now enlightened.
    "but it’s unlikely the PS4 and 720 will feature GPU’s that are any beefier than the Wii U"
    @Strange: Batman: Arkham City did suffer framerate drops when enabling PhysX. It's funny that the writer understands this and then later forgets and takes the opposite stance.
     
  8. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,511
    Likes Received:
    24,411
    Seriously what did you expect from ZeldaInformer website? If that doesn't spell Nintendo Fanboys I don't know what else would.
     
  9. bomlat

    Regular

    Joined:
    Nov 5, 2006
    Messages:
    327
    Likes Received:
    0
    The Wii U is bandwidth limited, and if general-purpose processing on the GPU means more data from main memory, then the speed will fall.

    As simple as that.
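    A rough sketch of that argument in numbers. The figures here are illustrative assumptions, not confirmed specs: ~12.8 GB/s is the commonly reported (unconfirmed at the time) Wii U main-memory bandwidth, and the 80% rendering share is invented for the example.

    ```python
    # Back-of-envelope: how much main-memory bandwidth is left for GPGPU work.
    MAIN_MEM_BW_GBPS = 12.8            # assumed DDR3 peak bandwidth (unconfirmed)
    FPS = 60
    frame_budget_gb = MAIN_MEM_BW_GBPS / FPS   # data movable per frame, in GB

    # Suppose rendering traffic (textures, geometry, CPU code/data) already
    # consumes 80% of that budget; GPGPU jobs streaming from main memory
    # must fit in the remainder.
    render_share = 0.80
    gpgpu_budget_mb = frame_budget_gb * (1 - render_share) * 1024

    print(f"Per-frame budget: {frame_budget_gb * 1024:.1f} MB")
    print(f"Left for GPGPU streaming: {gpgpu_budget_mb:.1f} MB/frame")
    ```

    Under these assumptions only a few tens of MB per frame are free, which is the sense in which extra GPGPU traffic to main memory would eat into rendering.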
     
  10. kots

    Regular

    Joined:
    Oct 30, 2008
    Messages:
    394
    Likes Received:
    0
    The best part is when he's giving advice to DICE :lol:
     
  11. jlippo

    Veteran

    Joined:
    Oct 7, 2004
    Messages:
    1,744
    Likes Received:
    1,090
    Location:
    Finland
    So, what are the chances that the CPU and GPU have a reasonably fast bus between them, and that the eDRAM on the GPU can be properly accessed by the CPU?
    That would certainly allow some interesting possibilities.
     
  12. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    According to Nintendo, the CPU and GPU do have fast communication between them. I think that is probably true, both because they have no particular reason to bring it up if it isn't the case, and because it isn't really a challenge to have on package signalling that is very fast in relation to the relatively modest processing power.

    While it hasn't been officially confirmed that the eDRAM on the GPU is accessible to the CPU, it seems likely. It would definitely help avoid any issues in emulation of the Wii accessing its 24MB of 1T-SRAM, and the CPU has its memory requests handled through the GPU anyway, so barring any evidence to the contrary it seems like a reasonable assumption.
     
  13. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    I've read some people on NeoGAF saying things like "when developers build 'gpu centric' games, Wii U will show its next-gen face"...

    What does "gpu centric" mean? Can the devs stop using the CPU and begin to use the GPU for general-purpose code?

    All this sounds like Nintendo fan talk (IMO).
     
  14. almighty

    Banned

    Joined:
    Dec 17, 2006
    Messages:
    2,469
    Likes Received:
    5
    They basically mean that when developers start making games for the Wii U hardware, the graphics will improve; at the moment it's only really running ports, which were not designed for the Wii U's architecture or memory layout.

    From what I've seen, some ports on Wii U have better graphics in places and others show worse graphics. But the most obvious thing seems to be reduced shadow resolution, as shown in COD and Mass Effect.

    However, if reduced shadows are all that's really suffering in ports, then I don't think the bandwidth problem is as bad as people think.
     
  15. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    No port shows "better" graphics than the current-gen version; even BLOPS2 is sub-HD with the same visuals as the 360 version but worse performance.

    And this is what I've read:

     
  16. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    17,879
    Likes Received:
    5,331
    Well, sort of, in a limited sense. If you're aware of what's been going on in the PC space, we have a lot of games doing physics on the GPU instead of the CPU:
    maths-based stuff that is parallelizable, like particles (smoke etc.), object destruction and waves.

    Here's an example:
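    A minimal sketch of that data-parallel pattern: the same independent arithmetic applied to every element, which is exactly the shape of work a GPGPU kernel handles well. NumPy on the CPU stands in for the GPU here; the function and figures are made up for illustration, not Wii U code.

    ```python
    import numpy as np

    def step_particles(pos, vel, dt=1.0 / 60.0, gravity=-9.81):
        """Advance every particle one timestep with identical, branch-free math."""
        vel = vel + np.array([0.0, gravity, 0.0]) * dt  # integrate acceleration
        pos = pos + vel * dt                            # integrate velocity
        # Bounce off the ground plane (y = 0), damping the vertical velocity.
        below = pos[:, 1] < 0.0
        pos[below, 1] = 0.0
        vel[below, 1] *= -0.5
        return pos, vel

    # 100k particles updated in one vectorized pass; on a GPU each particle
    # would map to one thread, which is why this kind of work offloads well.
    pos = np.random.rand(100_000, 3).astype(np.float32)
    vel = np.zeros_like(pos)
    pos, vel = step_particles(pos, vel)
    ```

    Smoke, debris and wave simulations are all variations of this loop; the point is that each element is independent, so the work parallelizes across hundreds of GPU threads.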
     
  17. ERP

    ERP
    Veteran

    Joined:
    Feb 11, 2002
    Messages:
    3,669
    Likes Received:
    49
    Location:
    Redmond, WA
    Sounds like a lot of wishful thinking to me.
    If it can't run games at higher resolution now, I doubt it ever will; resolution is about the easiest thing in the world to change.
    It also isn't impacted by CPU performance.

    Could ports be better on WiiU?
    Probably, but I don't know how much better.

    Can WiiU compete with 720/PS4 if they constitute a significant performance improvement? That I very much doubt.

    Will WiiU be the lead platform for next gen games? I just don't see it.
     
  18. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    But we are talking about using the Wii U GPU for CPU tasks. I know your example (dual GPUs on PC, one for physics and so on), but a good GPU for graphics + a second GPU for GPGPU + a good CPU (PC) is not the same as a poor CPU + an (unknown) GPU doing both graphics and GPGPU.

    I don't know if I am explaining my point.
     
  19. Strange

    Veteran

    Joined:
    May 16, 2007
    Messages:
    1,698
    Likes Received:
    428
    Location:
    Somewhere out there
    PhysX can be used with a single GPU.
    This demo is essentially demonstrating what GPGPU is best used for.

    I believe many people have stated that trying to get GPGPU to take over other CPU tasks is a big no-no.
     
  20. XpiderMX

    Veteran

    Joined:
    Mar 14, 2012
    Messages:
    1,768
    Likes Received:
    0
    What is the performance using one GPU compared to dual GPUs?

    It's not a big no-no; I mean it's not "magic". If the Xbox 720 and/or PS4 use an APU (for physics and GPGPU work) and a discrete GPU, plus a better CPU than the Wii U's, how can GPGPU do magic for the Wii U?

    And I am not saying that Wii U is less powerful than today's consoles.
     
    #3440 XpiderMX, Nov 25, 2012
    Last edited by a moderator: Nov 26, 2012