Ok, full interview from anonymous third party about Wii GPU.

Discussion in 'Console Technology' started by DeadlyNinja, May 9, 2007.

  1. StefanS

    StefanS meandering Velosoph
    Veteran

    Joined:
    Apr 20, 2002
    Messages:
    3,608
    Likes Received:
    75
    Location:
    Vienna
    Forget that Wiinside stuff. It's not even worth the virtual paper it's written on.
     
  2. Oblivion

    Newcomer

    Joined:
    Jul 25, 2005
    Messages:
    138
    Likes Received:
    1
    Random question, but is it possible to have a RAM expansion pack for the Wii like N64 did for certain games?
     
  3. Fafalada

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,773
    Likes Received:
    49
    I picture that the "logic" goes kinda like -
    GC CPU @ 480Mhz >>>> 733P3.
    P4 = 70% of P3, so 1GhzP4.
    * 150% ('conservative' Wii CPU upgrade estimate) * clock increase - and presto -
    Wii Cpu > 2.2Ghz P4.

    :razz:
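For the curious, the satirical chain of conversions above can be written out in full. Every factor here is the mocked blogger's assumption, not a real benchmark (a sketch in Python):

```python
# Reproducing the satirical "logic" above; every factor is the mocked
# blogger's assumption, not a measured benchmark.
gc_as_p3_mhz = 733            # claim: GC CPU ">" Xbox's 733 MHz P3
p4_per_clock = 0.70           # claim: a P4 does ~70% of a P3's work per clock
gc_as_p4_ghz = 1.0            # 733 / 0.70 ≈ 1047 MHz, rounded to "1 GHz P4"
upgrade = 1.5                 # "conservative" Wii architectural upgrade guess
clock_gain = 1.5              # Gekko ~486 MHz -> Broadway ~729 MHz
wii_as_p4_ghz = gc_as_p4_ghz * upgrade * clock_gain
print(wii_as_p4_ghz)          # 2.25, i.e. the "Wii CPU > 2.2 GHz P4" punchline
```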
     
  4. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    The information on the Wii GPU that has made it into the public domain hasn't changed one bit as far as I can see.
    * GPU clock is up by 50%
    * Die area indicates roughly a factor of three increase in logic gate budget
    (* Early Nintendo ballpark statement that the Wii would be around three times as powerful as the GC.)

    And that's it.
    A solid leak of an engineering document would be appreciated.
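Taking the two public figures at face value (Flipper's 162 MHz clock is documented, and the 50% bump gives Hollywood's 243 MHz), the naive multiplication looks like this. The 3x logic figure is the die-area estimate from the post, and treating the two gains as multiplicative is an assumption, not a measurement:

```python
# Sanity-checking the post's two public figures. Treating clock and logic
# gains as multiplicative is a naive assumption, not a measured result.
flipper_mhz = 162                           # GameCube GPU clock
hollywood_mhz = 243                         # Wii GPU clock
clock_gain = hollywood_mhz / flipper_mhz    # 1.5, i.e. "clock is up by 50%"
logic_gain = 3.0                            # rough die-area gate-budget estimate
naive_combined = clock_gain * logic_gain    # 4.5x, only if all gates parallelize
print(clock_gain, naive_combined)
```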
     
  5. Cheezdoodles

    Veteran

    Joined:
    May 24, 2006
    Messages:
    3,930
    Likes Received:
    24
    Wait, this funny "logic" implies that the P4 is worse than the P3? :shock:
     
  6. DuckThor Evil

    Legend

    Joined:
    Jul 9, 2004
    Messages:
    5,996
    Likes Received:
    1,062
    Location:
    Finland
Well, at least the early P4 models were worse clock for clock. I'm not sure that holds true for later P4 models like Northwood, though.
     
  7. Kiloran

    Newcomer

    Joined:
    Apr 24, 2007
    Messages:
    14
    Likes Received:
    0
    Ha.

I'd take that seriously maybe if the author knew how to capitalize words at a third grade level. :roll:
     
  8. rekator

    Regular

    Joined:
    Dec 21, 2006
    Messages:
    793
    Likes Received:
    30
    Location:
    France
Yes, remember the 1.5GHz P4 was slower than the 1.2GHz P3, but that was a long time ago…
And a 1.25GHz G4 could fight a 2.5GHz P4 in a lot of areas…
The Wii CPU isn't exactly equivalent to a 2.2GHz P4 all the time, but maybe in game code it is?
     
  9. bomlat

    Regular

    Joined:
    Nov 5, 2006
    Messages:
    327
    Likes Received:
    0
I checked the Rev gx.h and the Dolphin gx.h.
Both of them contain 16 stages for the TEV.
So it's not proof of anything; at most it's a ceiling on what devs can use. :)
     
  10. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    Regarding the CPU of the Wii, there is a significant part of the equation that keeps being ignored around here (in CPU performance discussions generally, actually), and that is the memory subsystem. A large part of extracting good performance from any processor lies in managing your memory.
All evidence points towards the 750CL as the CPU of the Wii although it may or may not be slightly modified. Be that as it may, given the likely base processing power of the CPU, it is backed up by a relatively spiffy memory hierarchy. Unless the memory controller is completely borked (and there is no reason it should be) the CPU should enjoy very low latency access to main memory and a high bandwidth even for fine grain accesses, making corner cases rare and lessening algorithm sensitivity to memory peculiarities. (Although I wonder a little about how the boundary between the 24 and the 64 MB partitions is handled.)
    For most codes, it would seem that the Wii should offer quite robust performance.

    It would be nice to know if the Wii CPU uses the 60x bus protocol, as it would seem a reasonable place to offer a custom solution for the Wii.

    Quick edit: Of course, the CPU is never going to be a stream computing powerhouse, nice memory hierarchy or not. But still a pretty capable little chip particularly considering that it draws a maximum of 5-6W at full blast.
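Entropy's point about fine-grained access can be put in numbers: with fully dependent loads (pointer chasing), effective bandwidth collapses to one cache line per round-trip latency, no matter what the bus peaks at. A sketch with illustrative latencies, not Wii measurements:

```python
# Why latency dominates fine-grained access: with dependent loads, effective
# bandwidth = line_size / latency, regardless of peak bus rate. The latency
# numbers below are illustrative, not Wii specifications.
def dependent_load_bw_gbs(line_bytes, latency_ns):
    """Effective GB/s for a pointer-chasing access pattern."""
    return line_bytes / latency_ns   # bytes per ns == GB/s

print(dependent_load_bw_gbs(32, 20))    # low-latency memory:  1.6 GB/s
print(dependent_load_bw_gbs(32, 100))   # high-latency memory: 0.32 GB/s
```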
     
    #30 Entropy, May 13, 2007
    Last edited by a moderator: May 13, 2007
    Shifty Geezer likes this.
  11. fearsomepirate

    fearsomepirate Dinosaur Hunter
    Veteran

    Joined:
    Sep 1, 2005
    Messages:
    2,743
    Likes Received:
    65
    Location:
    Kentucky
    Nope. There's no high-bandwidth port to plug it into.

    That is one of the worst blogs ever. Why did that even get posted? I like where he said the Gamecube devkits were less powerful than the retail hardware. Seriously, the fake Revolution insider blogs should be dead now. It's not even English. I wish this were some student of mine so I could fail him.
     
  12. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,045
    Likes Received:
    1,119
    Location:
    WI, USA
    I think it's rather humorous that we're arguing over whether Wii's CPU can keep up with a P4 2.2. That's not exactly much of a processor. Especially in 2007!

Besides, wouldn't it be more warm and fuzzy to compare it to, say, an Athlon XP 2000+? Not like these comparisons are worth jack anyway.

    I'd blame RAM amount, and the fact that Doom3's shadows are very CPU intensive. Doom3 went for more geometry work on the CPU than the video card, if I remember correctly what I've read about it. It had to work on NV17, after all.

    And let's not forget that Cube apparently wasn't capable of that at all. Doom3 uses normal mapping and that's not so possible on Flipper. Ever see what Doom3 looks like on a DX7-era card, when you can't do the lighting or normal mapping!? Ick!

    From these interviews I doubt that Wii could do that much normal mapping or per-pixel lighting, either. But who knows, I guess. Personally, I think I'll believe what these guys have said about the hardware now. It's been repeated enough from enough sources that it feels rather authentic to me.
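To make concrete the kind of per-frame CPU work stencil shadows imply, here is a minimal sketch of extruding a vertex away from a point light to build one side of a shadow volume. Real engines run this over every silhouette edge of every occluder, every frame; the function name and numbers are illustrative, not from the Doom 3 source:

```python
# Minimal sketch of the CPU-side geometry work Doom 3-style stencil shadows
# imply: push a vertex away from a point light to form a shadow-volume side.
# Names and values are illustrative, not taken from the actual engine.
def extrude_from_light(vertex, light, distance):
    d = [v - l for v, l in zip(vertex, light)]                # light -> vertex ray
    n = sum(c * c for c in d) ** 0.5                          # ray length
    return [v + distance * c / n for v, c in zip(vertex, d)]  # pushed-out vertex

print(extrude_from_light([1.0, 0.0, 0.0], [0.0, 0.0, 0.0], 10.0))  # [11.0, 0.0, 0.0]
```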
     
    #32 swaaye, May 13, 2007
    Last edited by a moderator: May 13, 2007
  13. fearsomepirate

    fearsomepirate Dinosaur Hunter
    Veteran

    Joined:
    Sep 1, 2005
    Messages:
    2,743
    Likes Received:
    65
    Location:
    Kentucky
    I'm not saying it was. I think you incorrectly read my post as comparing Gamecube and Xbox, which it wasn't. The only point I was making is that it doesn't matter what your processor can do if you can't get information to it fast enough. That's an absolute statement, and it's completely true. And yes, the XGPU was severely bottlenecked. In theory, it was more powerful than any GF3. In practice, in actual games, a GF3-equipped PC with enough CPU and bandwidth would outperform it, meaning that extra power was largely wasted.

    Anyway, I thought the Doom 3 engine only used the CPU to do its shadows if you were running a GF4 MX, which partially executed vertex shader code on the CPU. On any of the Ti class cards, it used the vertex shader hardware and ran much faster. I could be wrong, though.
     
    #33 fearsomepirate, May 13, 2007
    Last edited by a moderator: May 13, 2007
  14. Ooh-videogames

    Regular

    Joined:
    Oct 13, 2002
    Messages:
    542
    Likes Received:
    1
    Is there any reason why a dev would use less than 16?
     
  15. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,045
    Likes Received:
    1,119
    Location:
    WI, USA
    Yeah that's very true. No doubt the PC version of the architecture has a huge advantage because of its dedicated RAM. An advantage that the CPU enjoys too, I'm sure.


    Intel has an article written up by J.M.P. van Waveren of Id Software that spells out how the engine works. It's highly CPU-based for a lot of the rendering stages. He cites that basically it was a decision made to work around limits even on cards with earlier vertex shader hardware. I'd imagine if they'd targeted VS 2.0, things could've been different. Who knows....
    http://www.intel.com/cd/ids/developer/asmo-na/eng/dc/games/293451.htm

    Well, the anonymous devs did say that each stage you use costs performance.
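The cost claim has a simple fill-rate interpretation: if each active TEV stage takes a clock per pixel per pipe (a common assumption about Flipper-style hardware, not a confirmed Wii spec), fill rate divides by the stage count:

```python
# Toy fill-rate model for TEV stage cost, assuming one stage per pixel per
# pipe per clock. Pipe counts and per-stage timing are assumptions, not
# confirmed Wii specs.
def fill_rate_mpix(clock_mhz, pipes, tev_stages):
    return clock_mhz * pipes / max(1, tev_stages)

print(fill_rate_mpix(162, 4, 1))    # GC-like clock, 1 stage:  648.0 Mpix/s
print(fill_rate_mpix(162, 4, 16))   # all 16 stages:            40.5 Mpix/s
print(fill_rate_mpix(243, 4, 16))   # Wii-like clock, 16 stages: 60.75 Mpix/s
```

This is why a dev would use fewer than 16 stages: every extra stage is a straight multiplier on per-pixel cost.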
     
  16. Ooh-videogames

    Regular

    Joined:
    Oct 13, 2002
    Messages:
    542
    Likes Received:
    1
    Yeah, what would be the difference between GC and Wii, and wouldn't a faster clock and more available memory/bandwidth allow devs to come closer to that 16?
     
  17. Capeta

    Banned

    Joined:
    Dec 19, 2006
    Messages:
    817
    Likes Received:
    8
    The Wii CPU is equivalent to a 1.4GHz P3 at most...
     
  18. fearsomepirate

    fearsomepirate Dinosaur Hunter
    Veteran

    Joined:
    Sep 1, 2005
    Messages:
    2,743
    Likes Received:
    65
    Location:
    Kentucky
    I think it's equivalent to a 729 MHz PPC at most.
     
    StefanS likes this.
  19. bomlat

    Regular

    Joined:
    Nov 5, 2006
    Messages:
    327
    Likes Received:
    0
    The point is the memory subsystem of the TEV.
    But could you improve that while keeping 100% backward compatibility?
     
  20. Teasy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,563
    Likes Received:
    14
    Location:
    Newcastle
    Actually the dev (if he's a real dev) says:

    So he's saying both are 16 stages but Wii now has twice the texture pipelines.

    I suppose this does sound very plausible, given the die size differences between GC and Wii's GPUs. But who knows if it's actually true. If the person making the blog is DrTre from IGN then I'm pretty sure he's not making this up. But he could have been duped by the 'developer' answering the questions, so who knows.
     
    #40 Teasy, May 15, 2007
    Last edited by a moderator: May 16, 2007

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.