The nvidia future architecture thread (G100/GT300 and such)

Discussion in 'Architecture and Products' started by CarstenS, Jul 14, 2008.

  1. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,873
    Likes Received:
    767
    Location:
    London
    Wow, that's a really nice set of data.

    The ring bus patent document:

    http://forum.beyond3d.com/showpost.php?p=1165766&postcount=1947

    says that in an embodiment the ring bus runs at core clock, not memory clock, so perhaps that's it. 500MHz is enough for 76.8GB/s, and it seems 750MHz (50% faster) is enough for 115.2GB/s (50% faster) :razz:
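
    As a quick sanity check on those figures (my own back-of-the-envelope; the bus configurations are assumptions for illustration, not from the patent), peak bandwidth is just bus width times effective data rate:

        # Peak DRAM bandwidth = bus width in bytes * effective data rate.
        def bandwidth_gbs(bus_width_bits, data_rate_gtps):
            """Peak bandwidth in GB/s for the given width and transfer rate."""
            return bus_width_bits / 8 * data_rate_gtps

        # The two figures above, assuming a 256-bit external bus:
        print(bandwidth_gbs(256, 2.4))  # 76.8 GB/s
        print(bandwidth_gbs(256, 3.6))  # 115.2 GB/s -- 50% more, matching the 50% faster clock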

    At 1206MHz memory, RV670's performance is 55% of R600's. This rises to 64% at 297MHz memory versus R600's 153MHz. A clear demonstration of the narrower bus being more efficient?

    Jawed
     
  2. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,421
    Likes Received:
    180
    Location:
    Chania
    It depends on whether the codename represents a technology generation or a specific timeframe for release. If it's the former, then GT200 is the "G200" found on older roadmaps; if it's the latter, then it is truly "G100".

    On quite old roadmaps, G100 used to stand for a D3D11 chip, just as on ATI's old roadmaps that slot was occupied by the "R700".

    Lord knows how old those roadmaps were, and I doubt either IHV could have known at that stage what manufacturing process delays might have occurred or what D3D11 would end up looking like. It wasn't too long ago that ATI folks were "certain" the Xenos/R6x0 tessellator would be sufficient for D3D11 compliance. The recently revealed slides about D3D11 point in a different direction, though.

    Sorry for the OT.
     
  3. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    1,340
    Likes Received:
    66
    Location:
    msk.ru/spb.ru
    I heard that story (from you =)), but the truth is that GT200 has the G100 ID in the current drivers, and always has had. So it's G100 even if it's G200 and GT200 at the same time. NV's codenames are a mess right now.
     
  4. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,873
    Likes Received:
    767
    Location:
    London
    :lol: I was thinking that they are merely meant to confuse outsiders.

    Jawed
     
  5. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    1,340
    Likes Received:
    66
    Location:
    msk.ru/spb.ru
    I'm not sure that's the case. No one seriously judges future chips by their codenames.
    It's probably another "end of available digits" thing.
    When they made the NV4x -> G7x switch they were hitting the limits of NV4x (you can't fit more than 10 chips in the NV4x line, and there were more than 10 chips in NV4x and G7x combined).
    Now they're trying to avoid that situation in the future: under a Gxyz label you can have a hundred chips, where x is the architecture generation, y the current line-up, and z a chip's position within the line (see the sketch below).
    Plus some marketing, of course, since the new market branding will probably sit somewhat closer to the codenames (GT200 -> GTX 200, GT300 -> GTX 300? etc.)
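
    Purely to illustrate that reading of the scheme (the field split is my assumption, nothing official), a toy decoder:

        # Toy Gxyz decoder -- the digit meanings follow the reading above.
        def parse_codename(name):
            prefix = name.rstrip("0123456789")   # e.g. 'GT'
            x, y, z = name[len(prefix):]         # e.g. '2', '0', '0'
            return {"generation": x, "line_up": y, "position": z}

        print(parse_codename("GT200"))  # {'generation': '2', 'line_up': '0', 'position': '0'}
        print(parse_codename("GT212"))  # {'generation': '2', 'line_up': '1', 'position': '2'}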
     
  6. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,421
    Likes Received:
    180
    Location:
    Chania
    Just because their codenames are a mess (and always have been, for that matter) doesn't mean that years ago they hadn't planned something then named "G100" for D3D11. Roadmaps change on many levels, and you know very well that beyond a year out they're merely optimistic estimates and nothing more. In any case, and this is the last OT on the matter, the story about the G8x/G2x/G1x timeline came years ago from an ATI employee.

    Not that it really matters, but RV770 could easily have an "R700" driver code tag too, while in reality the real "R700" would rather carry the official RV870 codename.
     
  7. maniac

    Newcomer

    Joined:
    May 28, 2004
    Messages:
    18
    Likes Received:
    0
    :lol:
    let's hope they don't end up confusing themselves
     
  8. Cookie Monster

    Newcomer

    Joined:
    Sep 12, 2008
    Messages:
    167
    Likes Received:
    8
    Location:
    Down Under
    Link

    So rumours are suggesting that GT212 is a 384 SP 40nm chip with a 512-bit memory controller + GDDR5. GT200 on steroids?
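
    For scale: if that 512-bit bus were paired with, say, 4.0 Gbps GDDR5 (the data rate is my assumption, not part of the rumour), that would be 512/8 * 4.0 = 256 GB/s, against the GTX 280's 141.7 GB/s.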
     
  9. fellix

    fellix Hey, You!
    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,494
    Likes Received:
    405
    Location:
    Varna, Bulgaria
    That would be 16*24 or 12*32 SPs for the TPC configuration. ;)
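
    For what it's worth, the splits that fit are simple arithmetic (assuming, as on GT200, that total SPs = TPCs * SPs per TPC; this is just divisor hunting, not leaked specs):

        # Which TPC counts divide 384 SPs evenly? (Plausible range only.)
        total_sps = 384
        for tpcs in range(8, 25):
            if total_sps % tpcs == 0:
                print(tpcs, "TPCs x", total_sps // tpcs, "SPs/TPC")
        # -> 8x48, 12x32, 16x24, 24x16; the middle two are the ones noted above.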
     
  10. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,807
    Likes Received:
    2,073
    Location:
    Germany
    What for?
     
  11. Panajev2001a

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,187
    Likes Received:
    8
    To take over the world CarstenS, as they do every product cycle...
     
  12. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    1,340
    Likes Received:
    66
    Location:
    msk.ru/spb.ru
    I'm not really sure it's physically possible to have a 512-bit bus on a 40nm GT2xx GPU with 384 SPs. Probably just another more or less baseless rumour (although AFAIK the original G100 project was supposed to have 384 SPs on 55nm, so there may be some ground to all these 384 SP rumours).
     
  13. Unknown Soldier

    Veteran

    Joined:
    Jul 28, 2002
    Messages:
    2,238
    Likes Received:
    33
    Erm, GTX 350?

    2GB Memory
    830MHz Core
    55nm

    First thing I have to say: 177.51 drivers on XP? I have to call BS.

    US
     
  14. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,068
    Likes Received:
    2,396
    Going back to something CarstenS said in the first post: is 16x aniso really free on the GTX 280?
    I've only been using 8x...
     
  15. suryad

    Veteran

    Joined:
    Aug 20, 2004
    Messages:
    2,479
    Likes Received:
    16
    How is something like that free? But then again, even on my 8800 Ultras I rarely notice any sort of performance drop at that setting. I have played all games at that setting, including Crysis and Warhead.
     
  16. homerdog

    homerdog donator of the year
    Legend Veteran Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,180
    Likes Received:
    964
    Location:
    still camping with a mauler
    Maybe not absolutely 100% free, but as you note the performance hit is negligible on G80/G92-based hardware. The question is, does this hold true for GT200? The ALU:TEX ratio has increased (3:1 vs 2:1, although G80's TMUs are 1:2 address/filtering), so perhaps AF isn't as 'free' as it used to be.
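
    The arithmetic behind those ratios (unit counts are the commonly cited ones, so treat this as a rough sketch):

        # ALU:TEX ratios from commonly cited unit counts.
        g80_sps, g80_filter, g80_address = 128, 64, 32
        gt200_sps, gt200_tmus = 240, 80

        print(g80_sps / g80_filter)      # 2.0 -> G80: 2:1 ALU to filtering
        print(g80_address / g80_filter)  # 0.5 -> G80: 1:2 address to filtering
        print(gt200_sps / gt200_tmus)    # 3.0 -> GT200: 3:1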

    The same was true for my old 8800GT, but do keep in mind AF doesn't work properly in Crysis if you play at Very High settings.
     
  17. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,421
    Likes Received:
    180
    Location:
    Chania
    It also depends on what exactly one means by "free". AF normally doesn't tax bandwidth much, but rather fillrate. If you're using the default "quality" AF, the "brilinear" it uses comes virtually for free over plain bilinear.

    Here's an AF only test from Computerbase: http://www.computerbase.de/artikel/..._gtx_280_sli/7/#abschnitt_aa_und_afskalierung

    In the worst case the GTX 280 loses up to 18% in performance going from 1xAF to high-quality 16xAF, and you can also see how insignificant the difference between 8xAF and 16xAF is in those three applications.
     
  18. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,068
    Likes Received:
    2,396
    Thanks.
    PS: when they say 1xAA, do they mean 0xAA? (soz for the off-topic)
     
  19. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    3,984
    Likes Received:
    34
    Yes. The two are interchangeable.
     
  20. suryad

    Veteran

    Joined:
    Aug 20, 2004
    Messages:
    2,479
    Likes Received:
    16
    Oh snap, I didn't know that! But then again I've been tweaking my cvars a lot, so how can you tell what exactly counts as Very High settings? Ah, anyway, I'm not touching that game again until I get a pair of NVIDIA's next-gen video cards :)
     