NVIDIA GT200 Rumours & Speculation Thread

Discussion in 'Architecture and Products' started by Arun, Feb 10, 2008.

Thread Status:
Not open for further replies.
  1. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    Note: If this isn't the right codename, obviously the thread title will be changed later.

    In recent months, there have been a variety of rumours about a new NVIDIA monster-chip. Fudzilla has been at the forefront of claiming that chip is called 'GT200':
    Next gen Nvidia is GT200
    GT200 won't do DirectX 10.1
    Nvidia's GT200 is a 65nm chip
    GT200 thermal design power (TDP) is 250W

    However, there are a variety of other rumours that might be referring to the same chip, such as:
    nVidia's upcoming G100 in detail (Update) [Translation]

    Discussion points in this thread may include:
    - GT200 specs/power/costs.
    - GT200 Architecture: How different is it from G9x?
    - Advantages & disadvantages of 'monster chips' (financially and otherwise).
     
  2. Domell

    Newcomer

    Joined:
    Oct 17, 2004
    Messages:
    247
    Likes Received:
    0
    Well, I really hope G100 (or GT200, whatever :) ) will be the "next G80" and bring a performance leap similar to G71-->G80 :)
     
  3. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    Does it have FP64? (And would it run at half the speed of FP32, or at something like 10%, as on Cell?)
     
  4. CJ

    CJ
    Regular

    Joined:
    Apr 28, 2004
    Messages:
    816
    Likes Received:
    40
    Location:
    MSI Europe HQ
    512-bit memory interface with GDDR3. ;)
     
  5. Domell

    Newcomer

    Joined:
    Oct 17, 2004
    Messages:
    247
    Likes Received:
    0
    No way. A 512-bit interface is a very expensive thing. Look at R600 - it doesn't give any advantage over the "only" 384-bit interface in G80 :) G100 should have more SPs rather than a 512-bit memory interface. :)
     
  6. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    Why not go 512-bit? With that transistor and power/heat budget, you're well over the top anyway.
     
  7. CJ

    CJ
    Regular

    Joined:
    Apr 28, 2004
    Messages:
    816
    Likes Received:
    40
    Location:
    MSI Europe HQ
    We'll see. ;)
     
  8. Domell

    Newcomer

    Joined:
    Oct 17, 2004
    Messages:
    247
    Likes Received:
    0
    See R600? They went to 512-bit, and then with RV670 they went back to 256-bit without any performance hit. That situation shows 512-bit doesn't give any advantage. IMO the most important things are architectural improvements/changes and shader processors. I think G100 should have at least 256 improved SPs, because many future games will supposedly need much more shader power.
    I believe GT200 (or G100) will be about 2 times faster in real-world games, because about 1.5 years without any performance boost is very annoying. We simply deserve it ;)
     
  9. Domell

    Newcomer

    Joined:
    Oct 17, 2004
    Messages:
    247
    Likes Received:
    0
    Hmm, do you know something, or are you just playing with us? ;)
     
  10. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    That's a ridiculous extrapolation and claim. R600 wasn't that fast, so it wasn't bandwidth-starved and 512-bit was massive overkill. Now, if what you want to see is 2x the performance of G80, good luck getting that with 256-bit or 384-bit GDDR3... Unless what you're thinking of is 256-bit GDDR5, in which case that's another (and much more complex) debate entirely!
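The bandwidth arithmetic behind this point can be sketched quickly. A minimal Python sketch, using illustrative figures rather than numbers from the thread (the 8800 GTX shipped with a 384-bit bus and roughly 900 MHz / 1800 MT/s GDDR3):

```python
# Back-of-the-envelope: theoretical bandwidth = bus width in bytes * transfer rate.
def bandwidth_gbps(bus_bits: int, data_rate_mtps: float) -> float:
    """Bandwidth in GB/s from bus width (bits) and data rate (MT/s)."""
    return bus_bits / 8 * data_rate_mtps / 1000

g80 = bandwidth_gbps(384, 1800)   # 384-bit @ 1800 MT/s -> 86.4 GB/s
target = 2 * g80                  # a hypothetical 2x-G80 part -> ~172.8 GB/s

# Data rate each candidate bus width would need to hit that target:
for bits in (256, 384, 512):
    needed = target * 1000 * 8 / bits
    print(f"{bits}-bit bus needs {needed:.0f} MT/s")
```

Under these assumptions, a 256-bit bus would need roughly 5400 MT/s memory (far beyond GDDR3 of the era, hence the GDDR5 caveat), while 512-bit gets there at a comfortable ~2700 MT/s.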
     
  11. nexus_alpha

    Newcomer

    Joined:
    Nov 25, 2006
    Messages:
    44
    Likes Received:
    0
    Well, he has rarely been wrong before.
     
  12. Domell

    Newcomer

    Joined:
    Oct 17, 2004
    Messages:
    247
    Likes Received:
    0
    Even when 256-bit becomes a bottleneck, 384-bit should still be enough for a GPU much faster than G80 :) Well, I could be wrong, but the GF8800GTX does great at every resolution with its 384-bit memory interface (including with AA).
     
  13. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    12,652
    Likes Received:
    8,958
    Location:
    Cleveland
    And what if you want to double the performance of the 8800GTX Ultra? Do you really think keeping the current memory throughput as is won't become a bottleneck?
     
  14. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
    In my opinion, not enough (~140 GB/s with 0.83ns GDDR3) to feed this monster. :mrgreen:

    The availability of GDDR5 (which starts at <0.5ns) could be relevant for GT200's release date, I suspect...
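For readers unfamiliar with memory speed grades: a chip's rated cycle time in ns converts to a clock, and DDR signalling doubles the transfer rate. A small sketch of that conversion (the ~140 GB/s figure above presumably assumes a particular bus width and binning, so the numbers here are illustrative):

```python
# Convert a memory chip's rated cycle time (ns) to an effective data rate (MT/s).
def ns_to_mtps(cycle_ns: float) -> float:
    """Cycle time -> clock in MHz (1000 / ns) -> DDR transfers (2 per clock)."""
    return (1000 / cycle_ns) * 2

rate = ns_to_mtps(0.83)            # ~2410 MT/s for 0.83 ns GDDR3
bw_512 = 512 / 8 * rate / 1000     # ~154 GB/s on a 512-bit bus
print(f"{rate:.0f} MT/s -> {bw_512:.0f} GB/s at 512-bit")
```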
     
  15. Domell

    Newcomer

    Joined:
    Oct 17, 2004
    Messages:
    247
    Likes Received:
    0
    What monster? :) How do you know how fast G100/GT200 will be?
     
  16. Pantagruel's Friend

    Newcomer

    Joined:
    Jun 17, 2007
    Messages:
    59
    Likes Received:
    0
    Location:
    Budapest, Hungary
    It certainly will, as e.g. the 8800GTS-512 with its 256-bit bus is more often bandwidth-limited than not. The interesting question is how nVidia will resolve the issue. I see four solutions:
    - 256-bit with insanely fast memory (and what about latency?)
    - 384-bit with GDDR4 (this probably sounds feasible)
    - 512-bit crossbar (good luck with that :eek: )
    - 512-bit with a new memory bus (which had better have been a long while in the making)

    On a different note, I think nVidia would be really stupid to release a 2Bn-transistor chip. No way can they get decent yields with that...
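The yield worry can be made concrete with the simple Poisson defect model, yield = exp(-D0 * A), where D0 is the defect density and A the die area. The defect density and die sizes below are illustrative assumptions, not figures from the thread:

```python
import math

# Simple Poisson yield model: the probability a die has zero random defects.
def poisson_yield(die_area_cm2: float, defects_per_cm2: float) -> float:
    return math.exp(-defects_per_cm2 * die_area_cm2)

d0 = 0.4  # assumed defects/cm^2 for a mature 65nm process (illustrative)
for area_mm2 in (330, 576):  # roughly a G92-sized die vs. a ~2B-transistor monster
    y = poisson_yield(area_mm2 / 100, d0)
    print(f"{area_mm2} mm^2 die -> {y:.0%} yield")
```

Under these assumptions, nearly doubling the die area drops yield from roughly a quarter of dice to about one in ten, which is why huge monolithic chips are such a financial gamble.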
     
  17. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    3,984
    Likes Received:
    34
    Man, I wish they would go back to the old NV nomenclature... It was so easy back in those days. I pretty much assume whatever NV puts out next will be "NV55".
     
  18. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    703
    Location:
    Guess...
    Hopefully it will run Crysis on very high settings. Then I can actually pick up the game!
     
  19. Megadrive1988

    Veteran

    Joined:
    May 30, 2002
    Messages:
    4,642
    Likes Received:
    155
    My thoughts exactly.


    GT200, D9E, G9X, or whatever, is NV55, a major refresh / overhaul of G80. Like what NV47 / G70 / GF 7800 GTX was to NV40 / GF 6800. Not a totally new architecture (as G80 is from NV4x/G7x) but still a new GPU. A highend GeForce 9 series.

    Nvidia's next-gen NV60 / GeForce 10 series, I don't expect to see that until late 2009, or around the time Larrabee becomes a product.
     
  20. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    703
    Location:
    Guess...
    Isn't NV60/GF10 expected in Q3 this year? That would certainly fit into the "normal" NV timescales. We get NV55/GF9 in Q1, and then 6 months later we get the "real" next gen GPU.

    Personally, I'm holding out for GF10.
     