NVIDIA GT200 Rumours & Speculation Thread

Discussion in 'Architecture and Products' started by Arun, Feb 10, 2008.

Thread Status:
Not open for further replies.
  1. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
    General shading ~gaming...

    Do not forget that the GTX 280 has, in comparison to the 8800 Ultra, only:
    33% more ROPs @ ~ same clock
    25% more TMUs @ ~ same clock
    35% more BW

    And we are talking about a benchmark at 1920x1200 with 4xAA and 16xAF enabled, so this 66% more performance would not be a bad result.

    But let's see what happens in real games... :grin:
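    Those percentage deltas can be sanity-checked against the commonly cited unit counts for the two cards. A quick sketch; the unit counts and bandwidth figures below are my assumptions (the usual published specs), not from the post itself:

    ```python
    # Assumed specs: 8800 Ultra (24 ROPs, 64 TMUs, ~103.7 GB/s)
    # vs GTX 280 (32 ROPs, 80 TMUs, ~141.7 GB/s).
    ultra  = {"rops": 24, "tmus": 64, "bw_gb_s": 103.7}
    gtx280 = {"rops": 32, "tmus": 80, "bw_gb_s": 141.7}

    for key in ultra:
        gain = (gtx280[key] / ultra[key] - 1) * 100
        print(f"{key}: +{gain:.0f}%")
    ```

    The ROP and TMU deltas come out at +33% and +25% as quoted; the bandwidth delta lands a touch above the quoted 35%, depending on which memory clocks you assume.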
     
  2. igg

    igg
    Newcomer

    Joined:
    May 16, 2008
    Messages:
    63
    Likes Received:
    0
    I currently own a GF6800GT and I wonder whether it's worth waiting for the GT200b or just getting a GT200. If it's just a simple die shrink, the difference should be marginal (except for Nvidia's profit).
     
  3. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,716
    Likes Received:
    2,137
    Location:
    London
    Damn if it works that way it puts the cat amongst the pigeons. My mind boggles :razz: but I can't entirely rule it out...

    Jawed
     
  4. Voxilla

    Regular

    Joined:
    Jun 23, 2007
    Messages:
    832
    Likes Received:
    505
    I'm claiming the prize for best guess of transistor count. :smile:

    The 1.4B could refer to all transistors, including the unused redundant ones.
    Quote:
    Originally Posted by Voxilla
    My guess is we will see another monster GPU for the 9800GTX:

    55 nm
    1.2 B transistors

    256 SP
    64 trilinear texture units (64 TA 128 TF)
    2 GHz shader
    750 MHZ core
    512 bit bus, 150 GB/s
    1 GB

    So basically twice a G80, 3x shader speed at 1.5 TFlop
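    The 1.5 TFLOP figure follows from the usual marketing arithmetic for this generation: scalar SPs rated at 3 FLOPs per clock (MADD + MUL dual issue). A minimal sketch; the 3-FLOPs-per-clock rating and the effective memory data rate are assumptions, not figures from the post:

    ```python
    def peak_gflops(num_sps, shader_clock_ghz, flops_per_clock=3):
        """Theoretical peak shader throughput in GFLOPS
        (SP count x shader clock x FLOPs issued per clock)."""
        return num_sps * shader_clock_ghz * flops_per_clock

    def bandwidth_gb_s(bus_width_bits, data_rate_gt_s):
        """Theoretical memory bandwidth in GB/s
        (bus width in bytes x effective transfer rate)."""
        return bus_width_bits / 8 * data_rate_gt_s

    print(peak_gflops(256, 2.0))       # 256 SPs at 2 GHz -> 1536 GFLOPS, ~1.5 TFLOPs
    print(bandwidth_gb_s(512, 2.34))   # a 512-bit bus needs ~2.34 GT/s for ~150 GB/s
    ```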
     
  5. leoneazzurro

    Regular

    Joined:
    Nov 3, 2005
    Messages:
    518
    Likes Received:
    25
    Location:
    Rome, Italy
    Maybe, maybe not. We cannot state it now, AFAIK, as there are other factors to be counted in. For instance, if G92 is bandwidth limited, GT200 will be the same: even if it had more than 2x the power of G92, it has "only" two times the bandwidth. Of course, if this is the case, it will fare more than 2x higher only when the 512 MB framebuffer of most of the G92s sold is exhausted and the 896-1024 MB framebuffer is not. The only exception is if Nvidia has devised some new bandwidth-saving technique we are not yet aware of.
     
  6. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
    The only thing you got right was the bus width and the memory amount. Considering the number of mistakes, from the name to the manufacturing process and the rather long list of the rest, I wouldn't be too proud of those "laurels".
     
  7. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
  8. ZerazaX

    Regular

    Joined:
    Oct 29, 2007
    Messages:
    280
    Likes Received:
    0
    If true, then that might be a sign that ATI's strategy worked this round: they realized a monolithic GPU might just not be worth it if yields are truly 40%.
     
  9. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    7,610
    Likes Received:
    825
    R700 as in the speculated dual setup, not R700 as in the architecture.
     
  10. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
  11. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
    Boomerang!
     
  12. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
    Still, if GT200 is truly such a massive, power-hungry beast, one wonders if they can pull an ATI and do something similar to the R600 -> RV670 conversion. Somehow I doubt it, as part of the problem with R600 was a very leaky process, which I don't think GT200 has to contend with.

    If not, then it's quite possible that Nvidia might be leaning towards abandoning monolithic designs. After all, things are only going to keep getting larger with the need to add features for a future DX11.

    They already couldn't find enough transistor budget to add both FP64 support and DX10.1 support in the current chip. I'd hate to see how monstrous this might become with DX11 in the mix at some point in the future.

    However, with the resources available to NV, I'd surely expect they are already doing as much R&D on multi-GPU as ATI is. At least I would hope so.

    Regards,
    SB
     
  13. nAo

    nAo Nutella Nutellae
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,400
    Likes Received:
    440
    Location:
    San Francisco
    DX10.1 cost is probably very small (area wise).
     
  14. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    7,610
    Likes Received:
    825
    Hell, it might even be able to do it ... actually supporting it in the drivers, though, is clearly a bad idea for them for the moment.
     
  15. Twinkie

    Regular

    Joined:
    Oct 22, 2006
    Messages:
    386
    Likes Received:
    5
    I'm still baffled by nVIDIA's decision to stick with DX10 instead of DX10.1. They've added roughly 50% more shaders, along with doubling the ROPs (compared to G92), doubling the memory interface and so on, but at the end of the day they decided to leave DX10.1 out.

    Does a GPU require a lot of changes/tweaks to the pre-existing DX10 hardware to satisfy the requirements of DX10.1?
     
  16. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    7,610
    Likes Received:
    825
    Supporting DX10.1 only in their high-end range puts weight behind the whole rest of the range from their competitor ... it's a net loss. Until they are ready to support it at least in the mid range, they won't support it at all, IMO.
     
  17. apoppin

    Regular

    Joined:
    Feb 12, 2006
    Messages:
    255
    Likes Received:
    0
    Location:
    Hi Desert SoCal
    It still does not explain Nvidia's decision not to support DX10.1 at all in Tesla.
    - or do you think they will be able to add support in their refresh and shrink?
    - Do they just not care at all that AMD will tout it and even flaunt it if they can? We are talking about at least 18 months of us doing without, while every benchmarker and HW site mentions it.
     
  18. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
    Don't they already meet some of the 10.1 spec? At least that's what I remember one of Nvidia's execs saying.
     
  19. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
    If the rumour that they would have to revamp the G8x TMUs is true, then there might not be a significant cost in area, yet quite a bit in added R&D resources?

    Allowing the fact to leak out that they've just taped out a 55nm whatever GT200 variant isn't a net win exactly either. I realize it's not related, yet under that reasoning they shouldn't have combined HDR+MSAA since G80 either.

    The real question is if there are any games under development with deferred shading/shadowing with a 10.1 path for MSAA.
     
  20. Voxilla

    Regular

    Joined:
    Jun 23, 2007
    Messages:
    832
    Likes Received:
    505
    The 1.2B transistor figure is what Nvidia talked about yesterday.

    http://anandtech.com/weblog/showpost.aspx?i=453
    See the text beside the die picture of the G100.

    Were it not for spoiling the R770 launch, the 65nm version would not be released.
