NVIDIA GT200 Rumours & Speculation Thread

Discussion in 'Architecture and Products' started by Arun, Feb 10, 2008.

Thread Status:
Not open for further replies.
  1. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

This is why I say it's going to be a while before the G100 is out ;)
     
  2. SirPauly

    Regular

    Joined:
    Feb 16, 2002
    Messages:
    491
    Likes Received:
    14

Well, yeah... they're a lot cheaper. Especially price/performance products like the GT, especially in SLI.
     
  3. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
  4. The_Wolf_Who_Cried_Boy

    Newcomer

    Joined:
    Feb 18, 2005
    Messages:
    172
    Likes Received:
    9
    Location:
    Floating face down in the stagnant pond of life.
I thought the rule of thumb was that half nodes bring more of a financial improvement (through smaller dies) than a performance improvement for a given design?
     
  5. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
    G70@110nm = 334mm2
    7800GTX 512 (call me hard to find at launch) = 550MHz
    7800GTX 256 = 430MHz

    G71@90nm = 196mm2
    7900GTX = 650MHz
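For what it's worth, the shrink in those numbers beat ideal optical scaling. A quick sketch (my own back-of-envelope arithmetic, using only the die sizes quoted above; the "ideal" assumes area scales with the square of the feature-size ratio, which real chips rarely hit):

```python
# Compare ideal 110nm -> 90nm area scaling with the actual
# G70 -> G71 die sizes quoted in the post.
g70_area, g71_area = 334.0, 196.0   # mm^2, from the post

ideal = (90 / 110) ** 2             # ideal area ratio from feature size
actual = g71_area / g70_area        # what the shrink actually achieved

print(f"ideal shrink:  {ideal:.2f}x area")   # ~0.67x
print(f"actual shrink: {actual:.2f}x area")  # ~0.59x
```

That the actual ratio comes in below the optical ideal suggests the shrink wasn't a pure port; G71 is generally understood to have trimmed logic relative to G70 as well.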
     
  6. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
The GTX isn't really that much cheaper. But yes, overall, there is a cost saving. However, putting up with a much slower GPU for 18 months, all for the sake of $100 or so? Not my idea of a great plan.
     
    #106 pjbliverpool, Feb 28, 2008
    Last edited by a moderator: Feb 28, 2008
  7. leoneazzurro

    Regular

    Joined:
    Nov 3, 2005
    Messages:
    518
    Likes Received:
    25
    Location:
    Rome, Italy
    Hmm.. but 110 nm was the half node from 130 nm, while 90 nm is a "full node".
     
  8. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    That's not a half-node shrink. 130nm, 90nm, 65nm are the full nodes, 110nm, 80nm, 55nm are the half-nodes.
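The grouping above can be checked against ideal optical scaling. A sketch (my own arithmetic, assuming die area scales with the square of the feature size) of what a full-node jump buys versus the half node in between:

```python
# A full-node jump (e.g. 130nm -> 90nm) ideally roughly halves die area;
# the half node in between (110nm) gives a smaller, cheaper shrink on
# largely the same design rules.
full_jump = (90 / 130) ** 2    # 130nm -> 90nm, full node to full node
half_step = (110 / 130) ** 2   # 130nm -> 110nm, full node to half node

print(f"130nm -> 90nm:  {full_jump:.2f}x area")  # ~0.48x
print(f"130nm -> 110nm: {half_step:.2f}x area")  # ~0.72x
```

The same pattern repeats for 90/80/65 and 65/55: each half node is a partial step, which is why it tends to be a cost play rather than a performance play.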
     
  9. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    4,027
    Likes Received:
    90
  10. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
It's not that simple.
If the high-performance version of a given half node is required for a GPU, it may end up being more expensive than the full node it derives from.

    For instance, the TSMC 55nm half node has a bunch of options, which will determine the final price regardless of the die size of any given chip:

[image: TSMC 55nm process option table]

The 65nm full node has an even greater number of options, and it's likely also cheaper than the 55nm half node with equivalent options. So the balance between the price Nvidia or AMD must pay TSMC, yield rate and/or time to market, order volume, and overall die size is always extremely volatile:

[image: TSMC 65nm process option table]


So, I'm pretty sure the 110nm half-node version used on G70 was the high-performance one, while NV43 used a more economical version of the same half node.
     
    #110 INKster, Feb 28, 2008
    Last edited by a moderator: Feb 28, 2008
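The balance described above can be sketched as a cost-per-good-die calculation. All numbers below (wafer prices, yields, die sizes, edge-loss factor) are made up for illustration, not TSMC's or Nvidia's actual figures; the point is only the shape of the tradeoff:

```python
import math

def cost_per_good_die(wafer_cost, die_mm2, yield_rate, wafer_diam_mm=300):
    """Rough cost per working die: wafer price spread over good dies."""
    wafer_area = math.pi * (wafer_diam_mm / 2) ** 2
    # Crude gross-die estimate with ~15% lost to edges and scribe lines.
    gross_dies = int(wafer_area / die_mm2 * 0.85)
    return wafer_cost / (gross_dies * yield_rate)

# Hypothetical: the older full node is cheaper per wafer and yields better,
# but the same design needs a bigger die than on the half node.
print(f"65nm: ${cost_per_good_die(4000, 330, 0.60):.2f} per good die")
print(f"55nm: ${cost_per_good_die(5000, 280, 0.55):.2f} per good die")
```

Depending on which inputs move (wafer price, yield, achievable die size), either node can come out ahead, which is the volatility being described.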
  11. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
  12. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,535
    Likes Received:
    144
Sure, sure... that girl(?) over at AT also knows the password for Area 51 and where Jimmy Hoffa's earthly remains are located, among other things :D
     
  13. Domell

    Newcomer

    Joined:
    Oct 17, 2004
    Messages:
    247
    Likes Received:
    0
Hmm, such a high CPU score with a C2Q clocked at only 2.67GHz? Does anybody think this score could be true? ;)
     
  14. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
[image: 3DMark06 result screenshot]

My Q6600 @ 333x9 = 3000MHz scores 4700 points in the 3DMark06 CPU test using WinXP 32-bit.

I would say it's a fake score.
     
  15. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
I didn't know that GT200 is x86-capable and can boost your CPU in every app :wink::lol:

The claimed 2560x1600 can also easily be exposed by what's actually visible: 1280x1024.


But since some GT200 samples are supposedly around, just not public, I would think GPU-Z would be able to show the GPU and Device ID.
     
  16. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,535
    Likes Received:
    144
No. The CPU score is fucked up (too high): a 4GHz Penryn gives about 5900, more or less, and a 3GHz one gives 4597, so the odds of an old C2Q scoring what it scores in that pic are non-existent. Oh, wait, she's probably testing it on an 8-core Nehalem... gah, that doesn't work either, as the score wouldn't fit that way round. It's a load of crock spilled by someone crying for attention :).
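Those two data points make the sanity check easy to mechanize. A sketch (a hypothetical linear model; 3DMark06's CPU test doesn't scale perfectly linearly with clock, but it's close enough for a plausibility check within one architecture):

```python
# Fit a line through the two Penryn scores quoted above and see what a
# similar core "should" score at the claimed 2.67GHz.
ghz = [3.0, 4.0]
score = [4597, 5900]

slope = (score[1] - score[0]) / (ghz[1] - ghz[0])  # points per GHz
est_267 = score[0] + slope * (2.67 - 3.0)

print(f"estimated score at 2.67GHz: {est_267:.0f}")  # ~4167
```

Anything well above that estimate at 2.67GHz on a Core 2 quad would indeed look fishy.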
     
  17. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
No way! GT200'll be so powerful, it'll define 2560x1600 as the new Ub0r-standard for enthusiasts all over the world. *SCNR*
     
  18. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    3,244
    Likes Received:
    3,408
Interesting.
Now if only this G92B would allow using GDDR5 memory... ;-)

From what I heard, G100 should be quite a bit faster than anything G92-based, so a 55nm G92 shrink won't be a problem for G100 to handle.
If anything, this shrink will help them maintain a more solid line-up, with less of a hole between G100 and G92...
     
  19. Domell

    Newcomer

    Joined:
    Oct 17, 2004
    Messages:
    247
    Likes Received:
    0
Hmm, quite a bit faster than G92? ;) But how much? You can tell us :D I hope at least 50% more.

The second thing is when G100 will be released. If a G92B is in the plans, then the G100 release date could slip to 2009 :(
     
  20. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
G92(A) vs GT200 ~ 7900GTX vs 8800GTX... :wink:

But the competition doesn't sleep, and R700XT is also supposed to be a more than significant upgrade over RV670XT.
     