NVIDIA: Beyond G80...

Discussion in 'Architecture and Products' started by kyetech, Nov 19, 2006.

  1. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
Funny to see how fast this almost two-month-old forum post has spread across the net; almost all the big hardware forums already have a topic for it, and most users are talking about it as if it were an official, confirmed specification :smile:
If I had to choose between the two sources, vrzone is a better source for NV rumours about G92 than a forum member's post :smile:

Why would NV try to release a 65nm high-end GPU in November when they have no competition for it?
It's much easier to start 65nm technology with something other than the high-end GPU. Time is on NV's side; they don't need to take any risk. The step from 90nm to 65nm is big and the risk is high; it's not a "simple" shrink.

I think G92 will be a nice 256-bit performance card with 8800 GTS performance (or slightly lower/higher), but it will cost much less to make than the 8800 GTS, so it will be cheap and NV will earn more money on it.

A G9x high-end GPU at 65nm is more likely in 2008 Q1.
     
  2. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    Strictly speaking, everything since R300 is superscalar - except, arguably, Geforce FX.
     
  3. _xxx_

    Banned

    Joined:
    Aug 3, 2004
    Messages:
    5,008
    Likes Received:
    86
    Location:
    Stuttgart, Germany
Why don't you see the FX as superscalar? I'm not all that familiar with it, but I had the impression it was also superscalar.
     
  4. nicolasb

    Regular

    Joined:
    Oct 21, 2006
    Messages:
    421
    Likes Received:
    4
    Because it'll be a year since G80 came out. The time when they would have started working on G92 would have been several years ago; back then there could have been no way of predicting that ATI would fail quite so miserably when it came to providing competition for G80, so planning a new high-end chip one year after the previous one would have made sense. By the time it became obvious that there wasn't any competition, it would have been too late to start working on a completely new design for release in early 2008.

    I suppose Nvidia could just take 6 months holiday and do nothing at all until ATI catches up, but I can't see the shareholders liking that. :)
     
  5. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
That's what I think too. Enthusiasts will be served with a G92 GX2 card, and so everyone will be happy. :grin:

And NV can then concentrate more on the D3D10.1 high-end GPU, which may arrive alongside the G9x midrange in spring.
     
  6. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,451
    Likes Received:
    471
Are you comparing architectures (i.e. how unification is implemented), or the performance of particular products? I think ATi's approach is better, because G80 has some limitations which prevent it from showing the full potential of a unified core. I mean operations involving many vertices: G80 has a fast unified core, but its slow front-end masks the advantages of unification, and in many situations where G80 operates on many vertices, its vertex shader/geometry performance isn't significantly better than on last-gen non-unified GPUs.
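A back-of-the-envelope sketch of the front-end bottleneck described above. The setup rate of one triangle per clock is an assumption for illustration, not a figure confirmed in this thread:

```python
# Why a fast unified shader core can still be capped by a serial
# front-end: geometry throughput is limited by the setup rate,
# independent of how many unified ALUs the core has.

CORE_CLOCK_MHZ = 575        # 8800 GTX core clock
SETUP_TRIS_PER_CLOCK = 1    # assumed front-end setup rate

# The ceiling on triangles per second imposed by the front-end alone.
max_tris_per_sec = CORE_CLOCK_MHZ * 1_000_000 * SETUP_TRIS_PER_CLOCK
print(f"front-end ceiling: {max_tris_per_sec / 1e6:.0f} Mtris/s")  # 575 Mtris/s
```

Under this assumption, adding more unified ALUs raises shading throughput but leaves the ceiling untouched, which is why vertex/geometry-heavy workloads can look no faster than on non-unified last-gen parts.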
     
  7. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
Maybe it shouldn't? ;) NV also wants to sell Quadro cards.

And GS performance isn't so bad in the newest tests with new drivers -> look at RM3D 2.0.
     
  8. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
Shaderholders know what happened to ATi when it chased the smaller process node, and shaderholders see how big a lead NV has in the discrete GPU segment. I think most of them would choose the "less risk" route, not the high-risk one of a high-end GPU at 65nm.

Look at what happened to L.P.H. AMD again with their 390-million-transistor 65nm midrange GPU (it's really more the upper half of a 65nm low-end line-up). I don't think NV needs to take such a big risk with a one-billion-transistor high-end GPU. It's clearly visible from R600's performance that AMD can't do anything to catch up (I think 95% of users have already realised that no RX6xx "magic driver" is coming to turn the whole picture around 180 degrees); they don't even have a card in the 8800 Ultra segment. A 65nm R600 won't change anything, and a 2xRV670 card won't change anything either.

Without releasing any 65nm high-end GPU this year, NV will still hold ~80% of the DX10 discrete graphics segment at the end of the year, and almost all DX10 titles coming this year are part of the TWIMTBP program, so there is zero pressure.

(Everything that's happening is a very bad situation from the user's perspective, but that's another story.)
     
    #928 vertex_shader, Jul 26, 2007
    Last edited by a moderator: Jul 26, 2007
  9. _xxx_

    Banned

    Joined:
    Aug 3, 2004
    Messages:
    5,008
    Likes Received:
    86
    Location:
    Stuttgart, Germany
    Ok vertex_shader ^^, now get off your arse and start performing better! Now! :wink:
     
  10. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
In G80 I need some Red Bull to "fly" :lol:
     
  11. AnarchX

    Veteran

    Joined:
    Apr 19, 2007
    Messages:
    1,559
    Likes Received:
    34
  12. _xxx_

    Banned

    Joined:
    Aug 3, 2004
    Messages:
    5,008
    Likes Received:
    86
    Location:
    Stuttgart, Germany
    8800GT sounds promising if true. That could finally be a suitable card for HTPC users like me who can't use double-slot power hogs.
     
  13. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
  14. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
Well, I tend to read "8600 GT" on that especially bad picture.
     
  15. vertex_shader

    Banned

    Joined:
    Sep 8, 2006
    Messages:
    961
    Likes Received:
    14
    Location:
    Far far away
Yes, it's an 8600 GT.
     
  16. Arun

    Arun Unknown.
    Legend

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
  17. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,708
    Likes Received:
    2,132
    Location:
    London
Amusing really, since the 8800 GTX can't run full-resolution (1920x1080) HDCP-protected content.

    Jawed
     
  18. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    as usual the video engine is better on midrange and lowend I think ;)
     
  19. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
I've never heard that before.
Isn't that only of interest to those who have a monitor that requires dual-link DVI to work (i.e., a 30-inch LCD, usually with a 2560x1600 native resolution)?
To my knowledge, a single-link DVI output with HDCP can drive a 1920x1200 @ 60Hz display, more than enough for a 1080p Blu-ray or HD DVD, but I might be mistaken.
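A rough sanity check of that bandwidth claim. The 2080x1235 total timing assumes CVT reduced blanking for a 1920x1200 @ 60Hz mode, which is what such panels typically use:

```python
# Can single-link DVI (165 MHz TMDS pixel-clock cap) drive
# 1920x1200 @ 60 Hz? Totals below assume CVT reduced-blanking
# (CVT-RB) timings for that mode: 2080 x 1235.

SINGLE_LINK_MAX_MHZ = 165.0  # pixel-clock limit of one DVI link

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame * frames per second."""
    return h_total * v_total * refresh_hz / 1e6

clock = pixel_clock_mhz(2080, 1235, 60)
print(f"1920x1200@60 (CVT-RB): {clock:.1f} MHz")      # ~154.1 MHz
print("fits single link:", clock <= SINGLE_LINK_MAX_MHZ)  # True
```

So with reduced-blanking timings the mode sits comfortably under the single-link limit, consistent with the claim above.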
     
  20. Arnold Beckenbauer

    Veteran Subscriber

    Joined:
    Oct 11, 2006
    Messages:
    1,756
    Likes Received:
    722
    Location:
    Germany
    It can.
    You're right.
    The source of this myth:
    http://uk.theinquirer.net/?article=35966
     
    #940 Arnold Beckenbauer, Jul 30, 2007
    Last edited by a moderator: Jul 30, 2007
