NVIDIA GT200 Rumours & Speculation Thread

Discussion in 'Architecture and Products' started by Arun, Feb 10, 2008.

Thread Status:
Not open for further replies.
  1. shiznit

    Regular

    Joined:
    Nov 27, 2007
    Messages:
    345
    Likes Received:
    95
    Location:
    Oblast of Columbia
    Does CrossFire shut off the secondary monitor while gaming, like SLI does? Until multi-GPU problems are fixed (dual-monitor support, stuttering, lower effective framerates, wasted memory), my stimulus check is going to a GTX 280. I couldn't care less if the 4870X2 gets double the score in 3DMark.
     
    #2181 shiznit, Jun 6, 2008
    Last edited by a moderator: Jun 6, 2008
  2. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,535
    Likes Received:
    144
    No it doesn't.
     
  3. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    No, it all makes perfect sense. Nvidia obviously knew they were about to lose the performance crown to a multi-GPU setup, so they planted the seed of FUD and let it spread across the internet like a plague, just in time for everyone to spit on the 4870X2 when benchmarks come out showing it beating up on the GTX 280.

    Duh! :razz:
     
  4. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    4,027
    Likes Received:
    90
    LOL, I like that theory.

    Seriously though, why deal with stuttering if you don't have to? Single fast GPU > dual slower GPUs most of the time.
     
  5. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    You got me. That's why I defend it vehemently :)
     
  6. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
    As I said before, the reason I personally haven't yet been convinced by dual-chip/GPU setups is all the redundancy that surrounds such solutions. It looks like AMD will take a step in the right direction in the future with RAM redundancy on dual-chip boards.
     
  7. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,791
    Likes Received:
    1,596
    Except they don't have to plant anything; fanboys will run with it on their own.
     
  8. sc3252

    Newcomer

    Joined:
    Jun 6, 2008
    Messages:
    36
    Likes Received:
    3
    One thing I haven't seen people talk about is the noise level of a GPU with a TDP of ~240 watts. Does anyone have an idea how loud this little guy is going to be? I can only imagine it is going to be up there with the 2900 XT or the 5800 Ultra.
     
  9. apoppin

    Regular

    Joined:
    Feb 12, 2006
    Messages:
    255
    Likes Received:
    0
    Location:
    Hi Desert SoCal
    You are imagining things if you think the 2900XT is *loud* in actual day-to-day use. My VisionTek's VGA cooler at idle is only slightly louder than my 8800GTX. And my case is cooler too!

    At normal gaming loads - not overclocked - the 2900XT is not very noticeable; barely more than my GTX. OTOH, if I crank the cooler past 60% or so, it is a real "whoosh" of air.

    My x1950p/x850xt and 7800GS-OC were all a hell of a lot more annoying than my 2900XT, so it depends on the design, I would say.

    If I had said that, what would you have said to me?

    So if it IS true that the 4870X2 beats up on a single GTX, we should clearly expect a GTX x2 with the shrink; do you think Nvidia will allow AMD to keep any performance crown? I doubt it, even if they have to create another expensive, unappetizing, and ugly sandwich just to blast past it. My opinion, clearly!

    Finally, if you use CrossFire AA or SLI AA - the non-AFR modes - you bypass the micro stutter in most cases. Dual-GPU setups generally let you crank up the detail and the filtering for those of us with modest displays like 16x10 or 16x12, as I do with 2900XT CrossFire. I eventually plan to have GTX 280 SLI, so I will see for myself whether micro stutter is a big deal or not; I generally like CrossFire, although I would prefer a single more powerful GPU, and I mostly play on my GTX.
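
    Just to put rough numbers on the micro-stutter point (purely illustrative, with made-up frame times rather than measured data), here is a minimal sketch of why AFR's average FPS can look better than what you actually perceive:

    ```python
    # Illustrative only: invented frame times showing why AFR micro-stutter makes
    # the reported average FPS look better than the cadence you actually see.
    frame_times_ms = [10, 40] * 30  # alternating fast/slow frames, 60 frames total

    avg_frame_ms = sum(frame_times_ms) / len(frame_times_ms)
    reported_fps = 1000 / avg_frame_ms           # what a benchmark counter reports
    perceived_fps = 1000 / max(frame_times_ms)   # cadence set by the slow frames

    print(f"reported average:  {reported_fps:.0f} fps")   # 40 fps
    print(f"perceived cadence: {perceived_fps:.0f} fps")  # 25 fps
    ```

    Non-AFR modes like CrossFire AA / SLI AA sidestep this because both GPUs work on the same frame rather than alternating frames, so the frame-time pattern stays even.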
     
    #2189 apoppin, Jun 7, 2008
    Last edited by a moderator: Jun 7, 2008
  10. Tchock

    Regular

    Joined:
    Mar 4, 2008
    Messages:
    849
    Likes Received:
    2
    Location:
    PVG
    Pssh, the 2900XT went silent after a few drivers.

    The Cooler Master solutions nVidia's been using seem to range from good to great, so you shouldn't need to worry too much.
     
  11. Nuker

    Newcomer

    Joined:
    Jun 7, 2008
    Messages:
    16
    Likes Received:
    0
    The 2900XT is VERY noisy in my opinion, and it heats my ~60 m^3 room to some very hot temperatures (although some of that I may blame on my Q6600 at 3.6 GHz).
     
  12. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha

    Joined:
    May 14, 2005
    Messages:
    1,386
    Likes Received:
    299
    Location:
    NY
    I don't think it's as clear as you think it is.
     
  13. apoppin

    Regular

    Joined:
    Feb 12, 2006
    Messages:
    255
    Likes Received:
    0
    Location:
    Hi Desert SoCal
    Yes, 2900XT CrossFire heats my room, but not my case. If you are used to quiet video cards, the 2900XT is "noisy" .. but at half throttle, there is very little difference between my GTX and my XT. At full throttle the 2900XT is a monster whoosh and no one can really stand it without headphones. But I never once heard it go over 60%, even in full-bore situations; mildly annoying, but not as bad as my x1950p/512 [AGP] or that whiny b!tch of a 7800GS. The GS-OC used to get on my nerves and I didn't keep it long; in fact, I didn't overclock it much higher than factory, just because of the irritating fan.

    Why did you leave out the main part - where I said it is clearly my own opinion? :razz:
     
  14. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
    [image]

    Courtesy: VRZ
     
  15. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    I'll second that. First, a 65nm->55nm shrink is unlikely to provide massive benefits (just look at RV630->RV635 - yes, the chip got smaller, but power draw is still similar, and so is the clock). Second, even if it does provide some benefit power-wise, nvidia might want to use it to boost (shader) clocks instead - those are arguably very low now, presumably due to power/thermal issues. Of course two lower-clocked chips might be possible - after all, that's already the case with the 9800GX2 vs. the 9800GTX - but I suspect the clock difference would need to be quite a bit larger.
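
    For a rough sense of what that shrink buys, a back-of-the-envelope sketch (the ~576 mm^2 65nm die size is the commonly quoted figure and is an assumption here; ideal optical scaling is optimistic, as the RV630->RV635 comparison above suggests):

    ```python
    # Back-of-the-envelope optical shrink estimate, 65 nm -> 55 nm.
    # Real shrinks rarely hit ideal area scaling, so treat this as a lower bound
    # on die size (i.e. the chip stays big either way).
    old_nm, new_nm = 65, 55
    linear_scale = new_nm / old_nm      # ~0.85
    area_scale = linear_scale ** 2      # ~0.72

    die_area_65nm = 576.0               # mm^2, commonly quoted GT200 figure (assumption)
    ideal_55nm_area = die_area_65nm * area_scale

    print(f"area scale factor: {area_scale:.2f}")            # 0.72
    print(f"ideal 55 nm die:   {ideal_55nm_area:.0f} mm^2")  # ~412 mm^2 - still huge
    ```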
     
  16. Twinkie

    Regular

    Joined:
    Oct 22, 2006
    Messages:
    386
    Likes Received:
    5
    The only way to make any GX2 concept possible for GT200 would be a lot of redesigning to get the chip into a much lower power/heat envelope. A mere die shrink to the 55nm process would not be sufficient. It's still a big chip even at 55nm, unless you start cutting the fat out.

    That being said, I hope there are no more GX2 variants. With such low clocks on GT200, there's plenty of performance to be gained just by upping the clock/shader frequency.
     
  17. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,777
    Likes Received:
    1,814
    So did we establish what the 'T' in GT200 stood for? And--seeing that die shot--can we just drop it in favor of G200? :)
     
  18. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
    That's not a die shot. ;)
     
  19. Berek

    Regular

    Joined:
    Oct 17, 2004
    Messages:
    274
    Likes Received:
    4
    Location:
    Austin, TX
    Will these GT200 cards, or even an R700, need PCI Express 2.0 in a single-card configuration, or is 1.1 still enough?
     
  20. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    1.1/2.0 are backwards compatible.
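
    For anyone wondering whether 1.1 bandwidth is actually a bottleneck, here is a quick sketch of the per-direction numbers for a x16 slot (standard PCIe signalling rates, with the 8b/10b encoding overhead included):

    ```python
    # Per-direction bandwidth of a x16 slot.
    # PCIe 1.1 runs 2.5 GT/s per lane, PCIe 2.0 runs 5.0 GT/s, both 8b/10b encoded.
    def x16_bandwidth_gb_s(transfer_rate_gt_s, lanes=16, encoding=8 / 10):
        return transfer_rate_gt_s * encoding * lanes / 8  # GB/s, one direction

    print(f"PCIe 1.1 x16: {x16_bandwidth_gb_s(2.5):.0f} GB/s")  # ~4 GB/s
    print(f"PCIe 2.0 x16: {x16_bandwidth_gb_s(5.0):.0f} GB/s")  # ~8 GB/s
    ```

    In practice a single card usually doesn't come close to saturating a 1.1 x16 link, so 2.0 tends to be nice to have rather than required.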
     