NVIDIA GT200 Rumours & Speculation Thread

Does CrossFire shut off the secondary monitor while gaming like SLI does? Until the multi-GPU problems are fixed (that includes dual-monitor support, stuttering, lower effective framerates, and wasted memory), my stimulus check is going to a GTX 280. I couldn't care less if the 4870X2 gets double the score in 3DMark.
 
I love this sudden "let's all hate multi-GPU" trend.

No it all makes perfect sense. Nvidia obviously knew that they were about to lose the performance crown to a multi-GPU setup so they planted the seed of FUD, having it spread across the internet like a plague just in time for everyone to spit on the 4870X2 when benchmarks come out showing it beating up on the GTX280.

Duh! :p
 
No it all makes perfect sense. Nvidia obviously knew that they were about to lose the performance crown to a multi-GPU setup so they planted the seed of FUD, having it spread across the internet like a plague just in time for everyone to spit on the 4870X2 when benchmarks come out showing it beating up on the GTX280.

Duh! :p

LOL, I like that theory.

Seriously though, why deal with stuttering if you don't have to? Single fast GPU > dual slower GPUs most of the time.
 
No it all makes perfect sense. Nvidia obviously knew that they were about to lose the performance crown to a multi-GPU setup so they planted the seed of FUD, having it spread across the internet like a plague just in time for everyone to spit on the 4870X2 when benchmarks come out showing it beating up on the GTX280.

Duh! :p

You got me. That's why I defend it vehemently :)
 
As I said before, the reason I personally haven't yet been convinced by dual-chip/GPU setups is all the redundancy that surrounds such solutions. It looks like AMD will take a step in the right direction in the future by tackling the RAM redundancy on dual-chip boards.
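To spell out the redundancy complaint: with AFR every GPU keeps its own full copy of the textures and render targets, so the memory pools don't add up. A minimal sketch of the arithmetic, using an assumed 512 MB per GPU purely as an example:

```python
# Rough illustration of why duplicated VRAM on AFR dual-GPU boards is "wasted".
# The 512 MB per-GPU figure below is an assumed example value, not a spec.

def advertised_vram(per_gpu_mb: int, num_gpus: int) -> int:
    """Marketing usually just adds the two pools together."""
    return per_gpu_mb * num_gpus

def effective_vram_afr(per_gpu_mb: int, num_gpus: int) -> int:
    """With AFR each GPU holds a full copy of the working set,
    so the usable pool is only one GPU's worth."""
    return per_gpu_mb

per_gpu, gpus = 512, 2  # hypothetical dual-GPU card
print(advertised_vram(per_gpu, gpus))     # 1024 MB on the box
print(effective_vram_afr(per_gpu, gpus))  # 512 MB actually usable per frame
```

A shared memory pool on a dual-chip board would be exactly the kind of step that makes those two numbers converge.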
 
No it all makes perfect sense. Nvidia obviously knew that they were about to lose the performance crown to a multi-GPU setup so they planted the seed of FUD, having it spread across the internet like a plague just in time for everyone to spit on the 4870X2 when benchmarks come out showing it beating up on the GTX280.

Duh! :p

Except they don't have to plant anything; fanboys will run with it on their own.
 
One thing I haven't seen people talk about is the noise level of a GPU with a TDP of ~240 watts. Does anyone have an idea how loud this little guy is going to be? I can only imagine it is going to be up there with the 2900 XT or the 5800 Ultra.
 
One thing I haven't seen people talk about is the noise level of a GPU with a TDP of ~240 watts. Does anyone have an idea how loud this little guy is going to be? I can only imagine it is going to be up there with the 2900 XT or the 5800 Ultra.

You are imagining things if you think the 2900 XT is *loud* in actual day-to-day use. My VisionTek's VGA cooler at idle is only slightly louder than my 8800 GTX's. And my case runs cooler too!

At normal gaming loads - not overclocked - the 2900 XT is not very noticeable; barely louder than my GTX. On the other hand, if I crank the cooler past 60% or so, it is a real "whoosh" of air.

My X1950 Pro, X850 XT and 7800 GS-OC were all a hell of a lot more annoying than my 2900 XT, so it depends on the design, I would say.

. . . Nvidia obviously knew that they were about to lose the performance crown to a multi-GPU setup so they planted the seed of FUD, having it spread across the internet like a plague just in time for everyone to spit on the 4870X2 when benchmarks come out showing it beating up on the GTX280.
If I said that, what would you have said to me?

So if it IS true that the 4870X2 beats up on a single GTX 280, clearly we should expect a GTX x2 with the shrink; do you think NVIDIA will allow AMD to keep any performance crown? I doubt it, even if they have to create another expensive, unappetizing and ugly sandwich just to blast it. My opinion, clearly!

Finally, if you use CrossFire AA or SLI AA - non-AFR modes - you bypass the micro-stutter in most cases. Dual-GPU setups generally let you crank up the detail and the filtering for those of us with modest displays like 16x10 or 16x12, which is what I do with 2900 XT CrossFire. Of course, I eventually plan to go GTX 280 SLI, so I will see whether micro-stutter is a big deal or not; I generally like CrossFire, although I would prefer a single, more powerful GPU, and I mostly play on my GTX.
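To make the micro-stutter point concrete, here's a purely illustrative sketch; the frame intervals below are made up, not measurements from any card. With AFR the two GPUs tend to deliver frames in uneven short/long pairs, so the average FPS counter looks fine while the longer gaps set how smooth it actually feels:

```python
# Illustrative only: the frame intervals are made up, not measured data.
# With AFR, frames tend to arrive in short/long pairs, so average FPS
# overstates how smooth the game actually feels.

afr_intervals_ms = [10, 23, 11, 22, 10, 24, 11, 23]         # uneven AFR pacing (hypothetical)
single_gpu_intervals_ms = [17, 16, 17, 16, 17, 16, 17, 16]  # even pacing (hypothetical)

def average_fps(intervals_ms):
    return 1000 * len(intervals_ms) / sum(intervals_ms)

def perceived_fps(intervals_ms):
    # Pessimistic view: perceived smoothness is limited by the longest gaps.
    return 1000 / max(intervals_ms)

for name, intervals in [("AFR dual-GPU", afr_intervals_ms),
                        ("single GPU", single_gpu_intervals_ms)]:
    print(f"{name}: avg {average_fps(intervals):.0f} fps, "
          f"perceived ~{perceived_fps(intervals):.0f} fps")
```

That's also why the non-AFR AA modes help: both GPUs work on the same frame, so pacing goes back to single-GPU behaviour, just at a lower raw framerate.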
 
One thing I haven't seen people talk about is the noise level of a GPU with a TDP of ~240 watts. Does anyone have an idea how loud this little guy is going to be? I can only imagine it is going to be up there with the 2900 XT or the 5800 Ultra.

Pssh, the 2900 XT went quiet after a few driver releases.

The Cooler Master solutions nVidia's been using seem to range from good to great, so you shouldn't need to worry too much.
 
The 2900 XT is VERY noisy in my opinion, and it heats my ~60 m^3 room to some very hot temps (although some of that I may blame on my Q6600 at 3.6 GHz).
 
The 2900 XT is VERY noisy in my opinion, and it heats my ~60 m^3 room to some very hot temps (although some of that I may blame on my Q6600 at 3.6 GHz).

Yes, 2900 XT CrossFire heats my room, but not my case. If you are used to quiet video cards, the 2900 XT is "noisy", but at half throttle there is very little difference between my GTX and my XT. At full throttle the 2900 XT is a monster whoosh and no one can really stand it without headphones. But I never once heard it go over 60%, even in full-bore situations; mildly annoying, but not as bad as my X1950 Pro 512 [AGP] or that whiny b!tch of a 7800 GS. The GS-OC used to get on my nerves and I didn't keep it long; in fact, I didn't overclock it much past factory settings, just because of the irritating fan.
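For what it's worth, the room-heating complaint is easy to sanity-check with a back-of-the-envelope calculation. The ~400 W total system draw below is an assumption (roughly an overclocked quad-core plus a high-end card under load), and the model ignores ventilation and heat soaking into the walls, so treat it as an upper bound:

```python
# Back-of-the-envelope: how fast would a gaming PC warm a sealed 60 m^3 room?
# Assumptions: ~400 W total system draw (assumed, not measured), no ventilation,
# and no heat escaping through the walls, so this is an upper bound.

room_volume_m3 = 60.0
air_density_kg_m3 = 1.2       # kg/m^3 at room temperature
air_specific_heat = 1005.0    # J/(kg*K)
system_power_w = 400.0        # assumed whole-system draw under load
hours = 1.0

air_mass_kg = room_volume_m3 * air_density_kg_m3
energy_j = system_power_w * hours * 3600
delta_t_k = energy_j / (air_mass_kg * air_specific_heat)
print(f"Temperature rise after {hours:.0f} h: ~{delta_t_k:.0f} K")  # roughly 20 K
```

In practice walls, furniture and air exchange soak up most of that, but it shows why a few hundred watts is very noticeable in a small room.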

I don't think it's as clear as you think it is.
Why did you leave out the main part, where I said it is clearly my own opinion? :p
 
[Image: close-up of a stripped-down GeForce GTX 280 card]


VR-Zone said:
Here are some close-up shots of a stripped-down GeForce GTX 280 card for you guys to enjoy over the weekend. The marking on the IHS is labeled G200-300-A2, and the card has a total of 16 Hynix 0.8 ns GDDR3 memory chips.

Courtesy: VRZ
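Assuming those 16 GDDR3 chips are the usual 32-bit parts and actually ship at their rated 0.8 ns speed - both assumptions, since the quote doesn't confirm either - the bandwidth arithmetic would look like this:

```python
# Bandwidth estimate from the details in the VR-Zone shot.
# Assumptions: 16 x 32-bit GDDR3 chips (a 512-bit bus) and memory actually
# running at its rated 0.8 ns cycle time; shipping clocks may well be lower.

chips = 16
bits_per_chip = 32                       # assumed chip width
bus_width_bits = chips * bits_per_chip   # 512-bit bus

cycle_time_ns = 0.8
memory_clock_mhz = 1000.0 / cycle_time_ns   # 1250 MHz
effective_mt_s = memory_clock_mhz * 2       # GDDR3 is double data rate: 2500 MT/s

bandwidth_gb_s = bus_width_bits / 8 * effective_mt_s / 1000
print(f"{bus_width_bits}-bit bus at {effective_mt_s:.0f} MT/s -> ~{bandwidth_gb_s:.0f} GB/s")
```

Shipping cards are often clocked below the chips' rating, so the real figure could easily be lower.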
 
I don't think it's as clear as you think it is.
I'll second that. First, a 65 nm -> 55 nm shrink is unlikely to provide massive benefits (just look at RV630 -> RV635: yes, the chip got smaller, but power draw is still similar, and so is the chip clock). Second, even if it does provide some benefit power-wise, NVIDIA might want to use it to boost (shader) clocks instead - those are arguably very low now, presumably due to power/thermal issues. Of course, two lower-clocked chips might be possible - after all, that's already the case with the 9800 GX2 vs. the 9800 GTX - but I suspect the clock difference would need to be quite a bit larger.
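As a rough sanity check on what an optical 65 nm -> 55 nm shrink can buy, here's the standard area-scaling arithmetic. The 576 mm^2 starting die size is the figure being rumoured for GT200 and is treated as an assumption here:

```python
# Ideal optical-shrink arithmetic: die area scales with the square of the node.
# The 576 mm^2 starting size is the rumoured GT200 figure, used as an assumption;
# real shrinks do worse because I/O pads and analogue blocks barely scale.

old_node_nm = 65.0
new_node_nm = 55.0
old_die_mm2 = 576.0   # assumed/rumoured GT200 die size at 65 nm

scale = (new_node_nm / old_node_nm) ** 2
new_die_mm2 = old_die_mm2 * scale
print(f"Ideal area scaling factor: {scale:.2f}")             # ~0.72
print(f"Best-case 55 nm die size: ~{new_die_mm2:.0f} mm^2")  # ~412 mm^2
```

Even in the ideal case you're still left with a 400 mm^2-class chip.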
 
The only way to make any GX2 concept possible with GT200 would be a lot of redesign to get the chip into a much lower power/heat envelope. A mere die shrink to the 55 nm process would not be sufficient; it's still a big chip even at 55 nm, unless you start cutting the fat out.

That being said, I hope there are no more GX2 variants. With such low clocks on GT200, there's plenty of performance to be gained just by upping the core/shader frequency.
 
So did we establish what the 'T' in GT200 stood for? And--seeing that die shot--can we just drop it in favor of G200? :)
 
Will these GT200 cards, or even an R700, need PCI Express 2.0 in a single-card configuration, or is 1.1 still enough?
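For reference on the PCI Express side, the raw slot bandwidth numbers are easy to lay out (whether any single card actually saturates a 1.1 x16 link is a separate question this arithmetic can't answer):

```python
# Per-direction bandwidth of a full x16 slot.
# PCIe 1.x: 2.5 GT/s per lane with 8b/10b encoding -> 250 MB/s per lane.
# PCIe 2.0: 5.0 GT/s per lane with 8b/10b encoding -> 500 MB/s per lane.

def x16_bandwidth_gb_s(per_lane_mb_s: float, lanes: int = 16) -> float:
    return per_lane_mb_s * lanes / 1000

print(f"PCIe 1.1 x16: ~{x16_bandwidth_gb_s(250):.0f} GB/s per direction")  # ~4 GB/s
print(f"PCIe 2.0 x16: ~{x16_bandwidth_gb_s(500):.0f} GB/s per direction")  # ~8 GB/s
```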
 