NVIDIA GT200 Rumours & Speculation Thread

Wow, I didn't expect such a response to the ranting of a grumpy old man! I just felt like it was akin to reading through a supercar forum where the hot topic of the day was the miles-per-gallon figure. More power for less cost is of course always beneficial for all (including the environment), but in the context of a discussion around the performance of a high-end graphics chip - well yes, I did personally find it insignificant.

I didn't think there was any need for retaliatory, personally-targeted insults (particularly from moderators) though - that was a bit disappointing.

Anyway, sorry for the off-topic. Must go, I have a planet to destroy!
 
But even Formula 1 cars have fuel-saving modes they run in when they have finished qualifying.

I don't think you know just how much power is lost due to high idle power draw. Think of all the TVs in the world that are on standby during the night, think of all the VCRs (PVRs, whatever) etc. Anyway, we are really getting off topic.
 
To build on this, let's remember that humanity has to cut CO2 emissions by at least 80% (some people say at least 90%) and that almost every country uses coal and/or oil on a massive scale to generate electric power.

Please, let's not bring politics into this. :p You do realize that various studies have shown that Cattle produce FAR more CO2 than all the oil and coal burning in the world today?

I suppose we "could" just kill all the cattle in the world. :p

Anyway, while I don't like the power consumption of the GT200, if it works with Hybrid Power and people actually use motherboards that support Hybrid Power, it's not so bad.

Although I prefer a solution that doesn't have to rely on certain power-hungry chipsets. Not being able to conserve power when using an Intel MB or ATI MB with a GT200-based card would just be a nightmare.

And considering I won't use any current Nvidia chipset boards due to their overall power inefficiency... Well, that's a bit of a conundrum for me.

Regards,
SB
 
Heh, HawkeyeGnu, I didn't want to force another car analogy onto the internet, but I did have one comparing the GTX280 to a Ferrari that I passed on. No one's going to complain that they suck power at the limit, but idle power use in a GPU is more akin to leaving the Ferrari idling in the garage even when you're not driving (see Colbert's El Camino). Then I was going to point out that people pay to hear a Ferrari rumble, while it's the opposite in PCs. Probably better that I skipped it.

Sure, sure, power draw isn't the only variable in HSF noise, but it's one less excuse for a noisy fan, especially when your speakers aren't drowning out your PC's fans. Yeah, enthusiast-class GPUs usually come with large fans which are naturally quieter b/c they don't have to spin as fast to push air, but any extra heat you're not dumping into your case helps.

The 1950P might not be the best analog b/c it's not in the same price bracket as an enthusiast-class GPU like the GTX 280, so card builders may not care to spend as much on a nice fan solution. If you look at the load dB, too, you see they're unchanged for the 1950P, so it looks like fan speed doesn't even vary with heat or load--that looks more like scrimping on the HSF than the effect of idle draw. Not to mention that a range of 0.9dB across 12 different cards doesn't promise a very exacting comparison, "real-world" as it may be. SPCR shows slightly more of a difference b/w a 3850 and a 3870 than TR does, but that may be more down to the s/w each site used to stress the card than anything else.

I'm not sure how 1dB translates into perceived sound, though. 3dB is a doubling of sound power level, but what's 1dB? 50%? Is it even noticeable at 50dB?

As for any real or perceived ad-homs, I think it's reasonable to expect you'd get as well as you gave. I mean, "tree-hugger?" Come on. :LOL: But, as always, feel free to report any post you think crosses the line by clicking on that tiny ! triangle icon.

Anyway, it wasn't a bad rant, so don't let a little opposition stop you from posting more. And we can all stop nit-picking minutiae once the cards are released and we have more meaningful points to arg--er, discuss.
 
I thought it was every 10db that sound doubled, maybe I'm wrong.

http://www.gcaudio.com/resources/howtos/loudness.html

Thought I was right, every 10db is a doubling of the power of sound
 
Ah, sarcasm - I recognise that! So idle power is somehow proportional to idle fan noise, then?

http://techreport.com/articles.x/13772/6

An x1950pro has the 5th lowest idle power, but the highest idle noise level. Sheesh, who would have thought?
Of course not. I had an AGP Sapphire x1950p [yes, I was an ATi fan; I've been "converted"] and it had a nasty, annoying, irritating whine at load, like my old 7800GS and my x850xt@PE. However, my HD2900xt is a nasty power hog - a bit worse than my 8800GTX @ ultra at full bore - BUT the fan is not annoying!

If you go 100% on my 2900xt's fan it sounds like a baby dragon is trying to "whoosh" itself out of the case - it is startling the first time and damned impressive with HD2900xt Crossfire - but it is well engineered not to be annoying, as the x1950p was. At idle, you barely hear the 2900xt; it is just barely louder than my 8800GTX. Of course, I have a Thermalright VGA cooler on my GTX and I tossed the stock cooler - so it is an unfair comparison - except to say that my case is a LOT cooler with Crossfire than with a single GTX, as the air is exhausted out of the case.

so it is all engineering

imo =P
 
Check again. 10 dB corresponds to a perceived doubling of loudness. It's actually 10 times the sound power (10 dB = 10 decibels = 1 bel = 10 times the power, since the decibel scale is log base 10). See Wiki.

Yes. An increase of 6 dB is often treated as a doubling of volume - it doubles the amplitude (sound pressure), which is why the volume function in Sony Sound Forge lists +6 dB as 200%.
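
For anyone who wants to sanity-check those figures, here's a quick Python run-through of the standard formulas (nothing taken from the linked pages; the +10 dB "twice as loud" figure is the usual rule of thumb, and roughly 1 dB is commonly cited as about the smallest change most listeners can notice):

Code:
# Standard dB conversions (textbook formulas, not from the TR/SPCR/gcaudio links).
for delta in (0.9, 1.0, 3.0, 6.0, 10.0):
    power = 10 ** (delta / 10)      # sound power / intensity ratio
    pressure = 10 ** (delta / 20)   # sound pressure / amplitude ratio
    loudness = 2 ** (delta / 10)    # rule-of-thumb perceived loudness ratio
    print(f"+{delta:>4} dB: power x{power:.2f}, pressure x{pressure:.2f}, "
          f"perceived ~x{loudness:.2f}")

So by these formulas a 0.9 dB spread sits right around the threshold of audibility, and +1 dB is only about 26% more sound power.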
 
As for any real or perceived ad-homs, I think it's reasonable to expect you'd get as well as you gave. I mean, "tree-hugger?" Come on. :LOL:

Well, "tree-hugger" doesn't deserve a "bad father" in my book, but I take your point. From now on I shall use "conifer cuddler" to describe the obsessively environmentally-conscious among us, hopefully an altogether more acceptable and less antagonistic term.

Thanks for the reply, Pete.

;)
 
I'm all for efficiency and cutting power when idle, as long as it doesn't compromise absolute performance. I don't believe in yielding performance when *I need it*. As an example, if "being green" meant spending 10% more transistors on a desktop part or losing 10% of performance, I'd say forget it. Give me all the transistors for perf; I'll pay extra for carbon offsets if you want. Other people might not make that choice, but that's why it's the "high end".
 
I agree in every respect.
 
Killzone 2 (a PS3 title) is based on a classic deferred renderer + MSAA and runs on NVIDIA hardware. This should tell us that NVIDIA got at least some DX10.1 features working way before AMD/ATI and way before DX10 was even finalized; it's not like they don't know how to do it.
1. You don't need 10.1 for a deferred shader + MSAA to work. You can access the color subsamples with 10.0; only if you need the Z subsamples do you need 10.1. In a classic deferred renderer you only need to deal with color (see the sketch below).
2. RSX doesn't directly support accessing the subsamples in the shader. It only works because they know the exact memory layout of the framebuffer and can work around the problem manually.

But I'm also very curious as to exactly which feature NVIDIA didn't implement in G80 that cost them D3D10.1 compliance. I think it's the programmable AA patterns, but I could be wrong.
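
To make point 1 above concrete, here's a toy custom resolve in plain Python/NumPy - standing in for what would really be shader code, with made-up data - showing that only per-sample color ever gets read, never per-sample Z:

Code:
import numpy as np

# Tiny 4x4 "G-buffer" with 4x MSAA: per-sample color only, no per-sample Z.
# The data is random; this only illustrates the access pattern.
H, W, S = 4, 4, 4
rng = np.random.default_rng(0)
gbuffer = rng.random((H, W, S, 3))

def resolve(samples):
    # samples: (S, 3) colors for one pixel.
    # Interior pixel: all subsamples agree, so keep one.
    # Edge pixel: subsamples differ, so combine them (here a plain average).
    if np.allclose(samples, samples[0]):
        return samples[0]
    return samples.mean(axis=0)

resolved = np.array([[resolve(gbuffer[y, x]) for x in range(W)] for y in range(H)])
print(resolved.shape)  # (4, 4, 3): one color per pixel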
 
Programmable AA is just a matter of being able to do a lot of point sampling and some math. I'm sure they can do it, but with less shader power than AMD they're probably hesitant to do so.
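
For a rough picture of that "point sampling and some math", here's a tiny Python sketch with made-up sample offsets and Gaussian weights (real hardware sample positions and resolve filters will differ):

Code:
import numpy as np

# Hypothetical 4x sample offsets (rotated-grid-ish) around the pixel center
# and a made-up Gaussian weighting - the "math" part of a programmable resolve.
offsets = np.array([[-0.25, -0.125], [ 0.125, -0.25 ],
                    [-0.125,  0.25 ], [ 0.25 ,  0.125]])
samples = np.random.rand(4, 3)            # colors point-sampled at those offsets

sigma = 0.5
weights = np.exp(-(offsets ** 2).sum(axis=1) / (2 * sigma ** 2))
weights /= weights.sum()                  # normalize so brightness is preserved

pixel = weights @ samples                 # weighted combine = resolved pixel color
print(pixel)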
 
Well if NV is letting people under NDA talk about the chip, I guess that means the game is well and truly up in terms of information being kept under hats. I'll post everything I've got tomorrow while there are still a few surprises left.
 
Could it be the texture units and how they're configured that breaks DX10.1 compliance in G8x/G9x?

edit - hooray for more leaks :D
 
I guess some of these are controlled leaks...

Although Nvidia might want to ward off attention from the GT200b rumor, mainly because of its cannibalizing nature.
 