NVIDIA GF100 & Friends speculation

Hmmm, that's interesting. I wasn't aware of this SLI issue. Still, more clarification is needed, if you please. Does the second card stay out of ultra low power mode when SLI is enabled but idling in 2D, or when SLI is disabled and in 2D?

If it's the first, then ATI has the same problem as well. My second 5850 stays at 400/900 if I enable Crossfire and just sit on the desktop. If I disable it, both cards operate at 157/300. In all honesty, I don't know if this is one of the "features" of Cat 10.8. I didn't notice it before, because I always keep Crossfire disabled anyway and only enable it before running some bitchy game that needs dual GPU power.
From what I understand, there simply is no ultra low power mode for the slave card. The power consumption numbers I'm getting are from here:
http://www.anandtech.com/show/3917/...-palit-and-calibre-overclocked-and-reviewed/8

I don't remember where I saw the note that there is no ultra low power mode for the slave card, unfortunately. I don't think it's a huge issue for these particular cards, however, as I doubt that many people will go for 450 in SLI. It would still obviously be nice if nVidia improved this for future cards, especially higher-end cards.
 
You're comparing power draw versus TDP here!

The 450 draws 88W in Grid; the 5870 Vapor-X draws 116W.
If I combine that with the average performance of TPU's SLI benchmark (where the power usage is even worse for the 450), you're talking about 135% of the power draw for 90% of the performance.
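
Just to collapse those two percentages into one number (these are simply the figures quoted above, nothing measured by me):

Code:
#include <cstdio>

int main() {
    // GTS 450 SLI relative to a single HD 5870, using the 135% / 90% figures above
    double rel_power = 1.35;  // ~135% of the 5870's power draw
    double rel_perf  = 0.90;  // ~90% of the 5870's performance
    printf("relative perf/W vs 5870: %.2f\n", rel_perf / rel_power);  // ~0.67
    return 0;
}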

Well, yes actually I did! Guilty as charged! :oops:

Indeed TPU's 450 SLI is closer to the 5850 than it is to the 5870. I cannot find the power measurements though...! :S

ComputerBase shows better numbers at 1920x1200 + 4xAA. The 5870 is only 4% faster. The 450 SLI's power draw is substantially higher though. Still, I can't shake the thought of the idle SLI problem and the possibility of a fix.

In your Grid example, if a single 450 draws 88W and you add another 20W for the idle of the second 450, you would still get 108W vs 116W for the 5870, while you would still get the same performance. Too many variables are coming into play though, and it's getting funny really! :p
 
In your Grid example, if a single 450 draws 88W and you add another 20W for the idle of the second 450, you would still get 108W vs 116W for the 5870, while you would still get the same performance. Too many variables are coming into play though, and it's getting funny really! :p

Why would one 450 have the same performance as a 5870? Or even two?

DiRT2 : http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTS_450_SLI/10.html

TPU didn't measure the power of the SLI setup, unfortunately.
 
Why would one 450 have the same performance as a 5870? Or even two?

DiRT2 : http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTS_450_SLI/10.html

TPU didn't measure the power of the SLI setup, unfortunately.

Oh, I was referring to the Grid example. Technically the performance would not be the same (wrong expression there), but they would both do more than 60fps, so the end result for the user would be the same anyway. Even if you enabled vsync on both of them, to keep the power draw more in check, I still believe that a 5870 would consume more than one 450 in 3D plus one 450 at idle (at least once the SLI idle problem gets fixed). This could be the case for all light games. All this is theoretical, obviously.

All I am saying is that Nvidia still has some interesting offers. No need to grab the shotgun and shoot them in the head.

From a strictly economic point of view, even if the 450s consume 50W more at load than the 5870 (ComputerBase), it's not the end of the world. In my country, with its awful electricity prices, a constant 50W in a 24/7 scenario costs about 50 euros per year. Not such a large amount, especially if you consider that the 450 SLI setup costs something like 100 euros less than the 5870. In nuclear-powered countries, the cost should be even lower.

Obviously, no one can game 24/7, so the advantage would be even greater. I am looking at 1/3 of those 50 euros, which corresponds to 8 hours of gaming per day (1/3 of 24 hours), and even that is crazy. Add to that the time spent with one 450 disabled and presto, the 450 SLI is the better choice. xD
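
For anyone who wants to check the back-of-envelope numbers (the ~0.115 euro/kWh rate is my own assumption, plug in your local price):

Code:
#include <cstdio>

int main() {
    // Extra draw of the 450 SLI setup at load vs the 5870 (ComputerBase figure above)
    double extra_watts = 50.0;
    double eur_per_kwh = 0.115;  // assumed rate, adjust for your country
    double kwh_24_7 = extra_watts / 1000.0 * 24 * 365;  // ~438 kWh per year
    printf("24/7 gaming: ~%.0f EUR/year\n", kwh_24_7 * eur_per_kwh);        // ~50 EUR
    printf("8h/day gaming: ~%.0f EUR/year\n", kwh_24_7 * eur_per_kwh / 3);  // ~17 EUR
    return 0;
}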
 
That makes it meaningful for a lot of people :p
I had a buddy who owned his house while living on welfare (which is funny, as house ownership is a pipe dream here) and specifically complained about this (he shared the house for a modest rent to cover property taxes, electricity and such). SLI video cards add up, along with a power-hungry CPU and careless use of power.

I still wish for a meaningful low-power 3D mode: disable unnecessary units when playing non-demanding games, along with lower clocks and lower voltage. On a GTX 460 you would wall off a whole GPC and all the SMs hanging off it; on a GF106 you would disable 96 out of the 192 SPs, etc.
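
As a very rough illustration of why gating units plus dropping clocks and voltage would matter, here's a back-of-envelope using the usual dynamic-power rule of thumb (power roughly proportional to active units x frequency x voltage squared); the 50% / 70% / 85% factors are made-up example values, not anything measured:

Code:
#include <cstdio>

int main() {
    // Relative dynamic power ~ active_units * frequency * voltage^2
    // Baseline: full chip at stock clock and voltage.
    double base = 1.0 * 1.0 * 1.0 * 1.0;
    // Hypothetical low-power 3D mode: half the SPs, 70% clock, 85% voltage.
    double lp = 0.5 * 0.7 * 0.85 * 0.85;
    printf("low-power 3D mode: ~%.0f%% of baseline dynamic power\n", lp / base * 100);
    return 0;
}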
 
I still wish for a meaningful low-power 3D mode: disable unnecessary units when playing non-demanding games, along with lower clocks and lower voltage. On a GTX 460 you would wall off a whole GPC and all the SMs hanging off it; on a GF106 you would disable 96 out of the 192 SPs, etc.

It's called V-Sync.

shameless rips from the evga forum re: v-sync

GPU utilization with vsync on: 11%
http://a.imageshack.us/img210/3054/torchlight2010080917012.jpg

GPU utilization with vsync off: 50%
http://a.imageshack.us/img31/8917/torchlight2010080917014.jpg

Rendering Quack3 at 800 FPS is not cool, kids!
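
Vsync itself lives in the driver and the swap chain, but the reason it saves power is simply that the GPU idles between frames instead of rendering flat-out. A minimal app-side frame cap sketch (render_frame() is a hypothetical stand-in, not real game code) illustrates the same principle:

Code:
#include <chrono>
#include <thread>

// Hypothetical stand-in for one frame of a light game like Torchlight.
void render_frame() {}

int main() {
    using clock = std::chrono::steady_clock;
    const auto frame_time = std::chrono::microseconds(16667);  // cap at ~60 fps
    auto next = clock::now();
    for (;;) {
        render_frame();                       // GPU busy for only part of the 16.7 ms
        next += frame_time;
        std::this_thread::sleep_until(next);  // everything idles until the next tick
    }
}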
 
The cost angle for power consumption has always been academic for anyone who isn't on welfare.
Well, actually, in my apartment there are three things that use up a significant amount of power:
1. My computer.
2. Hot water heater.
3. Refrigerator.

It turns out that they each make up about a third of my power bill, so noticeably reducing the power consumption of my computer helps the power bill quite a lot.
 
I had a buddy who owned his house while living on welfare and specifically complained about this.

He's on welfare and is worried about a few dollars a year in electricity? :eek:

It turns out that they each make up about a third of my power bill, so noticeably reducing the power consumption of my computer helps the power bill quite a lot.

What are the absolute numbers? I live in NYC where electricity costs are near the highest in the US and my bill is still a pittance. Leaving the bathroom light on overnight costs me more than anything else.
 
What are the absolute numbers? I live in NYC where electricity costs are near the highest in the US and my bill is still a pittance. Leaving the bathroom light on overnight costs me more than anything else.
Well, I'm living in Italy, and my power bill here is quite a bit higher than it was in the US. I typically pay about 100 euros every three months or so. It's not terrible, but to keep my total power usage low, I would definitely prefer to have a lower-power computer.
 
What are the absolute numbers? I live in NYC where electricity costs are near the highest in the US and my bill is still a pittance. Leaving the bathroom light on overnight costs me more than anything else.

Try Folding. My power bill in winter is as much as it is in summer running the AC 24x7, and that's due in large part to folding.
 
Yeah, 10 euros a month on computer power is a whole lot. But how much of that is based on actual usage? I just looked at my last bill and only 25% was based on usage. The rest of it was fixed charges and taxes.
 
Yeah, 10 euros a month on computer power is a whole lot. But how much of that is based on actual usage? I just looked at my last bill and only 25% was based on usage. The rest of it was fixed charges and taxes.
I couldn't say. I don't keep the bills with me (my landlady goes over the bills with me when I pay them along with the rent). Could be interesting to check next time the bill comes up.
 
ATI ‘cheating’ benchmarks and degrading game quality, says NVIDIA:

http://www.atomicmpc.com.au/Feature...and-degrading-game-quality-says-nvidia.aspx/6

The graphics processing world is ever-turbulent. Arguments appear cyclical in nature from both ATI and NVIDIA, and bickering over issues like physics engines isn't uncommon - bringing about phrases like: 'They've been cheating with their so-and-so', or 'We offer the better solution without a drawback like theirs!' from both sides.


It's all part and parcel of competition. However, when Atomic received the recent NVIDIA GTS450 card, press release and included reviewer's guide, we noticed that it came with a page that implied certain unfavourable things about their competition, ATI.


We discovered that this page had been included in every reviewer's guide sent to hardware reviewers since the launch of the GF100 'Fermi' family: the GTX480, GTX470, GTX465, and both GTX460 cards. It had been included with the GTS450 and five other cards, but it seems no-one has paid it much attention, dismissing it offhandedly - as we admittedly dismiss most guides that attempt to dictate our testing methodology.


This page, as it appears below, claims many things. Prime among these is an assertion that ATI were utilising a technique in their drivers called 'FP16 Demotion' to boost their graphical performance at the cost of image quality. It names a number of older gaming titles as the only ones affected by this so-called hack, and welcomes reviewers to work it out for themselves. So, that's what we did ...
The article is 6 pages long, and it did confirm some reduction in image quality in a game or two.

What do you guys think? Personally I see it as a pathetic way for Nvidia to claw at AMD's back, but the ethical question still remains: should we tolerate optimizations that affect image quality? And what if the situation were reversed and Nvidia were the one doing these optimizations?
 
What do you guys think? Personally I see it as a pathetic way for Nvidia to claw at AMD's back, but the ethical question still remains: should we tolerate optimizations that affect image quality? And what if the situation were reversed and Nvidia were the one doing these optimizations?
A graphics driver that does not do the operations that the application requests is just plain wrong. I know that nVidia has done this sort of thing in the past as well, but I can see nothing wrong in pointing out when ATI does it.
 
These titles aren't used for current benchmarking, so I wouldn't call it "cheating".

I think ATi fixed what developers should have done themselves. I understand that nVidia had a much bigger market share at the time and there was no point in using FP10 on their hardware (because it was as slow as FP16), but the situation has changed: more gamers use ATi hardware now, and that hardware offers full-speed FP10. I see no reason to avoid FP10. It doesn't impact image quality in most situations and offers better performance.

Anyway, I don't think the situation will change, because GF104, GF106 and likely all new nVidia GPUs support full-speed FP16, so developers will likely not bother with FP10... again.
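
For anyone wondering what the "demotion" actually buys: an FP16 render target (R16G16B16A16) is 8 bytes per pixel, while the "FP10" / R11G11B10 format is 4, so swapping formats halves the render-target footprint and bandwidth at the cost of the alpha channel and some mantissa bits. A rough illustration at 1920x1200 (the resolution is just an example I picked):

Code:
#include <cstdio>

int main() {
    const int w = 1920, h = 1200;
    const int fp16_bpp = 8;  // R16G16B16A16_FLOAT: 4 channels x 16 bits
    const int fp10_bpp = 4;  // R11G11B10_FLOAT: no alpha, 6/6/5-bit mantissas
    printf("FP16 target: %.1f MB per surface\n", w * h * (double)fp16_bpp / (1024 * 1024));
    printf("FP10 target: %.1f MB per surface\n", w * h * (double)fp10_bpp / (1024 * 1024));
    return 0;
}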
 
There's no problem with legitimately using FP10 when the dev calls for it. The argument comes from AMD doing it behind the dev's back in their drivers, akin to the shader replacement arguments from a few years ago.
 