NVIDIA GF100 & Friends speculation

Best Fermi review, in my opinion:

http://translate.google.com/transla...tocp.com/htmls/59/n-959-22.html&sl=auto&tl=en

[Image: BFBC2 1920, 4xAA benchmark chart]


[Image: GTX 480 power consumption chart]
 
There's a switch in Dirt 2's config XML (the actual filename is something more complicated) which allows you to force DX9 mode. Everyone should have known that by now.
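For illustration only, here's a sketch of flipping such a switch from a script; the filename, element and attribute names below are stand-ins from memory, not confirmed, so check what's actually in the game folder:

[code]
import xml.etree.ElementTree as ET

# Hypothetical names: the real DiRT2 config file and attribute may differ.
CONFIG = "hardware_settings_config.xml"

tree = ET.parse(CONFIG)
directx = tree.getroot().find("directx")   # assumed element name
if directx is not None:
    directx.set("forcedx9", "true")        # assumed attribute name
    tree.write(CONFIG)
[/code]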

AMD warned all the reviewers about the DiRT2 DX9 problems and offered free Steam keys so they could test with the full version.

Imho, that makes you a tool if you didn't.
 
AMD warned all the reviewers about the DiRT2 DX9 problems and offered free Steam keys so they could test with the full version.
Imho, that makes you a tool if you didn't.
Tool or fool? :)
Anyway: especially after AMD's warning, every sensible test should take this information into account, but it was known beforehand that you could go the safe route via the config file.
 
The thing is that the cost adds up. If you're a big torrenter and keep your computer on 24/7, 25W higher idle works out to $20 per year. 120W higher peak power could mean a new power supply, so you need another $50.
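Quick sanity check on that figure; the electricity price is an assumption (~$0.09/kWh), so plug in your own rate:

[code]
# Annual cost of 25 W of extra idle draw, running 24/7.
extra_watts = 25
hours_per_year = 24 * 365                         # 8760 h
price_per_kwh = 0.09                              # USD, assumed rate

extra_kwh = extra_watts * hours_per_year / 1000   # 219 kWh/year
annual_cost = extra_kwh * price_per_kwh
print(f"{extra_kwh:.0f} kWh/year -> ${annual_cost:.2f}/year")  # ~$19.71
[/code]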

You've got no argument from me. My point is that consumers lack real-time power meters to tell them what they're using, and aren't charged a price high enough to dissuade them from waste (look at water wastage too).

With people getting into the kilowatt range now on some of these rigs, you're talking 10 cents per hour of play. If you had a meter on your desk showing the dollars racking up, you might be more conservative. As it is now, the pain is delayed until the bill at the end of the month.
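A minimal sketch of that desk-meter idea, assuming a constant draw and a flat $0.10/kWh tariff (both assumptions; real rigs and real tariffs vary):

[code]
def session_cost(draw_watts: float, hours: float, price_per_kwh: float = 0.10) -> float:
    """Cost in dollars of running at draw_watts for the given hours."""
    return draw_watts / 1000 * hours * price_per_kwh

# A ~1 kW rig racks up about 10 cents per hour of play:
print(f"${session_cost(1000, 1):.2f}/hour")            # $0.10
print(f"${session_cost(1000, 4):.2f} for an evening")  # $0.40
[/code]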

The same thing applies to calories and food. Basically, consumers go through life consuming without looking at the inputs and outputs.
 
Thanks. Seems like the DiRT2 results on the net are a little scandal of their own :oops:

Wonder how Charlie knew this was going to happen.

Have you seen my hint about how AMD warned reviewers about the DiRT2 stuff?

Anyway... that's a *HINT*; you go figure out the link with CD.
 
Feelings I'm getting from different reviews:
- very erratic performance
- super fast in DX9 games
- not so fast in DX11 where tessellation is not overdone
- amazing tessellation performance, but a bit ahead of its time, I think
- too hot, power-hungry and loud!
- very good compute, but I'm still waiting for someone with Fermi to test the OpenCL GPU raytracing app from this thread -> SmallPT

So based on that, I'm very happy I bought my HD5870 at launch below MSRP (only by £10.00, but still :smile:).

The GTX480 will make sense if performance is your only consideration and you either keep the rig in another room or watercool your card.
I think the refresh part will be very decent, but for now it's meh.

Unfortunately for nVidia, in the games I play or plan to play in the near future the GTX480 is within 5% of my card, but it can only OC on air to ~800MHz core (+100MHz), whereas my Sapphire can do 1GHz+ with ease on the default fan profile (which means acceptable acoustics).
 
Wow, isn't it calm and peaceful now that the reviews are out!

Anyhow, I read somewhere that GTX400 RAM clocks were low because nvidia didn't do its memory controller homework and ended up with less pinning for the memory, and thus lower clocks...

What does pinning mean here, exactly?
 
Yes, I saw that too. Ouch!! Basically saying there has to be some sort of trade-off.
It's only a trade-off if >80 degree temperatures significantly impact reliability... if they don't, it's not a trade-off, it's simply the rational engineering choice (ignoring PR for a moment; customers aren't always rational).
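For a feel of what temperature does to reliability, here's the standard Arrhenius-style back-of-the-envelope; the 0.7 eV activation energy is a generic assumption for silicon wear-out, not a GF100 figure:

[code]
import math

def accel_factor(t_low_c: float, t_high_c: float, ea_ev: float = 0.7) -> float:
    """Arrhenius acceleration factor between two junction temperatures (Celsius)."""
    k = 8.617e-5                                   # Boltzmann constant, eV/K
    t1, t2 = t_low_c + 273.15, t_high_c + 273.15
    return math.exp(ea_ev / k * (1 / t1 - 1 / t2))

print(accel_factor(80, 95))   # ~2.5x faster aging going from 80C to 95C
[/code]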
 
Anyhow, I read somewhere that GTX400 RAM clocks were low because nvidia didn't do its memory controller homework and ended up with less pinning for the memory, and thus lower clocks...

What does pinning mean here, exactly?

"Pinning" refers to the number of power and/or I/O pins dedicated to the memory controller on the chip/package.
 