NVIDIA GF100 & Friends speculation

You'd think reviews came out or something recently...

I was a bit surprised by the differences in idle/load power draw and temps.
 
And for DIRT2 benches, all other sites point to better performance on the GTX480, and again at Hardocp they used different settings. Seems quite a biased site that doesn't bench on equal terms.

Are you crazy? HardOCP tests what settings a user will be able to use while still keeping the game smooth - which, for me as a customer, is much more important than just numbers from some dumb in-game benchmark.

I trust Kyle (and Anand, but I prefer the HardOCP type of tests) much more than most other reviewers. NV fanboys are clearly retarded - when Kyle was saying bad things about the 2900XT it was OK, but when he does the same to the GTX470 and 480 he is suddenly biased :rolleyes::rolleyes:
 
Now, for Dirt 2 we unfortunately had to ditch all results we had made in the past, and I think there will be a number of websites out there that made this mistake. The previous timedemo test was based on the DX11 demo of the game. The demo code, however, will only run at DX9 on the GeForce GTX 400.

We swapped the timedemo out for the full version of the game and, you guessed it... our lovely recorded timedemo refused to work. Therefore we moved onwards and started all results completely from scratch to make sure both cards are rendering PROPERLY at DirectX 11.
http://www.guru3d.com/article/geforce-gtx-470-480-review/21
 
I trust Kyle (and Anand, but I prefer the HardOCP type of tests) much more than most other reviewers. NV fanboys are clearly retarded - when Kyle was saying bad things about the 2900XT it was OK, but when he does the same to the GTX470 and 480 he is suddenly biased :rolleyes::rolleyes:

I had actually thought him to be biased against ATI, but given that others have felt the opposite, the reality is that it was my own bias and lack of perspective at the time, not any bias on Kyle's part.
 
Bjorn3D's review has a HUUUUGE problem:

The test setup stated:
"Drivers for ATI GPU's 8.663 Recommended Review Drivers" - that is a driver from LAST YEAR!!
 
Was this posted already? Don't let the numbers on the left irritate you! It's all about the graphs :D

[image: nvidia09.jpg]
 
I would love to see more DirectX 11 tests, like Stalker or Metro with different DX11 feature sets, and maybe a couple of DirectX 11 SDK samples...

Thomas
 
I trust Kyle (and Anand, but I prefer the HardOCP type of tests) much more than most other reviewers. NV fanboys are clearly retarded - when Kyle was saying bad things about the 2900XT it was OK, but when he does the same to the GTX470 and 480 he is suddenly biased :rolleyes::rolleyes:

Kyle can be rough around the edges, but he's just about the only reviewer I trust. Look at all these sites that have used months-old drivers for AMD/ATI. WTF? Did NV send them driver CDs to use for ATI?

And why is it ATI owns in DIRT2 DX11 on [H] but gets slammed on most every other site? DX9 path? How bought and paid for are you if you present data like that?

I'm glad Fermi doesn't suck and, in fact, looks to rock for tessellation and some games. What bothers me is the number of sites that are putting up fishy reviews. As if the signal-to-noise ratio for computer hardware wasn't already bad enough.

Kudos to Kyle and [H] for not being on the take - of either major GPU supplier.
 
And why is it ATI owns in DIRT2 DX11 on [H] but gets slammed on most every other site? DX9 path? How bought and paid for are you if you present data like that?
Alternatively, how stupid and gullible are you if you can be gamed like that?

Kudos to Kyle and [H] for not being on the take - of either major GPU supplier.
I skimmed through the reviews and found that only [H] had a disappointing take on the matter. The rest were saying that the 480 is kinda good.
 
Does anyone know the ROP clock? I haven't seen it mentioned anywhere.
Power draw is rather horrible: TDP of 250W, yeah right - it actually exceeds 300W (and the HD5970). Something really can't be right with that chip: the voltage is quite low, the clocks aren't that high either, not even all units are enabled, and it still draws 300W... And despite really, really low idle clocks (without lowering the voltage, though), idle power draw isn't great either. Maybe there's not much clock gating, or it isn't working...
At least it looks like AF quality, despite the lower number of texture units, is still the same, so no new "tricks" there. It does, however, have a higher performance hit for enabling AF than older chips (http://www.computerbase.de/artikel/hardware/grafikkarten/2010/test_nvidia_geforce_gtx_480/5/).
 
Let me try to compare it with people's troubles with NV30.

http://forum.beyond3d.com/showpost.php?p=1401943&postcount=2518

The conclusion bit from techreport's review.
This would have been a great product had it arrived six months earlier at this same clock speed with lower heat levels, a more reasonable cooler, and lower prices.

Late - check.
Hot - check.
Ginormous cooler - check.
Pricing - everybody seems to think that the 480's price sits above the price/perf curve.

Execution-wise, I think it _is_ NV30/R600-like. It won't be panned like them, sure as hell not. But that is for different reasons. The NV30 came from a halo-at-all-costs era, and the R300 blew it out of the water. The R600 ended up bigger than its intended area as the process was delayed, setting it up against the G80, and got hammered. This time AMD has a smaller die and a quite substantial perf/mm² advantage. So absolute perf is OK-ish, but not on a per-mm² basis.
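To make the perf/mm² point concrete, here is a minimal sketch. The die sizes are the figures commonly cited at the time (~529 mm² for GF100, ~334 mm² for Cypress); the ~12% overall performance lead for the GTX 480 over the HD 5870 is a purely illustrative assumption, not a measured result:

```python
# Illustrative perf/mm^2 comparison: GTX 480 (GF100) vs HD 5870 (Cypress).
# Die sizes are the commonly cited figures; the relative-performance
# number is a hypothetical assumption for the sake of the sketch.
die_mm2 = {"GF100": 529.0, "Cypress": 334.0}
rel_perf = {"GF100": 1.12, "Cypress": 1.00}  # assume 480 ~12% faster overall

# Performance per square millimetre for each chip.
perf_per_mm2 = {chip: rel_perf[chip] / die_mm2[chip] for chip in die_mm2}

# How much more performance Cypress extracts per unit of die area.
advantage = perf_per_mm2["Cypress"] / perf_per_mm2["GF100"]
print(f"Cypress perf/mm2 advantage: {advantage:.2f}x")
```

Even granting GF100 a double-digit absolute performance lead, the ~58% larger die leaves it well behind on a per-area basis, which is the point being made above.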

FWIW, I expect the GDDR5 clock gap to vanish by the time of NI.
 
Nvidia's biggest concern must be: how on earth are they going to sell these as HPC cards? They are clearly huge power hogs, and probably too hot for a full rack.

What do you think? My bet is that there will be no Tesla card until B1.
 
The biggest problem with the NV3x was that its DX9 performance was terrible, which is not the case with Fermi and DX11. Really, there hasn't been as big a fubar as the NV3x since. The R600 had lots of issues, but overall it was still a reasonably performing chip. I guess you could compare the R600 and Fermi, at least from the standpoint of being late and having manufacturing issues.

Really, if NVidia had delivered a 512SP chip at their planned clocks, the fact that perf/mm² isn't as good as AMD's would be a non-issue for most enthusiasts; they'd pay the premium. What is disappointing to people is that they soft-launched a cut-down chip that is obviously super-binned and patched up from bad-yielding silicon.

I wouldn't buy an initial run of the Fermi lots; I'd wait for B1 or whatever. I like the architecture and features, but I don't want to be a guinea pig buying Fermi 1.0. I'd wait for the revision.
 