NVIDIA GF100 & Friends speculation

With the extra ~5% performance from Catalyst 10.3a, some overclocking should be enough to catch up with the 480. But wasn't it said earlier that AMD won't factory-overclock any more? Partners will, though. They can, can't they?
 
OK, some first hand info, but my source is still under NDA, so not much in-depth details.

GTX470 is ~10% faster than stock HD5870, benched in Crysis.
The noise level under load is similar to GTX285 -- noticeable but not irritating.

I'll post more if something comes along. ;)
 
It'd be a huge dropped ball if they decide not to. ATI has never been good at pressing an advantage. If they really do have Nvidia down on Fermi performance, they should go for the KO, not start dancing around the ring trying to play it cool...


Maybe any refresh won't have much of a life because AMD have something bigger up their sleeves for this Autumn?
 
Maybe any refresh won't have much of a life because AMD have something bigger up their sleeves for this Autumn?

Still, the 5870 is $400 and the 5970 is $600. There's a nice large margin in there for a part that is 20% faster than the 5870. If all it takes is an official 5870 at 1.1 GHz and AMD can do it, they should, so they bring in more money.
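The pricing headroom being described can be sketched with trivial arithmetic. Only the $400/$600 prices and the 20% figure come from the post; the linear price-per-performance model is purely illustrative.

```python
# Illustrative positioning of a hypothetical faster HD5870 refresh,
# using the prices quoted in the post above.
hd5870_price = 400       # from the post
hd5970_price = 600       # from the post
refresh_speedup = 0.20   # "20% faster than the 5870"

# Naive linear model: price the refresh at the HD5870's
# dollars-per-relative-performance, just to show the headroom.
linear_price = hd5870_price * (1 + refresh_speedup)
print(linear_price)  # 480.0, comfortably inside the $400-$600 gap
```

Even this naive model leaves the hypothetical part well clear of the 5970's price, which is the post's point about margin.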
 
OK, some first hand info, but my source is still under NDA, so not much in-depth details.

GTX470 is ~10% faster than stock HD5870, benched in Crysis.
The noise level under load is similar to GTX285 -- noticeable but not irritating.

I'll post more if something comes along. ;)

2560 + 4xAA again?
 
What for? There are plenty of new games where Cypress isn't scaling as wonderfully as you suggest.
Likely for the same reason as Quake 3 wouldn't scale. You have to prove the game can scale before you can use it for this analysis.

e.g. I showed:

http://forum.beyond3d.com/showthread.php?p=1363207#post1363207
http://forum.beyond3d.com/showpost.php?p=1363207&postcount=1675

that Vantage Extreme is scaling fairly well on ATI, but the shape of the fillrate graphs indicates that something that doesn't vary with screen resolution is a bottleneck, likely shadow map rendering rather than setup.

The shape for NVidia indicates a tendency towards not being shadow map rendering bottlenecked in GT2.

Overall you can say these tests are reasonable indicators of performance scaling within an architecture.

As I posted earlier, GT1 is 5% faster on HD5870, while GT2 is 4% faster on GTX480, implying that NVidia has retained a slight Z rate advantage, but overall NVidia has essentially lost the fairly significant lead it had in Vantage Extreme.
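The "essentially lost the lead" conclusion can be sanity-checked from the two percentages quoted above. The 5% and 4% figures are the post's; combining them with a geometric mean is my own illustrative choice, not anything from the benchmark itself.

```python
# Rough sanity check of the Vantage Extreme deltas quoted above.
gt1_hd5870_lead = 0.05  # GT1: HD5870 ~5% faster than GTX480 (from the post)
gt2_gtx480_lead = 0.04  # GT2: GTX480 ~4% faster than HD5870 (from the post)

# Per-test performance ratios, GTX480 relative to HD5870:
gt1_ratio = 1 / (1 + gt1_hd5870_lead)  # GTX480 behind in GT1
gt2_ratio = 1 + gt2_gtx480_lead        # GTX480 ahead in GT2

# Geometric mean of the two ratios:
combined = (gt1_ratio * gt2_ratio) ** 0.5
print(f"GTX480 vs HD5870 across GT1+GT2: {combined:.3f}")
```

The combined ratio lands within about half a percent of parity, consistent with the claim that the two cards are now essentially level in Vantage Extreme.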

Does Vantage Extreme say much about game performance on average? If it shows very good scaling for Evergreen architecture, but games on average don't, is that because the architecture is failing or is it because the games are scaling badly? If it indicates GTX285 has more of a lead over HD4890 than actually shows in games, what then? etc.

Jawed
 
Again, you're selecting titles for inclusion in the analysis based on whether they "scale" or not. Since the purpose of the analysis is to determine whether an architecture scales, you're just sweetening the pot and your results aren't meaningful. That's the equivalent of looking at the past performance of fund managers, picking the best performers, and then claiming you're great at picking good fund managers.

In any case what's the definition of a game that doesn't scale? Resolution isn't the only determinant of workload or performance so that can't be the only measuring stick. I would argue that if an architecture doesn't address bottlenecks that in itself is an issue with the architecture, not the application.
 
Quite extreme: 466W whole-system power consumption with the card at full load. When I tested the HD5870 earlier, the system drew 33xW, so the two differ by nearly 130W. But what TDPs do the two companies' official PPTs give? 188W TDP for the HD5870 and 250W TDP for this card, so who is lying?
As for temperature: in under a minute at full load it shot up to 98°C, then the fan started speeding up and the temperature gradually fell to 92°C, after which it didn't change again. (Room temperature: 20°C.)
nApoleon, posted 2010-3-23 09:14
GTX480 machine power consumption at full load is nearly 130W more than the HD5870 machine; full-load temperature is 92°C.
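The discrepancy the poster is pointing at can be shown with back-of-the-envelope arithmetic. The 466W, 250W, and 188W figures are from the quoted post; the HD5870 system figure was given only as "33xW", so 335W is an assumption here, as is the ~87% PSU efficiency used to discount wall-socket draw.

```python
# Back-of-the-envelope check of the quoted wall-socket figures.
# Whole-system draw includes CPU, board, and PSU losses, so the
# card-to-card delta at the wall only loosely tracks the TDP delta.
gtx480_system_w = 466   # full-load system draw, from the post
hd5870_system_w = 335   # "33xW" in the post; 335 is an assumed value
tdp_gtx480 = 250        # NVIDIA's official TDP, per the post
tdp_hd5870 = 188        # AMD's official TDP, per the post

measured_delta = gtx480_system_w - hd5870_system_w  # delta at the wall
tdp_delta = tdp_gtx480 - tdp_hd5870                 # delta on paper

# Discount the wall delta by an assumed ~87% PSU efficiency to
# estimate the DC-side difference actually drawn by the cards.
dc_delta_estimate = measured_delta * 0.87
print(measured_delta, tdp_delta, round(dc_delta_estimate, 1))
```

Even after discounting PSU losses, the estimated card-side gap is roughly double the 62W gap implied by the official TDPs, which is exactly the poster's "who is lying?" complaint.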
 
Oh God, that sounds incredibly bad. It also says the temperature rose to as much as 98°C before the fan kicked in and brought it back down to 92°C.
 
From the same thread(Post #32):
Way higher than the HD5970; when I tested the HD5970 before, it came in at 403W.

As he got 466W for the GTX480 system, it seems his GF100 experienced some kind of thermal runaway.

I guess overclocking is looking like a rather hazardous activity...

Edit: He should double-check that his cooler is attached properly and retry his testing with the fan forced to 100% (and some earplugs ;)) to see if he can get something more reasonable. You can't let 40nm parts get as hot as before; bad things happen.
 