I still think COJ is a very biased benchmark that shouldn't be trusted at all.

If by biased you mean lopsided, then I agree. Just like Lost Planet.
Well this is old news and OT but take a read.
Courtesy of Guru3d.
http://www.guru3d.com/newsitem.php?id=5464
But GTX260 SLI will be $800, and could well drop to $700 before long (according to some predictions). That's a long way below $1300.
Assassin's Creed in reverse?
Maybe, but it's a bit different when you compare the two cases. AC did have a DX10.1 path that provided performance benefits to ATi cards (both cards, i.e. the 8800GT and HD3870, provided enough performance to fully play AC at decent settings), but it was buggy and showed rendering problems in some situations. Ubi took the easiest route (as devs) of fixing this by removing DX10.1 completely.
Whereas in COJ's case, there was no adequate reason behind the changes. In fact, the initial benchmark saw nVIDIA cards performing a little better than their AMD counterparts. Interestingly, AMD bought the COJ benchmark, including its publishing rights and whatnot. Soon after came a patch that enabled what's listed at Guru3d, which seriously crippled nVIDIA cards, as seen in most reviews out there today. IQ comparisons were done, but the supposedly improved IQ claimed by Techland was so minuscule that it made no sense to disable certain features, such as completely removing the hardware AA path. Other observations were also made, such as no AF being applied on R600 hardware.
I still think COJ is a very biased benchmark that shouldn't be trusted at all.

As opposed to all the remaining titles with benchmark modes that are part of the TWIMTBP program?
Maybe ATI should be issuing statements about each of them too, so that they can get people to doubt their validity?
Whoa, that explains a lot, no wonder NVIDIA cards are so far behind in that benchmark.

Actually, the developer responded to nVidia and made them look like idiots. The basis of nVidia's complaint was the use of shader-assisted AA resolve, which nV chips aren't optimized for; although CoJ is perhaps the one and only game using it, from a Direct3D 10 point of view it's a standard procedure. Apart from that, nVidia complained about texture filtering, to which the developers replied that it actually benefits nVidia's GPUs (and we all know R6xx suck with AF, don't we?). Yet I agree that Call of Juarez should not be used as a game benchmark, since the game is so hopelessly boring.
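For anyone wondering what "shader-assisted AA resolve" actually means in practice: instead of letting the hardware do its fixed box-filter MSAA resolve, the application reads the individual sub-samples itself (Texture2DMS.Load in D3D10 HLSL) and filters them in a pixel shader, typically so the resolve can be weighted correctly for an HDR pipeline. Below is a rough CPU-side sketch of that idea only; it is not Techland's actual shader, and the luminance-based weighting is an assumption about why a developer would bother.

```cpp
// Illustrative CPU-side sketch of a custom ("shader-assisted") MSAA resolve,
// as opposed to the fixed-function box-filter resolve the hardware provides.
// Not taken from Call of Juarez -- the weighting choice is an assumption.
#include <vector>

struct RGB { float r, g, b; };

// Luminance-based weight so very bright HDR samples don't dominate the average,
// which is the usual motivation for resolving MSAA in a shader.
static float resolveWeight(const RGB& s) {
    float luma = 0.2126f * s.r + 0.7152f * s.g + 0.0722f * s.b;
    return 1.0f / (1.0f + luma);
}

// Resolve one pixel from its MSAA sub-samples. In a D3D10 pixel shader this
// would be a loop over Texture2DMS<float4>::Load(coord, sampleIndex).
RGB resolvePixel(const std::vector<RGB>& samples) {
    if (samples.empty()) return {0.0f, 0.0f, 0.0f};
    RGB sum{0.0f, 0.0f, 0.0f};
    float wSum = 0.0f;
    for (const RGB& s : samples) {
        float w = resolveWeight(s);
        sum.r += s.r * w;
        sum.g += s.g * w;
        sum.b += s.b * w;
        wSum += w;
    }
    return { sum.r / wSum, sum.g / wSum, sum.b / wSum };
}
```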
=>CarstenS: Well, I haven't been digging into the details, but nVidia's accusations certainly failed to mention your point.

Then I'd suggest you do some digging. We have benchmarked Call of Juarez quite regularly since the DX10 Enhancement Pack came out, and it seems that the Radeons do considerably worse once you actually use a savegame and Fraps, compared to the separate benchmark utility that was provided.
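If anyone wants to try that comparison themselves, the number crunching is trivial: record a frametime log with Fraps during a savegame run and during the built-in benchmark, then turn each log into an average fps. A minimal sketch, assuming the usual two-column "Frame, Time (ms)" layout of a Fraps frametimes file (file handling kept deliberately bare):

```cpp
// Compute average fps from a Fraps-style frametime log:
// column 1 = frame number, column 2 = elapsed time in ms since the run started.
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

int main(int argc, char** argv) {
    if (argc < 2) { std::cerr << "usage: avgfps <frametimes.csv>\n"; return 1; }
    std::ifstream in(argv[1]);
    std::string line;
    std::getline(in, line);                  // skip the "Frame, Time (ms)" header
    std::vector<double> times;               // elapsed milliseconds per frame
    while (std::getline(in, line)) {
        std::istringstream ss(line);
        std::string frame, ms;
        if (std::getline(ss, frame, ',') && std::getline(ss, ms, ','))
            times.push_back(std::stod(ms));
    }
    if (times.size() < 2) { std::cerr << "not enough samples\n"; return 1; }
    double seconds = (times.back() - times.front()) / 1000.0;
    std::cout << "avg fps: " << (times.size() - 1) / seconds << "\n";
    return 0;
}
```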
Wait, so the benchmark utility performance figures are completely different from the actual in-game numbers?

Yes - but we should discuss this in a separate topic if need be.
Anyway, I think we are going too off-topic.
Nvidia founded Team “Whoopass”, which consists of just a few computers running the Folding@Home GPU client. Even with only 4-5 test machines, the team quickly moved into the top 5% of all contributors by sheer processing power.
If I may add - I know this is the R7xx thread, but just for comparison I'll add the GeForce GTX 280 score.
Perlin noise:
HD4850: ~ 335
HD3870: ~ 175
RV670X2: ~ 355
GTX 280: ~ 300
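For context on why a Perlin noise test behaves roughly like a pure ALU benchmark: each output value is just a handful of hashes, dot products and interpolations, so the workload is dominated by shader arithmetic. A minimal classic 2D gradient-noise sketch is below; it is not the benchmark's actual shader, and the hash constants are arbitrary.

```cpp
// Classic-style 2D gradient (Perlin-type) noise, purely to illustrate the
// arithmetic per sample. Not the code any benchmark actually runs.
#include <cmath>
#include <cstdint>
#include <cstdio>

static float fade(float t) { return t * t * t * (t * (t * 6.0f - 15.0f) + 10.0f); }
static float lerp(float a, float b, float t) { return a + t * (b - a); }

// Hash a lattice corner to one of 8 gradient directions (constants are arbitrary).
static int hashCorner(int x, int y) {
    uint32_t h = (uint32_t)x * 374761393u + (uint32_t)y * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return (int)((h ^ (h >> 16)) & 7u);
}

// Dot product of the hashed gradient with the offset (x, y) from the corner.
static float grad(int dir, float x, float y) {
    switch (dir) {
        case 0: return  x + y;  case 1: return  x - y;
        case 2: return -x + y;  case 3: return -x - y;
        case 4: return  x;      case 5: return -x;
        case 6: return  y;      default: return -y;
    }
}

float perlin2d(float x, float y) {
    int xi = (int)std::floor(x), yi = (int)std::floor(y);
    float xf = x - xi, yf = y - yi;
    float u = fade(xf), v = fade(yf);
    float n00 = grad(hashCorner(xi,     yi    ), xf,        yf);
    float n10 = grad(hashCorner(xi + 1, yi    ), xf - 1.0f, yf);
    float n01 = grad(hashCorner(xi,     yi + 1), xf,        yf - 1.0f);
    float n11 = grad(hashCorner(xi + 1, yi + 1), xf - 1.0f, yf - 1.0f);
    return lerp(lerp(n00, n10, u), lerp(n01, n11, u), v);
}

int main() {
    // Evaluate a small grid, the way a pixel shader produces one value per pixel.
    for (int y = 0; y < 4; ++y)
        for (int x = 0; x < 4; ++x)
            std::printf("%.3f%c", perlin2d(x * 0.37f, y * 0.37f), x == 3 ? '\n' : ' ');
    return 0;
}
```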