martrox said: Ummm... looks like the 5600 Ultra gets its head handed to it by the 9600 Pro...
http://www.hardocp.com/article.html?art=NDU4
ATI Radeon 9600 Pro Reference Card - Operating at default clock speeds (400/300) using WHQL driver version 7.84.10. "Quality" settings selected within drivers for all options.
ATI Radeon 9500 Pro - Operating at default clock speeds (277/270) using Catalyst 3.2 drivers. "Quality" settings selected within drivers for all options.
That's funny... All the reviews I'm seeing where they actually use the same IQ settings for both cards, the Ultra gets absolutely crushed. It's not even close. The only benchmarks Nvidia wins are when they are allowed to use their Xabresque AF settings.
In French, but I think you can have a look at the graphs. I do understand the last boost on the 9600 Pro; otherwise it would have been difficult to compete with the 5600 Ultra :/
Well, I generally find HFR reviews better than Anand/[H] reviews, including this one. But well, you are Hellbinder[CE].
What's this 7.84.10 driver? It makes the 9600 Pro outperform the 9500 Pro on Cat 3.2 in Splinter Cell.
Hellbinder[CE] said: The only benchmarks Nvidia wins are when they are allowed to use their Xabresque AF settings.
martrox said: Ummm... looks like the 5600 Ultra gets its head handed to it by the 9600 Pro...
Note that they used the "application" anisotropic settings.
Chalnoth said: As the article stands, they should at least have benchmarked the FX 5600's 8x aniso vs. the Radeon 9600's 16x aniso, in order to offset the off-angle problems that the Radeon has.
Chalnoth said: As it turns out, setting the slider to "optimal" or "aggressive" even on a GeForce4 improves performance. This says to me that setting the FX 5600 to application also gets rid of lots of improvements made back in the GeForce3/4 era that don't significantly reduce image quality.
Chalnoth said: This says to me that setting the FX 5600 to application also gets rid of lots of improvements made back in the GeForce3/4 era that don't significantly reduce image quality.
Looking at the pics in Ante P's latest 5800U review, it seems nV just lowered the FX's AF quality. His pics show the GF4 (at the same driver setting as the NV30) with MIP-maps pushed much further back than the FX's.
The application mode for the 5600U is faster than a Ti4200. I don't think they've removed improvements; clearly they've added some, as the 5600U has a much lower texture fill-rate.