There's an article entitled "3d Graphics: Quality Matters" here:
http://www.extremetech.com/article2/0,3973,1225676,00.asp
I have to say I've seen a few bizarre spins on 3d image quality, but I don't recall ever reading one quite as strange as this. Basically, the article takes a 9800P and a 5900U and decides that the best way to judge IQ is to pick arbitrary frame rates as targets and then load the two cards with *different* IQ settings so that their frame rates normalize around those targets (well, as closely as possible). There are no screenshots or other examples of IQ anywhere in the article.
Now, in one case the GFFX is run at 4xFSAA/8xAF and contrasted with the R9800P running at 6xFSAA/16xAF, which brings the R9800P to ~110 fps and the GFFX to ~150 fps. The conclusion of the article states that the "3d quality" between the products in that game is a "draw." Nowhere is there a single screenshot illustrating why the author might have reached this conclusion, nor does he ever explain the methodology he used in making his decision (that I saw, anyway).
Basically, I saw nothing in the article that might justify its title... IMO, it was the strangest attempt I have ever seen to communicate the PR concept of "little if any difference" between the 9800P and the 5900U. It contained nonsensical statements like "We were surprised at how we had to load up the 9800P to bring it down to our framerate targets." The idea that the image quality should be better on the 9800P at 6xFSAA/16xAF than on the GFFX at 4xFSAA/8xAF never seemed to enter the author's mind... It's beyond me, but maybe I've missed something. Anyone else find it as strange?