Chalnoth said: You know what really pisses me off about this? The entire reason nVidia no longer supports its nearly angle-independent anisotropic is ATI's own angle-dependent anisotropic, which most users seemed to claim had at most a minor image-quality impact.
And now people are saying that ATI's new anisotropic filtering is fantastic? The two stances don't add up.
Granted, I've always been in the angle-independent anisotropic camp, and was really disappointed by NV4x, but it just upsets me that the response seems much more active now than it was back when ATI was using its highly angle-dependent implementation and nVidia wasn't.
geo said: I dunno, I'd want to see performance numbers that I haven't seen yet before I start yelling "hypocrisy" on this issue: both the relative hit and the absolute FPS with it enabled. Context is all, and it seems to me that ATI has been pretty consistent with the "usable features" theme over the last several years, meaning the performance is there to do it at playable framerates, not just "for show".
Banko said: That is because ATi has used angle-dependent AF algorithms with every single core, while NVidia didn't use any on the FX, and when people didn't complain about ATi's AF they decided to do the same. Now ATi does what NVidia used to do.
SugarCoat said: Yes, but you're missing a key thing here: Nvidia's AF quality has effectively worsened over the last two cores; from the FX to the NV40 to the G70, it's been downgrading in quality. ATI's AF quality has not done this core to core, and they continue that with this one.
Bob said: How is G70's AF worse than NV40's?
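A quick aside for anyone lost in the jargon: "angle-dependent" AF means the hardware applies less anisotropy when the axis of anisotropy is rotated away from a few preferred directions, which is what draws the "flower" shape in AF tester tools. Below is a minimal Python sketch of the idea. The derivative-based ratio is the standard textbook computation; the falloff curve is purely made up, since no vendor documents the real one.

```python
import math

def aniso_ratio(dudx, dvdx, dudy, dvdy, max_aniso=16.0, angle_optimized=False):
    """Toy model of per-pixel anisotropy selection.

    The two screen-space derivative vectors of the texture coordinates
    span the pixel's footprint in texture space; the ratio of the longer
    extent to the shorter one is the ideal number of AF taps.
    """
    px = math.hypot(dudx, dvdx)            # footprint extent along screen x
    py = math.hypot(dudy, dvdy)            # footprint extent along screen y
    major, minor = max(px, py), max(min(px, py), 1e-8)
    ratio = min(major / minor, max_aniso)  # angle-independent: use as-is

    if angle_optimized:
        # Illustrative angle-dependent clamp: fade the ratio toward 1 as
        # the major axis drifts away from the u/v axes. Real hardware
        # curves are undocumented; this only reproduces the qualitative
        # "flower" behavior.
        mx, my = (dudx, dvdx) if px >= py else (dudy, dvdy)
        angle = math.atan2(abs(my), abs(mx))        # 0 .. pi/2
        off_axis = min(angle, math.pi / 2 - angle)  # 0 at axes, pi/4 at 45 deg
        falloff = (1.0 - off_axis / (math.pi / 4)) ** 2
        ratio = 1.0 + (ratio - 1.0) * falloff
    return ratio

# A 16:1 footprint aligned with the u axis vs. the same footprint at 45 degrees:
print(aniso_ratio(16, 0, 0, 1, angle_optimized=True))       # ~16.0 (full AF)
print(aniso_ratio(8, 8, -0.5, 0.5, angle_optimized=True))   # ~1.0 (barely any)
```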
geo said: http://www.beyond3d.com/forum/showthread.php?p=587137#post587137
It strikes me this is the most positive public reaction from developers out-of-the-chute that I've seen on an ATI part in some time. . . a vote of confidence that has to be worth a little something about the quality of this part.
ondaedg said: It is also amazing to me that no one is talking about Sander's amazingly accurate review now that we have hindsight.
You .. just .. shot .. yourself .. in .. the .. foot.
Skrying said: Did we just hear the bell letting loose every single gamer out there to grab that new graphics card that ATi made that game devs like?
In all honesty, it's stuff like this that the majority of people who I deal with come to my shop for. People eat it up when they know developers love a new card. I do too, and this is great news. Hopefully we'll see wide acceptance of all the new features on the X1K cards.
The Baron said: Didn't Carmack prefer NV3x over R3x0 for development? Food for thought. (I am in no way, shape, or form equating R520 with NV30.)
geo said: Re my point above on performance with HQ AF, I'm still trying to catch up on reviews... but I just ran across this at [H] re the XL on HL2: "We did some testing to see how much High Quality AF hurt performance and we found only a difference of around a few FPS, so this isn't a huge burden on this video card."
The Baron said: Isn't it going to vary completely from game to game (i.e., it depends entirely on how many angles are getting AF that wouldn't otherwise get as much)?
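The Baron's objection is easy to make concrete: the extra cost of HQ (angle-independent) AF depends on what share of a given scene's pixels the angle-optimized path was shortchanging. A toy model in the same spirit as the aniso_ratio() sketch above, with invented scene mixes rather than measured data:

```python
def extra_cost(scene):
    """Relative texture-sampling cost of HQ (angle-independent) AF versus
    angle-optimized AF.

    `scene` is a list of (fraction_of_pixels, ideal_ratio, optimized_ratio)
    triples: the share of pixels covered, the AF degree full-quality
    filtering would apply, and the degree an angle-optimized path applies.
    All numbers are invented for illustration, not measured from any game.
    """
    hq = sum(frac * ideal for frac, ideal, _ in scene)
    opt = sum(frac * clamped for frac, _, clamped in scene)
    return hq / opt

# Hypothetical corridor shooter: mostly axis-aligned floors and walls.
corridor = [(0.7, 8.0, 8.0), (0.3, 4.0, 2.0)]
# Hypothetical outdoor game: lots of rotated, sloped terrain.
terrain = [(0.3, 8.0, 8.0), (0.7, 8.0, 2.0)]

print(extra_cost(corridor))  # ~1.10: HQ AF costs almost nothing extra
print(extra_cost(terrain))   # ~2.11: same setting, much bigger hit
```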
Junkstyle said: Yeah, Anandtech has it.
http://www.anandtech.com/video/showdoc.aspx?i=2551
AnandTech: ATI's Avivo vs. NVIDIA's PureVideo: De-Interlacing Quality Compared
The Baron said: Didn't Carmack prefer NV3x over R3x0 for development? Food for thought. (I am in no way, shape, or form equating R520 with NV30.)
Skrying said: True, but in all honesty, I think he loves OpenGL more than anything, and that makes him love Nvidia cards.
Keep in mind that NV3x had fewer restrictions on the theoretical use of its PS units (e.g., longer instruction counts, silly things that were part of ps_2_a, etc.); it's just that using them for anything but development was, uh, dumb.
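For context on those "fewer restrictions": ps_2_a was the D3D9 compile target matching NV3x capabilities, and it relaxed several hard limits of baseline ps_2_0. The limits below are recalled from the DirectX 9 documentation and should be treated as approximate; the fits() helper is hypothetical, just to show the difference in budgets.

```python
# Approximate D3D9 pixel shader profile limits (recalled from the DX9
# docs -- illustrative, not authoritative).
PROFILE_LIMITS = {
    "ps_2_0": {"instructions": 64 + 32,  # 64 arithmetic + 32 texture
               "dependent_reads": 4,
               "arbitrary_swizzle": False,
               "predication": False},
    "ps_2_a": {"instructions": 512,
               "dependent_reads": None,  # effectively unrestricted
               "arbitrary_swizzle": True,
               "predication": True},
}

def fits(profile: str, instruction_count: int) -> bool:
    """Hypothetical helper: would a shader of this length fit the
    given profile's instruction budget?"""
    return instruction_count <= PROFILE_LIMITS[profile]["instructions"]

print(fits("ps_2_0", 200))  # False: too long for the baseline profile
print(fits("ps_2_a", 200))  # True:  the NV3x-era profile allows it
```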