R520 = Disappointment

Well, I honestly don't see the 512 MB XT falling in price to compete with the 256 MB 7800 GTX. The 256 MB version of the XT would be far more likely to.
 
Chalnoth said:
You know what really pisses me off about this? The entire reason nVidia no longer supports their nearly angle-independent anisotropic is ATI's own angle-dependent anisotropic, which most users seemed to claim had a minor image quality impact at most.

And now people are saying that ATI's new anisotropic filtering is fantastic? The two stances don't add up.

Granted, I've always been in the camp of angle-independent anisotropic, and was really disappointed by the NV4x, but it just upsets me that the response seems to be much more active now than it was back when ATI was using its highly angle-dependent implementation and nVidia wasn't.

Yeah, it's ironic that Nvidia gave up their algorithm for ATI's, and now ATI is moving to the (angle-independent?) one. But doesn't ATI do it for a fraction of the performance hit that Nvidia used to take?

Well, in the long run they'll both end up doing it.
 
Chalnoth said:
And now people are saying that ATI's new anisotropic filtering is fantastic? The two stances don't add up.

I dunno, I'd want to see performance numbers that I haven't seen yet before I started yelling "hypocrisy" on this issue. Both the relative hit and the absolute FPS with it enabled. Context is all, and it seems to me that ATI has been pretty consistent with the "usable features" theme over the last several years, meaning the performance is there to do it at playable framerates, not just "for show".
 
Chalnoth said:
You know what really pisses me off about this? The entire reason nVidia no longer supports their nearly angle-independent anisotropic is ATI's own angle-dependent anisotropic, which most users seemed to claim had a minor image quality impact at most.

And now people are saying that ATI's new anisotropic filtering is fantastic? The two stances don't add up.

Granted, I've always been in the camp of angle-independent anisotropic, and was really disappointed by the NV4x, but it just upsets me that the response seems to be much more active now than it was back when ATI was using its highly angle-dependent implementation and nVidia wasn't.

Yes, but you're missing a key thing here: Nvidia's AF quality has effectively worsened over the last two cores; from the FX to the NV40 to the G70, it has been degrading. ATI's AF quality has not done this from core to core, and that continues with this one.
 
geo said:
I dunno, I'd want to see performance numbers that I haven't seen yet before I started yelling "hypocrisy" on this issue. Both the relative hit and the absolute FPS with it enabled. Context is all, and it seems to me that ATI has been pretty consistent with the "usable features" theme over the last several years, meaning the performance is there to do it at playable framerates, not just "for show".


Bingo..


Beats me why several "knowledgeable reviewers" benchmarked using Nvidia's default AF quality - which is a piece of s&^t IMHO.
 
SugarCoat said:
Yes, but you're missing a key thing here: Nvidia's AF quality has effectively worsened over the last two cores; from the FX to the NV40 to the G70, it has been degrading. ATI's AF quality has not done this from core to core, and that continues with this one.
That's because ATi has used angle-dependent AF algorithms on every single core, while NVidia didn't use one on the FX, and when people didn't complain about ATi's AF they decided to do the same. Now ATi does what NVidia used to do.
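
For anyone wondering what "angle-dependent" actually means in practice, here's a toy sketch in Python (the fall-off numbers are completely invented by me, not ATi's or NVidia's real hardware logic): the requested aniso degree is granted in full only near the preferred angles (multiples of 45 degrees) and fades toward a floor in between, which is roughly the flower-shaped pattern the AF testers show.

Code:
def effective_max_aniso(surface_angle_deg, requested_aniso=16):
    # Toy model with invented numbers: full anisotropy only near the
    # "preferred" angles (multiples of 45 degrees), fading to a 2x
    # floor at the worst case, 22.5 degrees away.
    dist = min(surface_angle_deg % 45, 45 - surface_angle_deg % 45)
    t = dist / 22.5  # 0 at a preferred angle, 1 at the worst case
    return max(2, round(requested_aniso * (1 - t)))

for angle in (0, 10, 22.5, 45, 60, 90):
    print(angle, "->", effective_max_aniso(angle))

An angle-independent implementation would just return requested_aniso for every angle; that's the whole difference in one line.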
 
Banko said:
That's because ATi has used angle-dependent AF algorithms on every single core, while NVidia didn't use one on the FX, and when people didn't complain about ATi's AF they decided to do the same. Now ATi does what NVidia used to do.

Maybe because they couldn't tell the difference then, but can now? People don't care when it helps performance and doesn't hurt IQ; they do care when it hurts IQ, though.
 
SugarCoat said:
Nvidia's AF quality has effectively worsened over the last two cores; from the FX to the NV40 to the G70, it has been degrading.
How is G70's AF worse than NV40's?
 
geo said:
http://www.beyond3d.com/forum/showthread.php?p=587137#post587137

It strikes me this is the most positive public reaction from developers out of the chute that I've seen on an ATI part in some time... a vote of confidence that has to be worth a little something about the quality of this part.

Did we just hear the bell letting every single gamer out there loose to grab this new graphics card that ATi made and game devs like?

In all honesty, it's stuff like this that the majority of people I deal with come to my shop for. People eat it up when they know developers love a new card. I do too, and this is great news. Hopefully we'll see wide acceptance of all the new features on the X1k cards.
 
ondaedg said:
It is also amazing to me that no one is talking about Sander's amazingly accurate review now that we have hindsight.
You .. just .. shot .. yourself .. in .. the .. foot. ;)

Amazingly accurate, yeah. :rolleyes:
 
Skrying said:
Did we just hear the bell letting every single gamer out there loose to grab this new graphics card that ATi made and game devs like?

In all honesty, it's stuff like this that the majority of people I deal with come to my shop for. People eat it up when they know developers love a new card. I do too, and this is great news. Hopefully we'll see wide acceptance of all the new features on the X1k cards.
Didn't Carmack prefer NV3x over R3x0 for development? Food for thought.

(I am in no way, shape, or form equating R520 with NV30.)
 
Bob said:
How is G70's AF worse than NV40's?


Shimmering was, and still is, far more pronounced. Nvidia has released drivers to correct it. First it was borked at the NV40 launch and fixed via drivers to standards acceptable to most; then the G70 launched plagued with the exact same problem all over again. Even with recent drivers I can still notice a difference in an IQ comparison from NV40 to G70, so the problem very much still exists. It's far more pronounced in some titles. The worst part with angle-dependent AF is that you simply cannot get rid of the shimmering; it's always there, however minuscule or pronounced, and it will cause a problem any time you use that type of AF in games. If Nvidia keeps on this path, which I doubt they will, it wouldn't shock me to see the problem arise a third time on their next core.
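
If you want intuition for why the shimmering never fully goes away once you under-filter, here's a little 1-D toy in Python (stripe texture and numbers invented by me, nothing to do with NVidia's actual driver optimizations): with too few texture samples per pixel, the pixel's value jumps around as the camera creeps forward, and that frame-to-frame jumping is exactly what reads as shimmer on screen.

Code:
def stripes(u, period=0.05):
    # 1-D "texture": alternating black/white stripes of the given period.
    return 1.0 if (u / period) % 1.0 < 0.5 else 0.0

def pixel_value(u_center, footprint, taps):
    # Average `taps` point samples spread across the pixel's footprint.
    return sum(stripes(u_center + footprint * (i / (taps - 1) - 0.5))
               for i in range(taps)) / taps

# One pixel covering ten stripes, re-sampled as the camera creeps forward.
# Under-filtered (2 or 4 taps), the value swings between frames, which on
# screen reads as shimmering; 64 taps settles near a stable 0.5.
for taps in (2, 4, 64):
    frames = [pixel_value(0.3 + off, footprint=0.5, taps=taps)
              for off in (0.0, 0.01, 0.02, 0.03)]
    print(taps, "taps:", ["%.2f" % v for v in frames])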
 
Re my point above on performance with HQ AF, I'm still trying to catch up on reviews... but I just ran across this at [H] re the XL on HL2:

We did some testing to see how much High Quality AF hurt performance and we found only a difference of around a few FPS, so this isn’t a huge burden on this video card.
 
geo said:
Re my point above on performance with HQ AF, I'm still trying to catch up on reviews... but I just ran across this at [H] re the XL on HL2:
Isn't it going to vary completely from game to game (i.e., it depends entirely on how many surfaces sit at angles that now get AF they otherwise wouldn't)?
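
Back-of-envelope, with completely invented numbers: if the HQ mode only adds filtering work on the pixels whose surfaces sit at the angles the angle-dependent mode used to short-change, the overall hit just scales with how common those surfaces are in a given scene.

Code:
def hq_af_hit(off_angle_fraction, extra_cost=0.5):
    # Invented toy model: only pixels at "off" angles pay extra
    # filtering cost, so the frame-time hit scales with their share.
    return off_angle_fraction * extra_cost

# Corridor shooter, mostly axis-aligned walls and floors: tiny hit.
print(f"{hq_af_hit(0.05):.1%}")   # 2.5% -> 'a few FPS'
# Terrain-heavy outdoor game, lots of oblique surfaces: a real hit.
print(f"{hq_af_hit(0.40):.1%}")   # 20.0%

So one HL2 datapoint wouldn't tell you much about, say, an outdoor terrain engine.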
 
The Baron said:
Didn't Carmack prefer NV3x over R3x0 for development? Food for thought.

(I am in no way, shape, or form equating R520 with NV30.)

Sure, but it has "tiebreaker" value, no? Much like NV40 got tiebreakers for features. Didn't we just have this conversation, in fact? :p
 
The Baron said:
Didn't Carmack prefer NV3x over R3x0 for development? Food for thought.

(I am in no way, shape, or form equating R520 with NV30.)

True, but in all honesty, I think he loves OpenGL more than anything and that makes him love Nvidia cards.

I like new stuff. I was very happy with the NV40 cards because of some of the forward-thinking features they had, and I'm really happy with R5x0 and its forward-thinking features. I enjoy features that are useful now, and in the future, when the other guy isn't offering them. Hehe, that's why I bought a 6800, and now I'm going to buy an X1800XL... once I know they OC well. Lol.
 
Skrying said:
True, but in all honesty, I think he loves OpenGL more than anything and that makes him love Nvidia cards.
Keep in mind that NV3x had fewer restrictions on the theoretical use of its PS units (e.g., longer instruction counts, silly things that were part of ps_2_a, etc.); it's just that using them for anything but development was, uh, dumb.
 
The Baron said:
Isn't it going to vary completely from game to game (i.e., it depends entirely on how many surfaces sit at angles that now get AF they otherwise wouldn't)?

Mebbee. I went with what I had. I know you'll believe me when I say I'd have come back here with an opposite indicator, too. So, somewhat better than anecdotal (because Brent is a trained observer), but I'd agree it's not conclusive.
 