another Dave Orton interview

DemoCoder said:
Depends on how you count usable features. NV3x was bashed over the head for 18 months for lack of HDR support. No games used it, and Valve is stating they won't use it in HL2 because it's too slow on R300. (I speculate they might use it in rare circumstances or cut scenes where it won't perform badly)

nv3x was bashed for a number of really good reasons; I don't recall lack of HDR ever being a significant part of that.

When did valve say they were not going to support hdr? Got a link?

DemoCoder said:
The status of 3Dc, FP blending/filtering, geometry instancing, SM3.0, et al, are to be determined in the future as to whether they are "usable". I mean, how many games are using boat loads of normal maps over the last 2 years and need hi-res ones? FarCry, HL2, and D3 are probably the first, and D3/HL2 ain't out yet.

For the near future it's going to come down to how easy it is to implement some of these features. Games written from the ground up with any of these in mind are 2-4 years away. Expect to see only glimpses of what SM3, 3Dc, etc. are capable of in the interim.

DemoCoder said:
The only real universally used features are AA and AF, and I think NV40's 4xAA and 16xAF are eminently usable and good enough in IQ. Sure, sometimes 6x can be switched on with R420 depending on game and resolution, and sometimes temporal AA will work as planned in some games. But you know, sometimes HDR and SM3.0 might be usable as well. We don't know yet.

"Good enough" is entirely subjective; it's like saying shader lengths of 32 are good enough. You can do a lot of nice things with it, but more is always better. The question is, what is the cost?

The answer for ATi was apparently that they would rather have the IQ and performance now than the unknown future potential of sm3.0. It wasn't the best choice for every user, but neither are the lower performance, lower IQ, and higher power consumption of the nv40. Both companies made choices. Time will tell who made the right ones.
 
What makes you think that the NV40 has lower IQ? I'd say that, using comparable settings, both the NV and ATI cards are extremely close in image quality this go around. Most reviewers have given the slight edge to ATI in AA quality, and most reviewers have given the slight edge to NV in AF quality.

Also, when you talk about performance, it totally depends on what game you are talking about. The X800 XT PE is generally slightly faster than the 6800 Ultra when at 1600x1200, 4xAA/16xAF, but this is not the case in every game, especially not in OpenGL games. You can see MikeC's chart at NVNews that compares performance victories across games. I also have a feeling that NV has a lot of room to improve performance once they figure out how to properly optimize for this new architecture and move away from some of the NV3x optimization techniques.
 
jimmyjames123 said:
I think that the "nonsense" that he was talking about was the fact that people were spreading the rumor that the 6800 Ultra requires a 480 watt power supply.

Which is also the same nonsense nvidia told reviewers in their guide for the 6800U.

jimmyjames123 said:
I think you are arguing semantics here, and maybe reading too much into the statement. What Jen-Hsun seems to be saying is that the 6800 was/is designed for the hardcore gamer, who would be interested in overclocking the cards and pushing them to the max. To account for headroom above the core clock frequency, they needed to add the second molex connector.

Of course that is what he is trying to spin it as.

jimmyjames123 said:
What have been your experiences running the 6800 Ultra with one molex connector so far? Have you followed up yet with other reviewers regarding what their experience has been?

read this thread for some info on what happens with one molex connected.

jimmyjames said:
I'm not sure how much one power cord can supply, but the single molex/single slot 6800 GT seemed to run just fine using PSUs that would be appropriate for a 9800XT/5950U (i.e., 350 watt PSUs). FS was even able to overclock their 6800 GT to slightly beyond 6800U levels.

For how long? Most of the GT reviews went up a few hours after sites received the boards; I would be interested to see some long-term results.

Interestingly, he also states that every one of their boards will ship at the same speed, despite the 450MHz Ultras (not Ultra Extremes, according to UK PR) going to certain select vendors.

jimmyjames said:
Isn't this again just arguing semantics? The 450MHz boards sent to reviewers were not marketed as 6800 Ultras, but rather as 6800 Ultra Extremes. The 350MHz boards were marketed as 6800 GT. This was shown clearly in the few reviews that included these cards.

I think it is fair to say that Jen Hsun was spinning some things in favor of NV, but that is not so different from what many other CEO's do with their own companies, including ATI.

It's not semantics. Dave just told you they shipped some boards marked as Ultra cards at 450mhz, despite JHH's claim to the contrary.
 
It's not semantics. Dave just told you they shipped some boards marked as Ultra cards at 450mhz, despite JHH's claim to the contrary.

I guess I'm not sure what you mean by "marked as Ultra" cards. The 450MHz cards are Ultra cards, and from everything I have read, it is pretty clear that they will be an "Extreme" Ultra variant, to be priced higher than the 6800 Ultra. The final price has not been agreed upon, but it is very clear that these 450MHz cards are not the same as regular 400MHz Ultra cards. This issue is not the same as sending out review samples where what are supposed to be the exact same cards have varying frequencies.

Is it true that ATI sent out X800 cards to reviewers with different core and mem clock speeds? If so, why?
 
jimmyjames123 said:
What makes you think that the NV40 has lower IQ? I'd say that, using comparable settings, both the NV and ATI cards are extremely close in image quality this go around. Most reviewers have given the slight edge to ATI in AA quality, and most reviewers have given the slight edge to NV in AF quality.

At max IQ settings, the ati card will look better. ATi still has a minute advantage on AA quality and offers a quite usable 6xAA mode. It also offers temporal AA which may give some users a significant IQ benefit in certain cases. ATi has a clear advantage here, however small it may be. The only thing nvidia could claim is that the supersampling in their 8XAA will offer advantages in some situations; the problem is that it is mostly too slow to be useful.
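The cost problem with the supersampled modes is easy to see from a crude count of shader work. A rough sketch in Python (an illustrative model, not vendor numbers, and I'm assuming the mixed mode layers 2x supersampling on top of 4x multisampling, as reviews have described it):

# Crude cost model: multisampling shades each pixel once, while
# supersampling re-shades everything, which is why mixed SS+MS modes
# get so expensive at high resolutions.
def shader_invocations(width, height, ss_factor):
    return width * height * ss_factor

print(shader_invocations(1600, 1200, 1))  # 4x MS alone: 1,920,000 shader runs
print(shader_invocations(1600, 1200, 2))  # 4x MS + 2x SS: 3,840,000, double the work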

jimmyjames said:
Also, when you talk about performance, it totally depends on what game you are talking about. The X800 XT PE is generally slightly faster than the 6800 Ultra when at 1600x1200, 4xAA/16xAF, but this is not the case in every game, especially not in OpenGL games. You can see MikeC's chart at NVNews that compares performance victories across games. I also have a feeling that NV has a lot of room to improve performance once they figure out how to properly optimize for this new architecture and move away from some of the NV3x optimization techniques.

In games where the nv40 finished ahead, what was the framerate of the r420? 90? 100? I am sorry, but winning in a Quake 3-based game is hardly important when you are exceeding the refresh rate of most displays. I will believe in mystical performance improvements when I see them. Will they come with brilinear and _pp, or will they actually fix the IQ in games like farcry and still be somewhat competitive with the X800 cards?
 
At max IQ settings, the ati card will look better. ATi still has a minute advantage on AA quality and offers a quite usable 6xAA mode.

At 6xAA on ATI vs 4xAA on NV, the ATI card will naturally look slightly better.

It also offers temporal AA which may give some users a significant IQ benefit in certain cases.

Yes, this is an advantage in games where framerate can be kept sufficiently high.

Temporal AA and a useable 6xAA setting are what help to give ATI the slight edge in AA quality this time around. These features can be taken advantage of on the R3xx cards too.

But then again, AA is only one part of the picture. NV still arguably has the edge in AF quality. Image quality between ATI and NV in general will be similar in my opinion.

In games where the nv40 finished ahead, what was the framerate of the r420? 90? 100? I am sorry, but winning in a Quake 3-based game is hardly important when you are exceeding the refresh rate of most displays.

There were some games where the X800 XT finished ahead and where the 6800 Ultra was still around 100 fps. The 6800 Ultra also was arguably as fast or faster in games like Call of Duty, RTCW, Halo, Jedi Academy, NWN, Prince of Persia, and Serious Sam. And the differences in most other games were relatively small. The X800 XT PE generally had a slight edge at 1600x1200 with 4xAA/16xAF over the 6800 Ultra, but then again it depends entirely on the game. I think there is little question that the X800 and 6800 cards are peers in terms of overall performance. One card is not inherently superior to the other with respect to performance this go around.

I will believe in mystical performance improvements when I see them. Will they come with brilinear and _pp, or will they actually fix the IQ in games like farcry and still be somewhat competitive with the X800 cards?

Obviously no one has an answer to that right now. I have a feeling that people will not be disappointed, but pure speculation at this time.
 
jimmyjames123 said:
At 6xAA on ATI vs 4xAA on NV, the ATI card will look slightly better.

At 4xAA vs 4xAA, the ATi card will look slightly better. Gamma correction is still helping ATi in this area.
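To put a number on what gamma correction buys, here is a rough sketch of the principle in Python (not ATi's actual hardware path, and the 2.2 gamma is an assumption): resolving samples in linear light instead of averaging raw framebuffer values makes a 50/50 edge blend actually look half-bright on screen.

# Sketch of a gamma-correct AA resolve; gamma 2.2 is assumed for illustration.
def resolve(samples, gamma=2.2):
    linear = [s ** gamma for s in samples]   # decode stored values to linear light
    avg = sum(linear) / len(linear)          # blend where averaging is physically meaningful
    return avg ** (1.0 / gamma)              # re-encode for the display

# A black/white edge: a naive resolve stores 0.5, which a 2.2-gamma display
# shows much darker than half brightness; the gamma-correct resolve stores
# ~0.73, which actually appears as a 50% blend.
print(resolve([0.0, 1.0]))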

jimmyjames123 said:
But then again, AA is only one part of the picture. NV still arguably has the edge in AF quality. Image quality between ATI and NV in general will be similar in my opinion.

If Nvidia does enable their non-angle-dependent AF at usable framerates, I will agree. Personally I find a higher level of AA more important than perfect AF, but to each their own.
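For anyone wondering what "angle dependent" means here, a toy model in Python (the falloff numbers are made up for illustration; neither vendor publishes their actual logic): the hardware quietly reduces the degree of AF applied to surfaces that sit away from the favored 45/90-degree angles, saving fillrate at some IQ cost.

# Toy model of angle-dependent AF; the falloff is invented for illustration.
def max_aniso(requested, surface_angle_deg):
    off = min(surface_angle_deg % 45, 45 - surface_angle_deg % 45)
    penalty = off / 22.5                            # 0 at favored angles, 1 at worst
    return max(2, requested * (1 - 0.75 * penalty))

print(max_aniso(16, 90))    # 16.0 at a favored angle
print(max_aniso(16, 22.5))  # 4.0 at the worst-case angle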

jimmyjames123 said:
The 6800 Ultra arguably was faster in games like Call of Duty, RTCW, Halo, Jedi Academy, NWN, Prince of Persia, Serious Sam.

My point was that most of those games are based on Quake engines, which the X800 runs at more than adequate speeds despite the 6800U being faster. For example, the X800 "struggles" through SS:SE and RTCW at ~100+fps at 1600x1200 with AA and AF on. And I wouldn't say the 6800U wins in Halo; Anand and some others had the X800 ahead. There is no doubt, however, that ATi could use an improvement in ogl performance.

jimmyjames123 said:
And the differences in most other games was relatively small.

In TR:AoD and FarCry, nv40 gets creamed. It's not close at all. Nv40 is struggling through these games at high res, while the x800 is handling them quite well.
 
At 4xAA vs 4xAA, the ATi card will look slightly better. Gamma correction is still helping ATi in this area.

True, but even at 4xAA vs 4xAA, the algorithms are still slightly different. Firingsquad did some IQ analysis on the X800 and 6800 RGMS AA, and they found that the X800 looked better in some areas, and the 6800 looked better in other areas, using the same AA setting. They gave ATI the slight edge overall because of temporal AA.

http://www.firingsquad.com/hardware/ati_radeon_x800/page8.asp

If Nvidia does enable their non-angle-dependent AF at usable framerates, I will agree. Personally I find a higher level of AA more important than perfect AF, but to each their own.

Even NV's angle-dependent algorithm is still different from ATI's. Most reviewers felt that NV's angle-dependent AF algorithm was slightly sharper/clearer than ATI's. But you are right, to each his own, because some people concentrate more on AA while others prefer less AA and more AF.

My point was that most of those games are based on Quake engines, which the X800 runs at more than adequate speeds despite the 6800U being faster. For example, the X800 "struggles" through SS:SE and RTCW at ~100+fps at 1600x1200 with AA and AF on.

Some, not all, of those games are based on the quake3 engine, where the NV40 has the edge. Be careful about painting things with too broad a brush. I am looking at the Xbit Labs review right now, at some of the games that the X800 XT leads using 1600x1200 with 4xAA and 8/16xAF: in UT2003, the 6800U is at 98fps; in UT2004, 108fps; in the highly anticipated next-gen DX9.0 game, 99fps; in DX9.0 game 2 demo 1, 121fps; in DX9.0 game 2 demo 2, 142fps; in Firestarter, 214fps; in Painkiller, 82fps; in Max Payne 2, 130fps. Need I go on?

In TR:AoD and FarCry, nv40 gets creamed. It's not close at all. Nv40 is struggling through these games at high res, while the x800 is handling them quite well.

I think you have to put some of this into context, and you have to be more specific when you say X800. First of all, the 6800U is incredibly buggy with FarCry. The 6800 cards get detected as NV3x hardware as far as I know. According to Xbit Labs, the 6800U is actually only slightly behind the X800 XT PE when run without AA/AF, and ahead of the X800 Pro. When AA/AF is enabled in FarCry, the 6800U seems to take an unusually large hit. In fact, at 1600x1200 with AA/AF, the 6800U is barely ahead of the 9800XT (which signifies that something is probably amiss)! The X800 Pro takes a hit too, although not as large. The difference at 1600x1200 with AA/AF between 6800U and X800 Pro is about 6fps. The X800 XT PE is the card that has the clear lead in this game at the moment. However, I expect good things from NV, after they fix the bugs and after the SM 3.0 add-on is available for FarCry.

In Tomb Raider, the X800 XT PE also has a noticeable lead over the other cards. However, the 6800U is shown by Xbit Labs as being as fast or faster than the X800Pro, even with AA/AF enabled at 1600x1200. The X800 XT PE has the lead by 15 fps at 1600x1200 with AA/AF, but the other cards are still quite playable at this setting.

Seriously, try to be more open minded about the NV cards this generation. They do not sacrifice IQ or performance in any major way to obtain their featureset. Probably the main drawback (other than availability of course) is power requirements on the 6800 Ultra and Ultra variants, but the 6800 GT and lower priced variants will not suffer much from this as they are all single slot/single molex cards.
 
jimmyjames123 said:
I think you have to put some of this into context, and you have to be more specific when you say X800. First of all, the 6800U is incredibly buggy with FarCry. The 6800 cards get detected as NV3x hardware as far as I know. According to Xbit Labs, the 6800U is actually only slightly behind the X800 XT PE when run without AA/AF, and ahead of the X800 Pro. When AA/AF is enabled in FarCry, the 6800U seems to take an unusually large hit. In fact, at 1600x1200 with AA/AF, the 6800U is barely ahead of the 9800XT (which signifies that something is probably amiss)! The X800 Pro takes a hit too, although not as large. The difference at 1600x1200 with AA/AF between 6800U and X800 Pro is about 6fps. The X800 XT PE is the card that has the clear lead in this game at the moment. However, I expect good things from NV, after they fix the bugs and after the SM 3.0 add-on is available for FarCry.

In Tomb Raider, the X800 XT PE also has a noticeable lead over the other cards. However, the 6800U is shown by Xbit Labs as being as fast or faster than the X800Pro, even with AA/AF enabled at 1600x1200. The X800 XT PE has the lead by 15 fps at 1600x1200 with AA/AF, but the other cards are still quite playable at this setting.

Look, I am comparing the $500 products. If you want to say the 6800U is generally faster than the X800 Pro, I would concede that.

I seriously doubt that fixing the IQ on farcry for nv40 is going to lead to a huge performance boost. It loses at high res with AA/AF by as much as 100% atm, and it's still not rendering all of the effects properly when it renders them at all. If nvidia can remove the bugs, improve the IQ, and get it rendering farcry at the same speed it's running now, that would be a major feat. All this plus a small performance increase would be an astounding feat; it gaining 80% and catching the X800... I don't think it's going to happen.

Part of the reason that the X800XT is ahead in some games could well have something to do with the fact that the X800XT has a massive fill rate advantage (31%).
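Back-of-the-envelope in Python, using the commonly quoted launch specs of 16 pipes at 520MHz for the XT PE and 16 pipes at 400MHz for the 6800U (the exact percentage depends on which clocks you assume):

# Rough pixel fillrate = pipelines * core clock (MHz), in Mpixels/s.
x800_xt_pe = 16 * 520.0   # 8320
gf6800u    = 16 * 400.0   # 6400
print(x800_xt_pe / gf6800u - 1)   # ~0.30, i.e. roughly a 30% advantage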

jimmyjames123 said:
Seriously, try to be more open minded about the NV cards this generation. They do not sacrifice IQ or performance in any major way to obtain their featureset. Probably the main drawback (other than availability of course) is power requirements on the 6800 Ultra and Ultra variants, but the 6800 GT and lower priced variants will not suffer much from this as they are all single slot/single molex cards.

They only sacrifice IQ and performance when compared with their competition. There is a price to be paid for those extra 60 million transistors. I would say it's about 125MHz of clock speed. The nv40 is a massive step forward from the FX series, and if ps3.0 turns out to be massively adopted and has incredible performance on the nv40 cards, then nvidia likely made the right choice and ATi will lose this round. If the opposite is true and no one bothers with real ps3.0 support, they sacrificed the performance lead for useless checkboxes. The truth most likely lies somewhere in between.

It's all about choices for ATi and Nvidia, balancing performance and features with cost. Nothing comes free... well except temporal aa. ;)
 
jimmyjames123 said:
I think you have to put some of this into context, and you have to be more specific when you say X800. First of all, the 6800U is incredibly buggy with FarCry. The 6800 cards get detected as NV3x hardware as far as I know. According to Xbit Labs, the 6800U is actually only slightly behind the X800 XT PE when run without AA/AF, and ahead of the X800 Pro. When AA/AF is enabled in FarCry, the 6800U seems to take an unusually large hit. In fact, at 1600x1200 with AA/AF, the 6800U is barely ahead of the 9800XT (which signifies that something is probably amiss)! The X800 Pro takes a hit too, although not as large. The difference at 1600x1200 with AA/AF between 6800U and X800 Pro is about 6fps. The X800 XT PE is the card that has the clear lead in this game at the moment. However, I expect good things from NV, after they fix the bugs and after the SM 3.0 add-on is available for FarCry.

You really buy into the NVIDIA marketing machine, eh? First, the fact that the 6800 Ultra is being detected as a 5900 can only help performance as it means that less demanding shaders are used (PS 1.1 vs 2.0). Second, did it ever occur to you that maybe the performance hit with AF is actually real? In the past, NVIDIA's large boosts in AF performance were due to disabling of trilinear and lower degrees of AF. Wonder what will happen this time. :rolleyes:

When the PS 3.0 patch comes out, then maybe the NV40 will perform better than it would have on the PS 2.0 path, but since it's not currently running the PS 2.0 path it's hardly relevant to current results.

In Tomb Raider, the X800 XT PE also has a noticeable lead over the other cards. However, the 6800U is shown by Xbit Labs as being as fast or faster than the X800Pro, even with AA/AF enabled at 1600x1200. The X800 XT PE has the lead by 15 fps at 1600x1200 with AA/AF, but the other cards are still quite playable at this setting.

Congratulations! A $500 card is faster than a $400 card. That's a really compelling argument!

-FUDie
 
I seriously doubt that fixing the IQ on farcry for nv40 is going to lead to a huge performance boost.

Looks like you are just basing your conclusions on the initial set of reviews using very obviously buggy drivers. Look at the Xbit Labs data. The 6800U is only 1-4fps ahead of the 9800XT at 1600x1200 using 4xAA/8xAF. You don't think there is anything strange about that result? Maybe I am more of an optimist than you are, but I think it is wishful thinking to believe that there will not be significant gains in FarCry performance moving forward for the NV 6800 cards. Time will tell, of course.

Part of the reason that the X800XT is ahead in some games could well have something to do with the fact that the X800XT has a massive fill rate advantage (31%).

The X800 XT PE does have the fillrate advantage. At the same time, the 6800 Ultra has a significant advantage in fillrate over the X800 Pro, and yet the FarCry benchmarks with AA and AF don't reflect that. Clearly the issue with FarCry is more complex than sheer fillrate.

They only sacrifice IQ and performance when compared with their competition.

I just proved you wrong above, but obviously you are so closed-minded that you are not interested in listening.

There is a price to be paid for those extra 60 million transistors.

I thought it was established that ATI counts transistors more conservatively than NV?

Nothing comes free... well except temporal aa.

Temporal AA has a cost too, in the sense that you have to keep minimum framerates above monitor refresh rate.
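For reference, a minimal sketch of the temporal AA idea in Python (the sample offsets are illustrative, not ATi's actual driver logic): the driver alternates between two different sample patterns on successive frames so the eye averages them into the look of a higher mode, and it has to fall back once the framerate drops below refresh, because the alternation then shows up as shimmer.

# Minimal sketch of temporal AA; offsets are invented for illustration.
PATTERN_A = [(0.25, 0.25), (0.75, 0.75)]  # one 2-sample pattern
PATTERN_B = [(0.75, 0.25), (0.25, 0.75)]  # the complementary pattern

def pattern_for_frame(frame, fps, refresh_hz):
    # Below the refresh rate the pattern flip becomes visible as shimmer,
    # so a sane implementation reverts to a single fixed pattern.
    if fps < refresh_hz:
        return PATTERN_A
    return PATTERN_A if frame % 2 == 0 else PATTERN_B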
 
You really buy into the NVIDIA marketing machine, eh? First, the fact that the 6800 Ultra is being detected as a 5900 can only help performance as it means that less demanding shaders are used (PS 1.1 vs 2.0).

Not necessarily. There were some synthetic tests done where the NV40 was actually faster at times using PS 2.0 vs PS 1.1. Regardless, it's pretty obvious that there are some major bugs to work out in FarCry before it becomes appropriate to compare on a more apples to apples basis.

Second, did it ever occur to you that maybe the performance hit with AF is actually real? In the past, NVIDIA's large boosts in AF performance were due to disabling of trilinear and lower degrees of AF. Wonder what will happen this time.

In the past, NV did not have the more efficient angle-dependent AF. Xbit Labs did test the 6800U in FarCry with trilinear optimizations on and off, and it only gained 2fps at 1600x1200 with AA/AF with optimizations on.
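Since "brilinear" keeps coming up, a quick sketch of the trick in Python (the band width is a made-up number; the real optimization lives in hardware/drivers): full trilinear blends the two nearest mip levels across the whole transition, while the optimized filter samples only one mip level for most of the range and blends only in a narrow band near the boundary, cutting texture samples.

# Sketch of the "brilinear" blend-weight remap; the band value is illustrative.
def brilinear_weight(lod_fraction, band=0.3):
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_fraction <= lo:
        return 0.0                      # finer mip only (plain bilinear)
    if lod_fraction >= hi:
        return 1.0                      # coarser mip only (plain bilinear)
    return (lod_fraction - lo) / band   # blend only near the mip boundary

print(brilinear_weight(0.2), brilinear_weight(0.5), brilinear_weight(0.8))  # 0.0 0.5 1.0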
 
AlphaWolf said:
jimmyjames123 said:
At 6xAA on ATI vs 4xAA on NV, the ATI card will look slightly better.

At 4xAA vs 4xAA, the ATi card will look slightly better. Gamma correction is still helping ATi in this area.

I'll take that one step further, ATI's 2XAA looks as good as NV's 4XAA.

EDIT: Since the 6800's 4XAA is identical to its 8XAA, the X800's 2XAA produces nearly the same quality of AA as the highest AA setting on the 6800.

6800-8xaa

http://techreport.com/reviews/2004q2/radeon-x800/image.x?img=6800-8xaa.png

x800-2xaa

http://techreport.com/reviews/2004q2/radeon-x800/image.x?img=x800-2xaa.png
 
jimmyjames123 said:
Looks like you are just basing your conclusions on the initial set of reviews using very obviously buggy drivers. Look at the Xbit Labs data. The 6800U is only 1-4fps ahead of the 9800XT at 1600x1200 using 4xAA/8xAF. You don't think there is anything strange about that result? Maybe I am more of an optimist than you are, but I think it is wishful thinking to believe that there will not be significant gains in FarCry performance moving forward for the NV 6800 cards. Time will tell, of course.

I have nothing else to base my conclusions on but the facts in evidence. You can base your conclusions on the alignment of the stars if you want. There are other games (Mafia) where the nv40 finished behind a 9800XT. The only thing I find funny about the result is that nvidia isn't rendering as many ps2 shaders or displaying fog. I would love to see how it does running the r300 path.

When it comes to nvidia you are most certainly more of an optimist than I am.

The X800 XT PE does have the fillrate advantage. At the same time, the 6800 Ultra has a significant advantage in fillrate over the X800 Pro, and yet the FarCry benchmarks with AA and AF don't reflect that. Clearly the issue with FarCry is more complex than sheer fillrate.

The 6800U's fillrate advantage over the X800 pro is small in comparison. There are obviously other things at work here. It's not just down to fillrate, bandwidth and drivers. There are a number of other things going on in a graphics chip. Register use, memory handling... the list is extensive.

I just proved you wrong above, but obviously you are so closed-minded that you are not interested in listening.

You didn't prove me wrong, I just chose to ignore you. Show me nvidia's usable AA above 4x. These cards are blazingly fast, often only kept from being CPU-limited by high resolutions and high levels of AA/AF. I expect a lot of X800XT users will get a fair bit of use out of that 6XAA.

I thought it was established that ATI counts transistors more conservatively than NV?

I am just going by what JHH said and what reviewers put in their reviews. There is no doubt that nv40 has significantly more transistors than r420.

Temporal AA has a cost too, in the sense that you have to keep minimum framerates above monitor refresh rate.

It was just a joke, hence the ;). However, you are correct: temporal aa does require vsync to be active, and it won't help when framerates drop below the refresh threshold.
 
Sabastian said:
I'll take that one step further, ATI's 2XAA looks as good as NV's 4XAA.

EDIT: Since the 6800's 4XAA is identical to its 8XAA, the X800's 2XAA produces nearly the same quality of AA as the highest AA setting on the 6800.

Umm... How do these pics defend your argument at all? There is a drastic difference between NV's 8XAA and ATI's 2XAA in those screenshots.
Your eyes remove jaggies on ATI's AA for you?! :oops:
 
I have nothing else to base my conclusions on but the facts in evidence.

What facts are you using exactly? You are ignoring the "fact" that there is only a 1-4fps difference between 6800U and 9800XT in FarCry at high resolution with AA/AF, even though the 6800U surges past the 9800XT in virtually every other game and synthetic benchmark. You are also ignoring the "fact" that FarCry is extremely buggy on the 6800 cards. You are ignoring the "fact" that most reviewers have found ATI and NV's image quality to be very similar this go around. You are ignoring the "fact" that the performance victories between 6800 and X800 depend entirely on what game is being tested, and that both are undeniably performance peers.

There are other games (mafia) where the nv40 finished behind a 9800XT.

Was this at 1600x1200, 4xAA/8xAF? And what makes you think this is in any way applicable to FarCry, or applicable to games in general?

The 6800U's fillrate advantage over the X800 pro is small in comparison.

But it is still a significant advantage in fillrate, 12%, is it not? The 6800U also has nearly double the fillrate of the 9800XT too, correct? And yet that is not in any way reflected in the FarCry results.
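Quick back-of-the-envelope in Python, pipes times clock with the commonly quoted launch specs (approximate figures only):

# Approximate pixel fillrates in Mpixels/s (pipelines * MHz, quoted specs).
gf6800u = 16 * 400.0   # 6400
x800pro = 12 * 475.0   # 5700
r9800xt = 8 * 412.0    # 3296
print(gf6800u / x800pro - 1)   # ~0.12 -> the ~12% edge over the X800 Pro
print(gf6800u / r9800xt)       # ~1.94 -> nearly double the 9800XT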

You didn't prove me wrong, I just chose to ignore you.

Then you are just being ignorant. I showed you a FS article where the 4xAA between ATI and NV cards was very comparable. I also told you about how some reviewers have found NV's AF method to be clearer/sharper than ATI's. I just don't see how anyone in their right mind can say that image quality between the ATI and NV cards will not be very similar this round.

I am just going by what JHH said and what reviewers put in their reviews. There is no doubt that nv40 has significantly more transistors than r420.

When did JHH claim that ATI counts transistors in the same way as NV? The NV40 should have more transistors than the R420 when all is said and done, so what's your point? Since when do people purchase cards based on transistor count?

You seem to be really grasping at straws here to justify one company's design decisions vs the other company's design decisions.
 
bdmosky said:
Sabastian said:
I'll take that one step further, ATI's 2XAA looks as good as NV's 4XAA.

EDIT: Since the 6800's 4XAA is identical to its 8XAA, the X800's 2XAA produces nearly the same quality of AA as the highest AA setting on the 6800.

Umm... How do these pics defend your argument at all? There is a drastic difference between NV's 8XAA and ATI's 2XAA in those screenshots.
Your eyes remove jaggies on ATI's AA for you?! :oops:

Err, yeah, well, I think "drastic" is a strong word to use. I thought the AA disparities were... not so dramatic, hence my use of the word "nearly". Sure, there are a few more jaggies at the 2XAA setting, but there is not a "drastic" disparity between them.
 
Sabastian said:
I'll take that one step further, ATI's 2XAA looks as good as NV's 4XAA.

Obviously, FS doesn't agree with you.

Since the 6800's 4XAA is identical to its 8XAA

This is obviously not true at all. NV's 8xAA mode uses SS + MS, and the supersampling component anti-aliases texture and shader content across the whole frame, not just polygon edges, so the output cannot be identical.

What is this, a matter of national pride or something? :D


http://techreport.com/reviews/2004q2/radeon-x800/image.x?img=6800-4xaa.png

http://techreport.com/reviews/2004q2/radeon-x800/image.x?img=6800-8xaa.png

In terms of their output, the 8X AA is nearly the same as the 4X AA. There really isn't much of a difference.
 
You have a sample size of one picture. Way to go.

Trust me, you are wrong. There is a difference between using MS and using a mix of MS + SS. That difference isn't striking in every single picture at every single angle. You should talk with the reviewers and ask them to show you examples of where IQ differs between the two modes.
 