"4xAA": Microsoft's big mistake.

Status
Not open for further replies.

Brimstone

B3D Shockwave Rider
Veteran
IMHO Microsoft and ATI have made a big mistake with how they are marketing 4xAA on the Xenos. The image quality between the Xenos and RSX won't be the same, yet consumers reading a spec sheet will conclude 4xAA on Xenos and RSX are identical because of the "4xAA" tag, and they aren't.

The "4xAA for almost free" line isn't very informative. Nvidia marketing has expertly blurred the line between G70 (RSX) and Xenos AA so far.


From the Anand Tech article.


At 720p, the G70 is entirely CPU bound in just about every game we’ve tested, so the RSX should have no problems running at 720p with 4X AA enabled, just like the 360’s Xenos GPU. At 1080p, the G70 is still CPU bound in a number of situations, so it is quite possible for RSX to actually run just fine at 1080p which should provide for some excellent image quality.

http://www.anandtech.com/video/showdoc.aspx?i=2453&p=10

To me one of the best attributes of Xenos has been pulverized by smart propaganda from Nvidia. So far Nvidia has stuck it to Xenos big time.


And these slides are great.


[Nvidia presentation slides comparing G70 AA performance]
 
Not all the facts.

In fact, at 1080p the G70 takes a ~40% performance hit with 4x MSAA in a couple of modern games like D3. So comparing how the G70 does on old DX7/DX8-level games is very misleading as to how well it will perform in games with much higher requirements.

The PS3 and 360 are expected to have games that are a couple of steps beyond D3. Just look at the high-detail / high-geometry Sony render targets. This will have an even more significant hit on performance.

And the eDRAM is not only for AA, but also Alpha blends, Z, and other backbuffer tasks.
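A back-of-envelope sketch shows why that backbuffer traffic matters. Every number below is an illustrative assumption, not an official spec for either chip:

```python
# Rough framebuffer traffic at 720p with 4x MSAA.
# All figures here are illustrative assumptions, not official specs.
width, height = 1280, 720
bytes_per_sample = 4 + 4      # 32-bit color + 32-bit Z/stencil (assumed)
msaa = 4                      # samples per pixel
overdraw = 3                  # assumed average overdraw
fps = 60

samples = width * height * msaa
# each overdrawn sample is read (Z test) and written back (color + Z)
traffic_per_frame = samples * bytes_per_sample * overdraw * 2
gb_per_sec = traffic_per_frame * fps / 1e9
print(f"~{gb_per_sec:.1f} GB/s of raw backbuffer traffic")  # ~10.6 GB/s
```

Even under these made-up assumptions the backbuffer alone eats double-digit GB/s, which is exactly the traffic that dedicated eDRAM takes off the main memory bus.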
 
Since when does Halo support AA? :rolleyes:


And Nvidia doesn't know that 720p means 1280*720, not 1024*768 :rolleyes: :rolleyes:

And these slides are :rolleyes: :rolleyes: :rolleyes: :rolleyes: :rolleyes:
 
Acert93 said:
Not all the facts.

In fact, at 1080p the G70 takes a ~40% performance hit with 4x MSAA in a couple of modern games like D3. So comparing how the G70 does on old DX7/DX8-level games is very misleading as to how well it will perform in games with much higher requirements.

The PS3 and 360 are expected to have games that are a couple of steps beyond D3. Just look at the high-detail / high-geometry Sony render targets. This will have an even more significant hit on performance.

And the eDRAM is not only for AA, but also Alpha blends, Z, and other backbuffer tasks.


When talking about AA and filtering, it can get pretty esoteric and put people to sleep. So far Nvidia has totally neutralized the smart memory advantage Xenos has. Nvidia has taken a scalpel to Xenos's AA capabilities and carved them up with the skill of a surgeon.
 
Will the bandwidth on the PS3 also be a limiting factor? I think it's hard to compare the G70 with the RSX considering the radical differences between the two systems they will be in. Then comparing it to Xenos, well… :?
 
FiggyG said:
Will the bandwidth on the PS3 also be a limiting factor?

When you talk about AA, it boils down to bandwidth.

One can concoct scenarios when anything would be bandwidth limited.

I don't think it's going to be a massive issue, at least relatively speaking.
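A toy model of "it boils down to bandwidth": if a renderer were purely framebuffer-bandwidth-bound, the framerate ceiling would scale inversely with the MSAA sample count. The 22.4 GB/s pool and the per-sample costs below are assumptions for illustration only:

```python
# Toy model: framerate ceiling if rendering were purely limited by
# framebuffer bandwidth. All parameters are illustrative assumptions.
def bandwidth_bound_fps(bandwidth_gbs, width, height, msaa,
                        bytes_per_sample=8, overdraw=3):
    traffic_per_frame = width * height * msaa * bytes_per_sample * overdraw * 2
    return bandwidth_gbs * 1e9 / traffic_per_frame

for msaa in (1, 2, 4):
    fps = bandwidth_bound_fps(22.4, 1280, 720, msaa)  # assumed 22.4 GB/s pool
    print(f"{msaa}x MSAA: ~{fps:.0f} fps ceiling")
```

The point isn't the absolute numbers; it's that each doubling of samples halves the ceiling unless the AA traffic lives somewhere else (like eDRAM).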
 
Well you beat me to the slides (was looking at the forum):

http://www.beyond3d.com/forum/viewtopic.php?t=24214&postdays=0&postorder=asc&start=300

Note that in Doom 3 (a low-geometry game) there is a 17% hit with 4xAA at 720p. At 1080p there is a 44% hit AND the framerate is BELOW 60fps. FarCry sees 6% and 39% hits, respectively.

Basically that slide is extremely misleading. By lumping older games in with the newer games it gives the impression that PS3 games won't take a significant hit when AA is enabled.

Further, the 3rd slide slyly mentions SSAA and HDR. But the devil is in the details: the previous charts are MSAA, the G70 cannot do MSAA and HDR at the same time, and SSAA has a much bigger performance hit.

So looking at the progressive games--like FC and D3--we can see that there is a SIGNIFICANT hit for enabling 4x AA.

Who knows what it will be when we begin seeing games like Killzone, MotorStorm, etc... that have a ton of geometry.

The proof of the pudding is in the eating. Obviously these slides mean nothing to the mainstream consumer, so the entire "Oh noes MS is dead" stuff is irrelevant.

In reality, if RSX is limited like the G70 and cannot do HDR & AA at the same time, while Xenos does 4xAA and FP10 (or FP16), there will be a distinct IQ difference.

In that perspective these slides are meaningless for next gen game usage.
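For clarity, the hit percentages quoted above are just the relative framerate drop; a trivial sketch (the fps numbers are made up for illustration):

```python
def aa_hit_percent(fps_no_aa, fps_with_aa):
    """Relative framerate drop, in percent, from enabling AA."""
    return (1 - fps_with_aa / fps_no_aa) * 100

# hypothetical example: 100 fps without AA dropping to 56 fps with 4xAA
print(round(aa_hit_percent(100, 56)))  # 44
```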
 
Titanio said:
I don't think anyone will even attempt 4xAA at 1080p, and it isn't as needed at that resolution anyway.

Anyone using a PS3 at 1080p is likely to be playing on a pretty big HDTV. I can imagine 4xAA would still be pretty useful when your pixels are that big.
 
Gerry said:
Anyone using a PS3 at 1080p is likely to be playing on a pretty big HDTV.

Not necessarily. More resolution is the most direct alleviator of aliasing - you'd probably have to be sitting closer than is comfortable to be picking apart pixels. Add in motion + other postprocessing..

I think the argument is complex enough without adding television size + viewing distance ;)

I would take AA at 1080p, but I don't think it's as necessary as at lower resolutions. Most HD people anyway will be getting a 2xSSAA image out of that (at 720p), and many many more will get an even more AAed image on SDTVs.

I think it'll be a case of 1080p with no AA (or 2x at most), or 720p with AA, and personally I'd be fine with that.
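To put a rough number on that "effective SSAA" point, a simple illustrative calculation: a 1080p frame downsampled to a 720p display averages 2.25 samples per output pixel, i.e. a bit better than 2x SSAA:

```python
# Effective supersampling when a 1080p frame is shown on a 720p panel
samples_per_pixel = (1920 * 1080) / (1280 * 720)
print(samples_per_pixel)  # 2.25
```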
 
Brimstone said:
When talking about AA and filtering, it can get pretty esoteric and put people to sleep. So far Nvidia has totally neutralized the smart memory advantage Xenos has. Nvidia has taken a scalpel to Xenos's AA capabilities and carved them up with the skill of a surgeon.

You REALLY need to go back and look at those figures, look at when some of those games were released (and look at the rendering technologies in them) and compare them to Next Gen Targets.

Do you really believe next gen games will have less geometry than FarCry and Doom 3?

Also, note that some of those games are CPU limited. CPU limited games give us very little feedback on GPU performance hits.

Anyhow, it seems you are convinced NV has just lobotomized ATI. I think when 360 games have HDR + AA on at the same time in highly detailed games we can revisit these slides ;)
 
Something tells me the general consumer isn't going to be looking at G70 PR slides from AnandTech when they decide which console to buy. In the end it will be about the games and brand recognition, not necessarily in that order.
 
Wasn't this same topic on the slides already discussed?

Well, anyway yeah those slides are crap - and I have to wonder if it even has anything to do with RSX or not. I mean, it seems aimed at Xenos - that's for sure - but it just seems so otherwise out of place.
 
The relationship between these slides and RSX is flawed on a number of levels - not just because you're comparing current PC games to next-gen games, but also, in fairness to RSX/PS3, you're looking at games that a) aren't specifically targeted at a closed platform and thus can't take specific advantage of any one GPU in the same way PS3 (exclusives, at least) can, and b) are running on a card with less bandwidth to memory than RSX. I'm not saying those necessarily balance out the first flaw, just that it's not a one-way street.
 
Titanio said:
Gerry said:
Anyone using a PS3 at 1080p is likely to be playing on a pretty big HDTV.

Not necessarily. More resolution is the most direct alleviator of aliasing - you'd probably have to be sitting closer than is comfortable to be picking apart pixels. Add in motion + other postprocessing.

Upping the resolution just makes more jaggies, albeit smaller ones. Sure, if they are small enough then they won't be noticed, but with consoles getting played on 50"+ screens, even 1080p has a rather low DPI compared to, say, a 21" monitor at 1600x1200.
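Simple geometry bears that pixel-pitch comparison out (screen sizes as stated above; this just converts diagonal size and native resolution to pixels per inch):

```python
import math

def ppi(diagonal_inches, horiz_px, vert_px):
    """Pixels per inch from diagonal size and native resolution."""
    return math.hypot(horiz_px, vert_px) / diagonal_inches

print(f'50" 1080p TV:      {ppi(50, 1920, 1080):.0f} ppi')  # ~44 ppi
print(f'21" 1600x1200 CRT: {ppi(21, 1600, 1200):.0f} ppi')  # ~95 ppi
```

So the big TV's pixels are more than twice the size, and each jaggy is correspondingly more visible at the same viewing distance.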
 
kyleb said:
Upping the resolution just makes more jaggies, albeit smaller ones. Sure, if they are small enough then they won't be noticed, but with consoles getting played on 50"+ screens, even 1080p has a rather low DPI compared to, say, a 21" monitor at 1600x1200.

I'm not disagreeing, but you're making the argument even more complex, and trying now to not only account for variables between games, but between gamers' setups too.

Also, in saying that it's not so much of a problem, I'm comparing to the previous generation of consoles, not 1600x1200 PC monitors ;) Furthermore, the type of post-processing that should be going on in a lot of next-gen games (DOF, motion blur etc.) will also have an impact on perceived aliasing.

Anyone want to send me a 1080p HS trailer and a 1080p projector so I can experiment with the "big screen" impact? Please? ;)
 
Acert93 said:
Brimstone said:
When talking about AA and filtering, it can get pretty esoteric and put people to sleep. So far Nvidia has totally neutralized the smart memory advantage Xenos has. Nvidia has taken a scalpel to Xenos's AA capabilities and carved them up with the skill of a surgeon.

You REALLY need to go back and look at those figures, look at when some of those games were released (and look at the rendering technologies in them) and compare them to Next Gen Targets.

Do you really believe next gen games will have less geometry than FarCry and Doom 3?

Also, note that some of those games are CPU limited. CPU limited games give us very little feedback on GPU performance hits.

Anyhow, it seems you are convinced NV has just lobotomized ATI. I think when 360 games have HDR + AA on at the same time in highly detailed games we can revisit these slides ;)

I'm not saying that the RSX is better. Nvidia's marketing has damaged one of the key selling points of Xenos. The AA capabilities of the RSX aren't even in the same ballpark as Xenos's, yet mainstream consumers reading the press are going to conclude there isn't much special about the second 10 MB core Xenos has.

A.A. isn't the same as cranking up the resolution either.
 
Brimstone said:
A.A. isn't the same as cranking up the resolution either.

Technically true, but it's a solution to a lack of resolution ;)

1080p isn't "enough" resolution, of course, but it goes a long way IMO.
 
I have to agree with the original post to a point.. it is a good piece of spin from Nvidia, though completely misleading. Hey Nvidia & Sony, why not just add Quake 1 or the original Unreal Tournament (EDIT: LOL, they did include it) to the list of games on your slides to bring the average % hit from enabling AA even lower ;)
 