So, the RSX can do AA or HDR... but not both?


BenQ

I have read many times on these boards that the RSX ( G70 ) can do AA or HDR, but not both.

Is that correct?

I know some things fairly well, but I'm no hardware expert like many of you are. Can you explain to me in simple terms why the RSX ( G70 ) can't do both?

I have also read that the Xenos IS capable of doing both ( AA and HDR ), correct?

Why can the Xenos do both, but not the RSX?
 
We don't know about RSX yet, only G70. G70 can do HDR + AA, but only SuperSampling AA, not MultiSampling AA which is more efficient.
 
Shifty Geezer said:
We don't know about RSX yet, only G70. G70 can do HDR + AA, but only SuperSampling AA, not MultiSampling AA which is more efficient.

How much more of a performance hit is SSAA over MSAA?
 
Enough that [H]ard|OCP only recommends it in new games with SLI 7800GTX. e.g.

http://www.hardocp.com/article.html?art=Nzg0LDEw

DOOM 3
7800GTX 1600x1200 4x TR MSAA 8xAF :: 53.4fps
SLI 7800GTX 1600x1200 4x TR SSAA 8xAF :: 57.5fps

Some older games do seem to run with SSAA at an acceptable framerate, though. Basically, new games = no, some old games = yes.
 
(Without looking at the link) I assume "Tr" in the front means Transparency SSAA, which isn't SuperSampling (just SuperSampling on Alpha textures).
 
Thanks for the info Acert93 = )

But I still don't understand WHY the G70 has to choose between AA or HDR.

What is the limiting factor?
 
DaveBaumann said:
(Without looking at the link) I assume "Tr" in the front means Transparency SSAA, which isn't SuperSampling (just SuperSampling on Alpha textures).

But, would you say that the info Acert93 provided is still an accurate representation of the differences in performance between MSAA and SSAA?
 
Some thoughts about AA.

- Now that I finally have a decent video card (ATI 9800), I think that with higher resolutions like 1280x1024, even 2x AA does a nice job, even on my LCD monitor where I can see every single pixel, and I'm sitting close to it. I'd imagine on a TV, where everything always looks a bit more blurry, the result would be even better. I don't know how much of a hit 2xSSAA would be on a G70-class chip, though.

- MSAA seems to be doing a poor job of fixing aliasing on some pixel shader visuals. A VERY poor job, in fact nonexistent. I've noticed this in some recent demos I ran which were pixel shader heavy. With them, even 4x MSAA looked really aliased for some reason. On the other hand, 3DMark 2003 and 2005 looked just fine, but I have no idea what kind of AA they're defaulting to.

So basically, 2xSSAA may be the right choice with HDR + pixel shaded visuals on RSX, depending, of course, on how much of a hit that presents.
 
BenQ said:
DaveBaumann said:
(Without looking at the link) I assume "Tr" in the front means Transparency SSAA, which isn't SuperSampling (just SuperSampling on Alpha textures).

But, would you say that the info Acert93 provided is still an accurate representation of the differences in performance between MSAA and SSAA?

Standard SSAA means that every pixel is oversampled multiple times; MSAA means that oversampling only occurs where a polygon edge intersects a pixel. So, with 4x FSAA, multisampling requires a little more than one texture sample and one pass through the shaders per pixel, but 4x colour and Z values are required (which is also accelerated by the pipelines being able to take more than one multisample colour/Z value per cycle), whilst with supersampling every pixel requires 4x texture samples, 4x shader passes and 4x colour/Z samples.
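
As a rough illustration of that difference, here is a back-of-the-envelope sketch (purely illustrative, not how any driver counts work; the 4x sample count and 1600x1200 resolution are just example numbers):

def per_pixel_cost(aa_mode, samples=4):
    # Hand-wavy per-pixel work implied by the description above:
    # returns (shader passes, texture samples, colour/Z samples).
    if aa_mode == "MSAA":
        # Shading/texturing run roughly once per pixel; only colour/Z are multisampled.
        return 1, 1, samples
    if aa_mode == "SSAA":
        # Everything is repeated for every sub-sample.
        return samples, samples, samples
    raise ValueError(aa_mode)

pixels = 1600 * 1200
for mode in ("MSAA", "SSAA"):
    shade, tex, cz = per_pixel_cost(mode)
    print(f"{mode}: {pixels * shade:,} shader passes, "
          f"{pixels * tex:,} texture samples, {pixels * cz:,} colour/Z samples")

At 4x, MSAA roughly quadruples only the colour/Z traffic, while SSAA also quadruples the shading and texturing work, which is where the extra framerate hit comes from.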
 
BenQ said:
DaveBaumann said:
(Without looking at the link) I assume "Tr" in the front means Transparency SSAA, which isn't SuperSampling (just SuperSampling on Alpha textures).

But, would you say that the info Acert93 provided is still an accurate representation of the differences in performance between MSAA and SSAA?

Basically on the 7800GTX the hit is about 50% in framerate (give or take depending on the game and complexity) over MSAA. If MSAA has a 40% hit in Doom3 at 1600x1200, and then you take another 50% on top of that for SSAA... OUCH.
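
To make the compounding concrete, the arithmetic with made-up round numbers looks something like this (the percentages are just the rough figures above, not measurements):

base_fps = 100.0                   # hypothetical no-AA framerate
msaa_fps = base_fps * (1 - 0.40)   # ~40% hit for 4x MSAA -> 60 fps
ssaa_fps = msaa_fps * (1 - 0.50)   # roughly another 50% on top for SSAA -> 30 fps
print(msaa_fps, ssaa_fps)          # 60.0 30.0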

IMO if RSX is basically a faster G70, then SSAA is really out of the question for most next-gen games.

As for WHY NV40/G70 cannot do HDR and MSAA at the same time, I do not know the specifics, but it seems it would take more logic real estate to implement it. That would be the easy answer: cost.

And realistically, it is probably a good choice. HDR is a big hit to performance. MSAA is a big hit in modern games. G70 can do these features independently with good playable results at high resolutions, but together it would be messy. The transistor budget was better spent elsewhere than on supporting both features together, when G70 probably could not power both features at the same time in most modern games at high resolutions.

That transistor budget was a concern can, I think, be deduced from the design: 24 fragment pipes and 16 ROPs. Why not 24 ROPs?

Obviously the "WHY" is a bit of conjecture, but the pieces seem to fit.
 
I'd imagine on a TV, where everything always looks a bit more blurry, the result would be even better.

Exactly. Running at 720p, compared to 480p and below, the need for AA is reduced quite a bit. Personally I'd rather see them have better textures/effects rather than run some high-end AA.
 
c0_re said:
I'd imagine on a TV, where everything always looks a bit more blurry, the result would be even better.

Exactly. Running at 720p, compared to 480p and below, the need for AA is reduced quite a bit. Personally I'd rather see them have better textures/effects rather than run some high-end AA.
But wouldn't the clarity of HD reduce the blurring, and wouldn't the fact that HD screens are usually larger show the aliasing effects better?
 
True, but it depends on how close you sit and how large the screen is. Personally, jaggies don't really bother me too much; the days of low-polygon models are pretty much gone.

I won't say having AA isn't nice; I just see it as an extra. There are other things that are more important to me, and HDR is definitely one of them.

Just my opinion, take it for what it's worth.

OK, don't get me wrong, I'm an Xbox fan, but is the AA really going to be TOTALLY "FREE"? I just find it kinda hard to believe. Can't they use that for something else besides AA?
 
I don't know how many of you actually have big-screens but jaggies are TERRIBLE!

I have a 1080i CRT 46" and every single xbox game I have has noticeable jaggies. Halo 2 even has a few, but not bad.

Even MVP baseball at 720p has some jaggies, although few they are still noticeable.

So even the best games still have some jaggies, and most have a lot.

I think they look terrible and cheap, and they ruin what would otherwise be good graphics. Games like GTA SA are so bad on a big screen that you nearly have to turn HD off just to play.

It bothers me when people start talking about AA not really being needed, etc. I think it's most definitely needed; aliasing looks terrible and it's extremely noticeable on big screens.

The fact all 360 games will probably have 4xAA is excellent news to a big-screen owner like myself.

I think AA should be a priority for all next-gen games. People who don't mind jaggies probably have small TVs; I know they didn't bother me until I upgraded, and now I can't stand them.
 
scooby_dooby said:
I don't know how many of you actually have big-screens but jaggies are TERRIBLE!

I have a 1080i CRT 46" and every single xbox game I have has noticeable jaggies. Halo 2 even has a few, but not bad.

Very few Xbox games actually output a native HD resolution. Halo 2 isn't one of them, and GTA isn't one of them. MVP is one, but is it using any AA at that res (AA at 720p shouldn't be a problem on either next-gen system, potential HDR tradeoffs aside)? I'm not sure if (m)any Xbox games can be used as any kind of reference for the relationship between resolution and aliasing in next-gen games, especially given other factors with a perceptual impact that should be either standard or more common next gen - e.g. higher poly counts, scene post-processing, etc.

More resolution does help reduce aliasing, or perceived aliasing. Fundamentally, aliasing is due to a lack of resolution.
 
Those nice, polished PR screens of games that companies have been sending out for years use really, really high levels of anti-aliasing to help them look so perfect. The low levels of AA which next-generation console games will be able to use don't even come close, so aliasing will still be very apparent in tomorrow's high-definition games. 2xAA is certainly not enough, and 4xAA, while definitely better, isn't enough either to reach that ideal pre-rendered smoothness.

marconelly!:
I think that with higher resolutions like 1280x1024, even 2x AA does a nice job, even on my LCD monitor where I can see every single pixel, and I'm sitting close to it. I'd imagine on a TV, where everything always looks a bit more blurry, the result would be even better.
Anti-aliasing has a more noticeable effect when the display is sharp, but there's a little less need for AA in the first place when the display isn't so sharp.
 
I wonder how much of an impact 2xSSAA (just 2x) would have on an HDR-rendered picture on G70? After all is said and done, could that be the best solution for it? Also, am I right in my observation that MSAA doesn't help at all with pixel-shaded effects aliasing?

And realistically, it is probably a good choice. HDR is a big hit to performance.
Is it really on G70? Does anyone care to speculate what the "full speed HDR" quote really means? (found in NVIDIA's 7800 PDF)
 
I wonder how much of an impact 2xSSAA (just 2x) would have on an HDR-rendered picture on G70? After all is said and done, could that be the best solution for it? Also, am I right in my observation that MSAA doesn't help at all with pixel-shaded effects aliasing?

HDR has a big bandwidth and processing hit to it, and supersampling has a huge bandwidth and processing hit. I don't think putting the two together is a smart idea.
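
For a feel of why stacking the two hurts, here is a crude colour-buffer-size sketch (720p, RGBA8 vs FP16 RGBA, colour buffer only, ignoring Z, blending re-reads and any compression; all of these are assumptions for illustration):

pixels = 1280 * 720

def colour_buffer_mib(bytes_per_pixel, ss_factor=1):
    # Size of a single colour buffer in MiB under the assumptions above.
    return pixels * ss_factor * bytes_per_pixel / 2**20

print(f"LDR (RGBA8):       {colour_buffer_mib(4):.1f} MiB")
print(f"HDR (FP16 RGBA):   {colour_buffer_mib(8):.1f} MiB")
print(f"HDR (FP16) + 2xSS: {colour_buffer_mib(8, 2):.1f} MiB")

Every one of those bytes has to be written (and often re-read for blending), so doubling the per-pixel format and then the sample count multiplies the bandwidth pressure.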
 