X360 and Anti-Aliasing?

from ERP:
Using AA requires tiling, which requires your rendering engine to support it in some reasonable fashion. It isn't just a "turn it on and it works" thing, and as has been mentioned, it is not completely free.

I think it was probably considered a significant risk by some early titles since you couldn't measure the impact until final kits were available so they chose to avoid it. There are more than enough risks with launch titles as it is without adding to them.
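
As a rough back-of-envelope sketch of why that tiling is needed (assuming 32-bit colour plus 32-bit depth/stencil per sample and the oft-quoted 10 MB of eDRAM - illustrative figures, not official ones):

Code:
# Rough tile-count estimate for a 1280x720 frame in 10 MB of eDRAM.
# Assumed figures: 4 bytes colour + 4 bytes depth/stencil per sample.
EDRAM_BYTES = 10 * 1024 * 1024
BYTES_PER_SAMPLE = 4 + 4                    # colour + depth/stencil

def tiles_needed(width, height, samples):
    """How many screen tiles the multisampled buffers must be split into."""
    buffer_bytes = width * height * samples * BYTES_PER_SAMPLE
    return -(-buffer_bytes // EDRAM_BYTES)  # ceiling division

for samples in (1, 2, 4):
    print(f"{samples}xAA -> {tiles_needed(1280, 720, samples)} tile(s)")
# Prints: 1xAA -> 1 tile, 2xAA -> 2 tiles, 4xAA -> 3 tiles.

So with 4xAA a 720p frame doesn't fit in one pass and the engine has to render and resolve the scene in tiles, which is presumably the "reasonable fashion" of support ERP is talking about.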

Thanks for the insight.
Would you anticipate that future titles, where devs have had more time with the hardware, will opt to use the AA features of Xenos?
 
Vysez said:
It's true that 2x RGMSAA is not the be-all and end-all, but what would you have liked to see done with the ~20M-transistor budget in the daughter die that is used for this AA?
Honestly, there's not much you could do. Well, not much that I could imagine, anyway.

Well, a few things come to mind... They could insert a frame with a pic of naked [put hot chick of the moment here] every 60 frames. Subliminal messages anyone!? Most people won't even notice... errr... :LOL: And call it the P(orn)-Buffer.

Ok sorry.
 
Vysez said:
It's true that 2x RGMSAA is not the be-all and end-all, but what would you have liked to see done with the ~20M-transistor budget in the daughter die that is used for this AA?
Honestly, there's not much you could do. Well, not much that I could imagine, anyway.
Presumably there is more that can be done, because from the sounds of it, developers ARE doing things other than using AA. If it's a choice between 4xAA, DOF or MoBlur, I'll take the other two over the AA. I think the eDRAM is actually better suited to those tasks anyway. I think MS presented the AA facet, whereas the other capabilities of the eDRAM solution are more worthwhile. Though really I haven't enough info to know exactly what the eDRAM is doing and how, so I don't know how certain effects take resources away from others. If the AA can be added without impacting anything else, sure thing, go use it. But if AA comes at a cost to other effects, I wouldn't instantly place AA as a higher priority to include.
 
Coca-Cola and other companies did that, if I'm not mistaken, back many years ago.
Just flash a frame in between during the movies.
I think that's forbidden now.

Or maybe it's an urban legend, I don't know.

Anyway, personally I prefer more effects and better gfx to wasting resources on AA (even if it is only a 5% penalty).
 
Subliminal messages are illegal in most places, I believe. But I never understood why. I've seen TV programs with deliberate subliminals (unannounced) and they were mind-numbingly obvious IMO. At 24 fps a movie subliminal would be a painful interruption and everyone would be asking what it was there for, I'm sure.
 
Shifty Geezer said:
Presumably there is more that can be done, because from the sounds of it, developers ARE doing things other than using AA. If it's a choice between 4xAA, DOF or MoBlur, I'll take the other two over the AA. I think the eDRAM is actually better suited to those tasks anyway. I think MS presented the AA facet, whereas the other capabilities of the eDRAM solution are more worthwhile. Though really I haven't enough info to know exactly what the eDRAM is doing and how, so I don't know how certain effects take resources away from others. If the AA can be added without impacting anything else, sure thing, go use it. But if AA comes at a cost to other effects, I wouldn't instantly place AA as a higher priority to include.

What I am thinking at the moment is that the final image you will see, even on an HDTV, will be scaled by the TV. So the edges and the image in general will be smoothed out.

Most HDTVs in Europe don't have a native resolution of 1280x720; they're more like 1368x768, which means that we won't get 1:1 pixel mapping, so the image we see will be slightly scaled and therefore "smoother" because of it.
Not many people will get 1:1 pixel mapping on their sets.
Even then, at the distance people usually play games at (in a living room), I really don't see the point of 2xAA. Might as well leave it and use that 5% for something else.
If it were 8xAA then it's another story, as that would definitely show compared to a non-AA image, though I'm sure there would still be people who can't tell the difference.
Most people will be playing on SD displays, which means they will get free AA from the downsampling.
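
For what it's worth, a quick sketch of the scale factors being talked about (the panel and SDTV resolutions below are illustrative assumptions):

Code:
# Illustrative scale factors for a 1280x720 render target.
# The display resolutions are assumptions, just to show the ratios.
render_w, render_h = 1280, 720
displays = {"1368x768 HDTV panel": (1368, 768), "640x480 SDTV": (640, 480)}

for name, (w, h) in displays.items():
    print(f"{name}: {w / render_w:.3f} x {h / render_h:.3f}")
# 1368x768 -> ~1.069 x 1.067: no 1:1 mapping, every pixel gets resampled.
# 640x480  -> 0.500 x ~0.667: each output pixel averages several rendered
#             pixels, i.e. a rough ordered-grid supersample on SD sets.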

At this point, I really don't see the point of 2xAA. 4xAA is a bit better and starts getting noticed, but I think this is all a PR manoeuvre to make the internet screenshots look better and have the cool-sounding AA "checklist" feature, for something that really won't be noticed by many people.

Even on PS3, even if it doesn't output at 1080p by default, the amount of scaling the final image will go through before hitting our eyes will make AA useless.
 
. . . Even then, at the distance people usually play games at (in a living room), I really don't see the point of 2xAA. Might as well leave it and use that 5% for something else. . .
The 5% hit was supposed to be for 4XAA.

Anywhoo, sounds like you don't feel AA is that big a deal due to the higher resolution. Early on, from what MS was saying, it sounded like virtually all games would feature 4XAA but devs seem to be heading a different way. We'll see how it all pans out. . .
 
I was under the impression that a lot of computational power was sacrificed in the Xenos main die to allow for the daughter die, in order for 4xAA to be applied with minimal performance hit. Yet it's not being used, or isn't as useful as it once was?

Am I wrong, or was this a huge mistake on ATi's part?
 
LB's saying (and I agree) that AA isn't a big deal because it makes negligible difference at 2xMS, regardless of resolution. The point about HD resolution is that most gamers on SD sets will get true supersampling AA, which does make a difference. So jaggies will be reduced on all sets, either by being smaller on HDTVs or supersampled down for SDTV. Definitely a marked and wanted improvement over a 640x480 no-AA display, even without any rendering AA being added.
 
RobHT said:
The 5% hit was supposed to be for 4XAA.

Anywhoo, sounds like you don't feel AA is that big a deal due to the higher resolution. Early on, from what MS was saying, it sounded like virtually all games would feature 4XAA but devs seem to be heading a different way. We'll see how it all pans out. . .

Supposedly...

Personally I feel that AA will be needed mostly by people who will play X360 (or PS3) games on PC monitors.
In a living room, at the distances people sit from the TV on average, I think it really isn't that big a deal, especially if we consider the fact that we'll be playing on so many different kinds of TV sets that in the end the final image will only be an approximation (for better or for worse) of what one would see on a PC monitor (being so close to it while playing).
 
Gholbine said:
I was under the impression that a lot of computational power was sacrificed in the Xenos main die to allow for the daughter die, in order for 4xAA to be applied with minimal performance hit. Yet it's not being used, or isn't as useful as it once was?

Am I wrong, or was this a huge mistake on ATi's part?
I think looking at the eDRAM as solely providing AA is where you're going wrong. Most importantly, it alleviates a huge amount of BW demand on system RAM. The eDRAM can cope with loads of alpha blending effortlessly, whereas a typical architecture like PS3's will notice the BW impact of such features. I think the AA functionality is just being overstated as that's more of a checklist, bullet-point spec people can latch onto - 'XB360 has AA and PS3 doesn't'. Its benefits are much more than that.
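
To put a very rough number on the blending point (everything below is an illustrative assumption: 32-bit colour, a read and a write per blended pixel, a guessed amount of transparent overdraw):

Code:
# Very rough estimate of the colour bandwidth that full-screen alpha
# blending would otherwise demand from system RAM. Assumed: 32-bit colour,
# read + write per blended pixel, 8 layers of transparent overdraw, 60 fps.
width, height, fps = 1280, 720, 60
bytes_per_blend = 4 + 4        # read destination colour, write result
layers = 8                     # assumed transparent overdraw

gb_per_sec = width * height * bytes_per_blend * layers * fps / 1e9
print(f"~{gb_per_sec:.1f} GB/s just for framebuffer blending")
# ~3.5 GB/s: a noticeable slice of a shared ~22 GB/s bus, but nothing for
# bandwidth that stays inside the eDRAM module.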
 
london-boy said:
In a living room, at the distances people sit from the TV on average, I think it really isn't that big a deal, especially if we consider the fact that we'll be playing on so many different kinds of TV sets that in the end the final image will only be an approximation (for better or for worse) of what one would see on a PC monitor (being so close to it while playing).
I'm about two feet from my 1024x768 LCD monitor. Playing GW at 1024x768, jaggies aren't at all noticeable during normal play, though of course texture aliasing is obvious, as are near-horizontal and vertical lines. I fiddled with the settings to turn on AA and noticed almost no difference. Same with 4xAA. What really struck me was that the textures were still aliased and they were making up most of the jaggies. If I stop and look, the AA obviously makes a difference, but it has little impact on the game in general.

From that experience I think texture AA is more important. This seems to be something next-gen consoles will feature. Certainly PGR3's wire fences were looking pretty smooth last I saw of them.
 
Shifty Geezer said:
I'm about two feet from my 1024x768 LCD monitor. Playing GW at 1024x768, jaggies aren't at all noticeable during normal play, though of course texture aliasing is obvious, as are near-horizontal and vertical lines. I fiddled with the settings to turn on AA and noticed almost no difference. Same with 4xAA. What really struck me was that the textures were still aliased and they were making up most of the jaggies. If I stop and look, the AA obviously makes a difference, but it has little impact on the game in general.

From that experience I think texture AA is more important. This seems to be something next-gen consoles will feature. Certainly PGR3's wire fences were looking pretty smooth last I saw of them.

Just a question, but what video card are you using? And what level of AF? ;)
AA starts being damn nice at 8X, when it makes games look almost like CGI in terms of image quality (shame the content itself is still the same crappy PC-level graphics). 16X is a feast for the eyes.
 
Good old Ti4200. I don't do much PC gaming (apart from the original Master of Orion :oops: ) so I never really got to try out AA. I got the Ti4200 around the time I got Morrowind, and was very disappointed at how little improvement it added to framerate over the previous card, but I subsequently learnt that was Bethesda's fault and not the GPU's. And GW's the first time I tried AA, and I was so let down! I thought it was going to make an amazing difference, but for the drop in framerate versus the benefit to IQ, I stick to leaving it off.
 
Shifty Geezer said:
Good old Ti4200. I don't do much PC gaming (apart from the original Master of Orion :oops: ) so I never really got to try out AA. I got the Ti4200 around the time I got Morrowind, and was very disappointed at how little improvement it added to framerate over the previous card, but I subsequently learnt that was Bethesda's fault and not the GPU's. And GW's the first time I tried AA, and I was so let down! I thought it was going to make an amazing difference, but for the drop in framerate versus the benefit to IQ, I stick to leaving it off.

That explains it then.

I think we need to be careful around here. There are many geeks around here who will bitch about 2xAA and other things that are just checklist features, without realising that even on a big HDTV, the viewing distance will make it almost negligible.
I think developers should focus on techniques that DO make a big difference to the final image (DOF, MoBlur, HDR and some others) instead of trying to squeeze in a feature that only makes a few geeks wet with excitement, but isn't even noticed by anyone else.

On a PC monitor, I cannot see a difference between a 2xAA image and one without AA unless I actually stop and search for it, only to find out that the 2xAA one still has jaggies anyway. At 4X things start getting better and general IQ is improved. Not by much, mind you, but the general feel is that it's "cleaner". AA is not only for jaggies, in the end.

On my (future) Samsung HDTV, I will probably never see the difference between a 2xAA image and a non-AA one at 720p.

I was thinking, however, that the stands in the shops - the ones for X360 with the 23" Samsung HDTVs - will be quite close to people's eyes, a bit like using a PC monitor, so I guess things will be a bit more noticeable there.
 
I'm no AA whore, so developers having the freedom to do as they like - I'm all for it, and I think this is better in the end. Still, I agree with the thinking that Microsoft sort of misstepped with the whole initial 'mandate' to developers to implement AA in their games. And the fact that the 'free' use of such AA is not taking place seems to imply that somehow, somewhere, all did not go to plan. Maybe devs will work it out such that they are able to wring out the 'free' performance upgrades down the line, but as it stands now, the eDRAM is nice, yet seems not to be *as* nice as was originally intended, or at least indicated by the hyperbole.
 
With the increase in poly-pushing power there will be an increase in poly-edge aliasing (assuming more objects on screen). So mandating AA is a good thing.

Cheers
Gubbi
 
Gubbi said:
With the increase in poly-pushing power there will be an increase in poly-edge aliasing (assuming more objects on screen). So mandating AA is a good thing.

Cheers
Gubbi

No one's disputing that. What I'm wondering is how much 2xAA is going to help there.
 
I'd say quite the contrary. A 'curved surface' made of three triangle edges will have three long edges where aliasing will appear, versus ten shorter edges if more triangles are used. And jaggies are only noticeable on near-horizontal/vertical edges. More polys = more polys per curve = fewer near-horizontal/vertical edges.

Take for example a football (soccer ball). A white football on a dark green background will have jaggies (we're zoomed in to really appreciate the quality of normal-mapping and textures on this excellent ball!). If that ball has 12 polygon edges making up its circumference, 4 of those edges will be at 45 degrees and not exhibit jagginess. The other 8 are near-H/V edges and the jaggies will be obvious.

If instead the ball's outline consists of enough polys to make a near perfect sphere, the appearance of jaggies is limited to only near the top/bottom and left/right sides.

Stepped lines are worse culprits than stepped curves.

The only way more polys = more aliasing is if those polys are used for wildly contrasting textured/lit objects. 1000 green orcs of 10,000 polys each, in green armour on a green hill, are going to have less noticeable aliasing than 10 jesters of 1,000 polys, all in brightly and differently coloured gear, in a grey castle.
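
A little illustrative script along the lines of the football example (the 15-degree "near horizontal/vertical" threshold is an arbitrary assumption):

Code:
# For a regular N-gon outline, count the edges whose direction is within
# 15 degrees of horizontal or vertical (the ones with obvious stair-steps).
# The 15-degree threshold is an arbitrary assumption.
def near_axis_edges(n, threshold=15.0):
    near = 0
    for i in range(n):
        # Direction of edge i (mod 90 degrees), with one vertex at angle zero.
        angle = ((i + 0.5) * 360.0 / n) % 90.0
        if angle <= threshold or angle >= 90.0 - threshold:
            near += 1
    return near

for n in (12, 120):
    near = near_axis_edges(n)
    print(f"{n}-gon: {near} of {n} edges near-horizontal/vertical ({near / n:.0%})")
# 12-gon: 8 of 12 (67%); 120-gon: 40 of 120 (33%). More polys shrink both
# the length of each step and the share of the outline where steps show.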
 