Is free AA really worth it?

digitalwanderer said:
seismologist said:
For me, the lack of AA is barely noticeable at HD resolution.
Methinks you're in a very small minority on that. :?
I kinda think so too... here is a screenshot of Top Spin 2 running on an Xbox 360 alpha kit with no AA. This shot is in full 720p resolution:
top-spin-2-20050516112630214.jpg

[source: http://xbox360.ign.com/articles/615/615667p1.html
more direct link: http://media.xbox360.ign.com/media/748/748409/img_2815189.html ]
 
Qroach said:
Quite honestly, I believe the real reason MS wanted an eDRAM design is that the XGPU's biggest weakness was transparent fillrate.

Do you think this weakness will be evident on the Nvidia chip in the PS3?

It may very well be a weakness on the RSX, as it was a weakness on the NV2A compared to the PS2's GS because of its eDRAM. But the RSX is in a more flexible environment than the NV2A ever was... so the weakness may not be as apparent when it comes to particle effects and transparencies...

OTOH, this is a Xenos strength, so I'm sure devs will exploit this and try to make it apparent...
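For anyone wondering why transparencies get singled out: every alpha-blended layer is a read-modify-write on the framebuffer, so particle-heavy scenes multiply bandwidth cost rather than shading cost. A back-of-the-envelope sketch; the overdraw depth and frame rate are illustrative assumptions, not measurements:

```python
# Rough framebuffer traffic for alpha-blended overdraw at 720p.
# All figures are illustrative assumptions, not measured specs.
WIDTH, HEIGHT = 1280, 720
BYTES_PER_PIXEL = 4        # 32-bit color
OVERDRAW_LAYERS = 8        # assumed depth of particle overdraw
FPS = 60

# Each blended layer reads the destination pixel and writes it back.
bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL * OVERDRAW_LAYERS * 2
print(f"{bytes_per_frame * FPS / 1e9:.1f} GB/s of pure blend traffic")  # ~3.5
```

On the GS (and on Xenos) that traffic stays inside the eDRAM; on the NV2A (and presumably the RSX) it competes with textures, geometry, and the CPU for the external bus.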
 
Titanio said:
If by HD resolution he meant 1080p, though..


Well, the discussion was focused on the Xbox, which doesn't display 1080p anyway. However, I still think it's necessary based on my 16x12 PC gaming experience.
 
SanGreal said:
Titanio said:
If by HD resolution he meant 1080p, though..


Well, the discussion was focused on the Xbox, which doesn't display 1080p anyway. However, I still think it's necessary based on my 16x12 PC gaming experience.

Thought I heard it could with a little more effort (tiling)...?
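Tiling comes up because of the 10 MB eDRAM budget. A quick sanity check, assuming the usual 4 bytes of color plus 4 bytes of depth/stencil per sample (my assumption for illustration, not a spec sheet):

```python
# Does a render target fit in Xenos' 10 MB eDRAM, or does it need tiles?
# Assumes 4 bytes color + 4 bytes depth/stencil per sample (illustrative).
EDRAM_BYTES = 10 * 1024 * 1024

def tiles_needed(width, height, msaa_samples):
    bytes_per_sample = 4 + 4                    # color + depth/stencil
    target_size = width * height * msaa_samples * bytes_per_sample
    return -(-target_size // EDRAM_BYTES)       # ceiling division

print(tiles_needed(1280, 720, 1))   # 1 -> 720p, no AA: fits in one tile
print(tiles_needed(1280, 720, 4))   # 3 -> 720p, 4xAA: needs three tiles
print(tiles_needed(1920, 1080, 1))  # 2 -> 1080p, no AA: still needs two
```

So even 1080p with no AA wouldn't fit in the eDRAM in one go, hence "a little more effort."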
 
While that pic of Top Spin 2 clearly does have jaggies, look at how barely noticeable they are. Only on a few surfaces (such as the net or the top stair) is the aliasing really noticeable.

My point is that when the resolution is bumped up to 1080p, those minimally noticeable jaggies will become even less apparent than they are now. Is 4xAA on a 1080p image really necessary? I think it's complete overkill.

2x, or possibly even only 1x, should be enough, I would think. Agreed?
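For reference, the raw pixel arithmetic behind that argument (nothing assumed here beyond the standard resolutions):

```python
# Pixel counts at the resolutions under discussion, relative to 720p.
resolutions = {"480p": (640, 480), "720p": (1280, 720), "1080p": (1920, 1080)}
base = 1280 * 720
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>9,} pixels ({pixels / base:.2f}x 720p)")
# 1080p has 2.25x the pixels of 720p, so each stair-step shrinks by only
# a factor of 1.5 linearly -- smaller, but it doesn't disappear.
```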
 
You call that "barely noticeable"???

That's horrible! Next gen games should not look like that crap.

Maybe 4xAA is overkill at 1080p, but how many games will actually output at 1080p?

And how come modern games still suffer from aliasing at 1600x1200 on tiny little 19" monitors? You really think it won't be bad on a 50" big screen?

ALL I want this gen is smooth lines and round tires!!! I'll be totally disappointed if I see jaggies on ANY next-gen games.
 
Titanio said:
SanGreal said:
Titanio said:
If by HD resolution he meant 1080p, though..


Well, the discussion was focused on the Xbox, which doesn't display 1080p anyway. However, I still think it's necessary based on my 16x12 PC gaming experience.

Thought I heard it could with a little more effort (tiling)...?

This is the official word:
Xbox 360 does not support 1080p at this time. It's an incremental improvement at an astronomical expense, and we don't see consumers clamoring for 1080p TVs yet. We will continue to evaluate the market and deliver the capability when and if customers want it.

http://interviews.teamxbox.com/xbox/1190/Xbox-360-Interview-Todd-Holmdahl/p1/



I tried to take some screenshots of BF2 at 16x12 to demonstrate how bad the jaggies can be. Here are some better 720p X360 examples, though:

http://media.teamxbox.com/games/ss/1160/full-res/1116057256.jpg



Check out how bad the jaggies are on this shot at way over 1080p:
http://media.teamxbox.com/games/ss/1217/full-res/1117046235.jpg
 
scooby_dooby said:
seismologist said:
For me, the lack of AA is barely noticeable at HD resolution.
So couldn't that die area be used for something more useful?

What games are you playing, and how big is your TV?

Current HD games are, for the most part, a complete mess as far as aliasing goes.

50" TV I sit about 6 ft away. Mostly playing PC games like Tracmania Sunrise, BF2, HL2..

but I run them all in 1080i so maybe that's helps against aliasing effect.
 
SanGreal said:
Check out how bad the jaggies are on this shot at way over 1080p:
http://media.teamxbox.com/games/ss/1217/full-res/1117046235.jpg

I'm not sure what game that is, but it certainly doesn't look next-gen to start with ;)

I'm basing my opinion on how Heavenly Sword looked at E3 at 1080p with no AA - though I'm sure it had other post-processing going on, compared to the above game, to help reduce the perception of aliasing, e.g. motion blur, DOF, etc.

Put it this way: I could excuse no AA at 1080p; at 720p I'd expect it.
 
Oda said:
While that pic of Top Spin 2 clearly does have jaggies, look at how barely noticeable they are. Only on a few surfaces (such as the net or the top stair) is the aliasing really noticeable.

My point is that when the resolution is bumped up to 1080p, those minimally noticeable jaggies will become even less apparent than they are now. Is 4xAA on a 1080p image really necessary? I think it's complete overkill.

2x, or possibly even only 1x, should be enough, I would think. Agreed?
Yes, I agree that at 1080p, AA will not be as important as at 720p... I do believe that many developers making games for PS3 will likely choose to use 1080p with no AA (or perhaps only some transparency AA), while other developers will choose to use 720p with some MSAA...
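One numerical way to frame that 1080p-no-AA vs. 720p-4xMSAA trade-off is to count coverage samples, since those are what smooth edges (MSAA still shades only once per pixel). A quick sketch:

```python
# Edge-coverage samples per frame for the two options being weighed.
def coverage_samples(width, height, msaa):
    return width * height * msaa

print(f"1080p, no AA : {coverage_samples(1920, 1080, 1):,}")  # 2,073,600
print(f"720p, 4xMSAA : {coverage_samples(1280, 720, 4):,}")   # 3,686,400
# 720p with 4xMSAA resolves ~1.8x more samples along edges, even though
# it shades fewer pixels (921,600 vs 2,073,600).
```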
 
Titanio said:
SanGreal said:
Check out how bad the jaggies are on this shot at way over 1080p:
http://media.teamxbox.com/games/ss/1217/full-res/1117046235.jpg

I'm not sure what game that is, but it certainly doesn't look next-gen to start with ;)
Hehe, I think we'll have to introduce an official B3D next-gen graphics approval award. ;)

But seriously, that's Saints Row, I think. That screenshot doesn't look particularly impressive, but it's already way more detailed than any comparable game this gen. It's meant to be a GTA clone, including the huge open-ended cities; you'll have less detail in this kind of game than in more tightly enclosed game environments.

Have a look at the gameplay video over at http://www.xboxyde.com - if you can still say this game doesn't look next-gen after that, then I don't know how to help you. It's easily 10x as detailed as San Andreas, has full physics on everything, and seems to run buttery smooth already...
 
If you think the aliasing is barely noticeable, then I suggest going to an optometrist to get your eyes checked out. One only has to look at the character to realize it would end up even worse in motion.
 
The 'free' AA in Xenos is good for two things:

1) It alleviates bandwidth demands on the main video memory (which, in the Xbox 360's case, is shared with the CPU) and therefore makes a lower-bandwidth memory interface acceptable (128-bit vs. 256-bit); see the sketch after this list.

2) AA improves visual quality at any resolution (I don't think anyone can argue against that), and this helps developers design games that use it. They don't have to think about AA: just design the game so it runs properly without AA, and the AA should have zero impact (be free).
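To put a rough number on point 1, here is a minimal sketch of the color/z traffic that 4xMSAA would otherwise push through external memory. The overdraw figure and per-sample costs are my illustrative assumptions, not official specs:

```python
# Framebuffer traffic that 4xMSAA keeps inside the eDRAM instead of
# sending over the shared 128-bit bus. Figures are illustrative.
WIDTH, HEIGHT, FPS = 1280, 720, 60
SAMPLES = 4                       # 4xMSAA
BYTES_COLOR, BYTES_DEPTH = 4, 4
OVERDRAW = 3                      # assumed average overdraw

# Per covered sample: a depth read, a depth write, and a color write.
traffic = WIDTH * HEIGHT * SAMPLES * OVERDRAW * (2 * BYTES_DEPTH + BYTES_COLOR)
print(f"~{traffic * FPS / 1e9:.0f} GB/s absorbed by the eDRAM")  # ~8 GB/s
```

Against the roughly 22 GB/s of the console's 128-bit GDDR3 bus, keeping that traffic inside the eDRAM is a very large saving, which is presumably the whole point of the design.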

However, "nothing is free" comes to mind and there is at least one drawback to designing your hardware this way.

By locking AA in, you are spending transistors (i.e. resources/cost) in a very specific way. Those transistors cannot be redirected to scale general performance. Xenos/C1 is the bottleneck, and it is now 100M transistors lighter than it could have been without the eDRAM expenditure. I really doubt this will be a problem, but one can imagine situations where a developer wants to push the hardware to new levels and sacrifice AA to get there. That trade is impossible on Xenos. Thinking about it, I came to the conclusion that this may become an issue late in the Xbox 360's lifecycle: whereas PS3 developers can start switching off AA for more raw performance, Xenos developers cannot, and doing so would just leave about a third of the GPU (the ~100M eDRAM transistors) idle (well, not really... hyperbole).

I believe eDRAM has typically been very helpful in devices like these. The PS2 can produce some nice visuals with what otherwise seems like very inferior hardware. However, times change, and I'm sure it's best to judge on final hardware and, more importantly, software. It may also be that eDRAM helped most with things like particle effects and 2D overlays/sprites, but those days are numbered. You don't want some fast, good-looking 2D fog these days; you want volumetric fog calculated on the GPU.
 
Yup, in motion those little jaggies turn into that annoying edge-flickering, making them more noticeable for me, not less.
 
You have to consider that a TV is not going to give you the image quality you see in developer captures. TVs bleed, and even the lower-resolution interlaced garbage we know as standard TV helps hide some of these artifacts. I'm not saying AA is not useful on a TV, but you are not going to see the fine lines you see in developer shots on a TV. Imagine stretching those captures to cover the size of your TV screen, for starters.
 