Is free AA really worth it?

LOL

[image attachment: 11170462352.jpg]
 
That image is subjective as hell!

The first thing I noticed was crappy lighting, textures, polygons and BAD ART before I noticed the aliasing! :p
 
scooby_dooby said:
seismologist said:
For me, the lack of AA is barely noticeable at HD resolution.
So couldn't that die area be used for something more useful?

What games are you playing and how big is your TV?

Current HD games are, for the most part, a complete mess as far as aliasing goes.

What HD games? There's like 3 in existence, unless you count PC games outputted to a TV.

If by HD resolution he meant 1080p, though..

I believe 1080p is only slightly higher res than 1600x1200.

1920x1080 = 2,073,600 pixels
1600x1200 = 1,920,000 pixels

Based on what I've done in Counter-Strike: Source, I'd say that 2x AA is all that's needed at 1600x1200, and above that AA isn't really needed for current games.

That, and I've found that not all screens show aliasing as noticeably; my LCD tends to have far less noticeable jaggies (and the giant pixel effect isn't as bad for low-res games) than my CRT.

1x should be enough I would think. Agreed?

1x would be no AA. That would just be one sample of the image.

Personally, I often don't bother with AA in PC games unless I can get it all. Using a lower level of AA, like 2x, may eliminate the aliasing on some parts of the image, but it just makes it even more noticeable on the parts that don't have it.
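A rough way to see why a low sample count only half-fixes edges: with N coverage samples a pixel sitting on a polygon edge can only take N+1 distinct blend levels, so 2x buys you exactly one in-between shade. This is a toy sketch, not any real hardware's sample pattern:

Code:
def edge_levels(samples):
    # Possible coverage fractions for a pixel crossed by a polygon edge.
    return [i / samples for i in range(samples + 1)]

for n in (1, 2, 4):
    print(f"{n}x: {edge_levels(n)}")

# 1x: [0.0, 1.0]                   -> pixel is fully on or off (hard jaggies)
# 2x: [0.0, 0.5, 1.0]              -> a single in-between shade
# 4x: [0.0, 0.25, 0.5, 0.75, 1.0]  -> a much smoother ramp along the edge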

BTW, why is it only now that we're getting HD res? PC games have been running at above 720p since the Dreamcast came out.
 
Fox5 said:
BTW, why is it only now that we're getting HD res? PC games have been running at above 720p since the Dreamcast came out.


er.... you really have to ask that? How about.... HDTV penetration into the market. Oh and try the memory restriction. ;)
 
digitalwanderer said:
Yup, in motion those little jaggies turn into that annoying edge-flickering making it more noticeable for me and not less.
Listen to this guy. Everything that flickers, crawls or otherwise results from undersampling the environment needs to be eliminated, and viewing things in motion only makes it stick out more than in a screenshot. MSAA does wonders for the stability of the image.
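A toy sketch of that edge-flicker: imagine a near-horizontal edge drifting a fraction of a pixel per frame. With one sample per pixel the shade snaps between on and off from frame to frame, while averaging four evenly spaced sub-samples (a made-up pattern, just for illustration) steps far more gently:

Code:
def shade(y_edge, sub_samples):
    # Fraction of evenly spaced vertical sub-samples that fall below the edge.
    positions = [(i + 0.5) / sub_samples for i in range(sub_samples)]
    return sum(1 for p in positions if p < y_edge) / sub_samples

for frame in range(6):
    y_edge = 0.3 + frame * 0.1  # edge drifts 0.1 pixel per frame
    print(f"frame {frame}: 1x={shade(y_edge, 1):.2f}  4x={shade(y_edge, 4):.2f}")

# The 1x column jumps straight from 0.00 to 1.00 between frames (that's the
# pop/crawl you notice in motion); the 4x column moves in smaller steps.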
 
Alstrong said:
Fox5 said:
BTW, why is it only now that we're getting HD res? PC games have been running at above 720p since the Dreamcast came out.


er.... you really have to ask that? How about.... HDTV penetration into the market. Oh and try the memory restriction. ;)

The Saturn, I believe, was running at 800x600 all those years ago and outputting to a low-res TV; supposedly it looked better than 640x480 output to the TV.
 
Fox5 said:
The Saturn, I believe, was running at 800x600 all those years ago and outputting to a low-res TV; supposedly it looked better than 640x480 output to the TV.
800x600 does look better outputted to the TV than 640x480 if I recall; last time I played around with that I had a Rage Fury (non-Pro).

It looks better, but still like shite....I just can't get over how bloody big TV pixels are, I can't watch 'em anymore. :oops:
 
Inane_Dork said:
digitalwanderer said:
Yup, in motion those little jaggies turn into that annoying edge-flickering making it more noticeable for me and not less.
Listen to this guy. Everything that flickers, crawls or otherwise results from undersampling the environment needs to be eliminated, and viewing things in motion only makes it stick out more than in a screenshot. MSAA does wonders for the stability of the image.

Listen to this guy?

digitalwanderer
Democratically Voted Village Idiot.


Joined: 19 Feb 2002
Posts: 15242
Location: Highland, IN USA


I didn't vote but... ;)

Anyway, it's subjective... I grew up on super-scaling blocky sprites on SEGA arcades and to me ART > ALL GRAPHICS.

But technically it's still subjective. It's the 'whole' package, i.e. lighting, animation, textures, polygons, anti-aliasing, colour palette etc. that gets my eyes 'popping'. Look no further than Wind Waker or Okami...
 
Jaws said:
I grew up on super-scaling blocky sprites on SEGA arcades and to me ART > ALL GRAPHICS.
Dude, I grew up on Pong but that don't mean I can't grow and acquire higher standards.

Art is only good if the graphics show it off well for me.
 
seismologist said:
Maybe not ideal from a raw performance standpoint. Maybe I'm wrong, I haven't really read much about this, but I assumed that if this GPU design was so ideal it would be adopted by future PC GPUs. I haven't heard anything about ATI moving to eDRAM in their PC product line yet.

Oh I agree that AA is noticeable at HD resolution but to what extent? Whenever I have to make a tradeoff for performance in any of my games AA is usually the first thing to go.


ATI won't use eDRAM in a PC part because of all the different resolutions available, while the 360 is a fixed console platform targeting only 720p. To render games at 1600x1200 would require a lot more eDRAM to have enough memory for 4x AA.

Next-gen games are going to tax GPUs a lot harder, obviously. Polygon counts are going to increase, so I seriously doubt an increase in resolution by itself is going to help image quality satisfactorily enough as far as aliasing is concerned.
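A back-of-the-envelope illustration of that sizing argument, assuming a plain 32-bit colour plus 32-bit depth/stencil per sample (real formats, compression and tiling will change the absolute numbers):

Code:
def fb_megabytes(width, height, samples, bytes_per_sample=8):
    # 4 bytes colour + 4 bytes depth/stencil per sample (assumed, see above)
    return width * height * samples * bytes_per_sample / (1024 * 1024)

for name, (w, h) in {"720p": (1280, 720), "1600x1200": (1600, 1200)}.items():
    print(f"{name} with 4x AA: ~{fb_megabytes(w, h, 4):.1f} MB")

# 720p with 4x AA:      ~28.1 MB
# 1600x1200 with 4x AA: ~58.6 MB  (roughly double the buffer for the PC res)

Either way the fixed 720p target makes the eDRAM budget predictable, and the 360 reportedly renders in tiles so the whole multisampled buffer never has to sit in eDRAM at once; a PC part would have to size for whatever resolution the user picks.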
 
digitalwanderer said:
Jaws said:
I grew up on super-scaling blocky sprites on SEGA arcades and to me ART> ALL GRAPHICS.
Dude, I grew up on Pong but that don't mean I can't grow and acquire higher standards.

Art is only good if the graphics show it off well for me.

Exactly my point. It's subjective because different people have different weightings/priorities.

So what FSAA setting is your Pong setup! :p

I actually grew up with Pac-Man and missed all that crazy Pong era... ;)
 
digitalwanderer said:
Fox5 said:
The Saturn, I believe, was running at 800x600 all those years ago and outputting to a low-res TV; supposedly it looked better than 640x480 output to the TV.
800x600 does look better outputted to the TV than 640x480 if I recall; last time I played around with that I had a Rage Fury (non-Pro).

It looks better, but still like shite....I just can't get over how bloody big TV pixels are, I can't watch 'em anymore. :oops:

From my experiences with PC to TV output...any res over 640x480 just makes things blurrier, perhaps similar to XGI's subsampling?
Text became impossible to read over 640x480.
 
Shifty Geezer said:
I'm arguing 1 and 2, but for semantics purposes with jvd. If ATi hadn't included eDRAM, they would have had another 100 million trannies for other 'stuff', which would have had other features - maybe another cluster of 16 ALUs for a fourth unified shader. Whether that would have been a better use of resources, I have no idea. I don't think anyone knows until the real-world practical performance of Xenos is known. But like I said, I think Xenos is a great design and I can't see anything wrong with it myself.
Little late to the party (fell asleep outside for 3 hours in the sun and burnt my back; it hurts a lot, so I lay on my tummy in the living room most of the day sleeping).

But what other stuff? They could have added FP32 HDR like the RSX has, but they wouldn't have had the bandwidth to run FP32. So they would have wasted those transistors there instead of using them in the eDRAM daughter die, which gives 4x FSAA free.


The Xenos is a custom chip from the ground up. A lot of useless features are gone from the chip. It's targeted mainly at one res, 720p, and at doing 4x FSAA with complex SM3.0 shaders with HDR applied.

I don't see where they wasted transistors on eDRAM, as it was the way they met those goals.

Useless transistors would be features that the Xenos would be unable to use effectively in games or that wouldn't be of use: video accelerator hardware, precision formats that are too demanding on the actual hardware, or other things included in the chip that it couldn't use, other throwbacks from the PC.
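On the FP32 point above, a rough sketch of why higher-precision HDR targets eat bandwidth so quickly: bytes per pixel for an RGBA colour target at different precisions, multiplied by a fill-rate figure that is purely an assumption for the example, not a spec of either chip:

Code:
FILL_GPIX = 4.0  # assumed blended fill rate, for illustration only

for name, bits_per_pixel in (("8-bit RGBA", 32), ("FP16 RGBA", 64), ("FP32 RGBA", 128)):
    bytes_px = bits_per_pixel // 8
    # blending reads and writes the target, so roughly double the traffic
    gbps = FILL_GPIX * bytes_px * 2
    print(f"{name}: {bytes_px:3d} B/px -> ~{gbps:.0f} GB/s of colour traffic")

# 8-bit RGBA:   4 B/px -> ~32 GB/s
# FP16 RGBA:    8 B/px -> ~64 GB/s
# FP32 RGBA:   16 B/px -> ~128 GB/s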
 
Brimstone said:
seismologist said:
Maybe not ideal from a raw performance standpoint. Maybe I'm wrong, I haven't really read much about this, but I assumed that if this GPU design was so ideal it would be adopted by future PC GPUs. I haven't heard anything about ATI moving to eDRAM in their PC product line yet.

Oh I agree that AA is noticeable at HD resolution but to what extent? Whenever I have to make a tradeoff for performance in any of my games AA is usually the first thing to go.


ATI won't use eDRAM in a PC part because of all the different resolutions available, while the 360 is a fixed console platform targeting only 720p. To render games at 1600x1200 would require a lot more eDRAM to have enough memory for 4x AA.

Next-gen games are going to tax GPUs a lot harder, obviously. Polygon counts are going to increase, so I seriously doubt an increase in resolution by itself is going to help image quality satisfactorily enough as far as aliasing is concerned.

Considering AA is one of the main features people go for with more powerful graphics cards, and one of the few ways the performance of the latest graphics cards can be spent in new games, it would be money well spent. 16MB of eDRAM for a framebuffer would go a long way towards AA performance, and they wouldn't need to pair it with a faster chip. Certainly if a graphics card is going to cost an extra couple hundred dollars, they can add some eDRAM to it.
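For scale, a quick sketch of what a hypothetical 16MB pool covers under the same rough assumptions as earlier (8 bytes per sample, 4x AA, no tiling or compression counted):

Code:
EDRAM_MB = 16          # the hypothetical pool from the post above
SAMPLES = 4
BYTES_PER_SAMPLE = 8   # 4 B colour + 4 B depth/stencil, assumed as before

for w, h in [(640, 480), (1024, 768), (1280, 720), (1280, 1024), (1600, 1200)]:
    mb = w * h * SAMPLES * BYTES_PER_SAMPLE / (1024 * 1024)
    verdict = "fits" if mb <= EDRAM_MB else "needs tiling/compression"
    print(f"{w}x{h}: {mb:5.1f} MB -> {verdict}")

# Only 640x480 (~9.4 MB) fits whole at 4x under these assumptions; HD and
# typical PC resolutions would need tiling or compression to use a pool
# this size, which loops back to the multi-resolution problem mentioned above.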
 
I can't believe people are still talking like they're going to see games in 1080p. They may be upscaled to 1080p from 720p or 480p, but no developer in their right mind is going to target their game to run at a rez that far less than 0.01% of the population can even view.



I went to Ultimate Electronics today; they had 1 1080p TV for sale out of the 300+ TVs on display and it was $13,000. Best Buy and CC had none. Gimme a break.
 
wireframe said:
This is impossible with the Xenos. Thinking about it, I came to the conclusion that this may be a problem in later parts of the lifecycle of the Xbox 360. Whereas the PS3 can start switching off the AA for more raw performance, the Xenos cannot, and doing so would just make 1/3rd of the GPU idle (the eDRAM's 100M transistors) (well, not really... hyperbole)

Wireframe - I'm not sure that's true. The 192 high-speed SIMD units on the eDRAM can be used for anything... plus the Memexport function... I think that MS just aims for 720p with 4xAA, but if devs find a novel use and can bring it to life on the screen MS wouldn't balk...
 