Let's talk FSAA on next-generation consoles, OK?

london-boy said:
I THINK that at 720p there will not be a big need for AA; it would only make things look blurry. It would look like GC games compared to the sharpest PS2 games... I personally prefer the sharp look, not the blurry look, and at 720p the screen will be detailed and sharp enough not to need a lot of AA, which could result in blurring the image and losing detail. But maybe that's just me...
Personally, I'm convinced that, at least in Europe, there will be NO support for 720p AT ALL... so I'll have to make do with 480p with AA :rolleyes: Call me pessimistic...

Well, multisampled AA only blends polygon edges. But then again, considering the kind of polycounts we're talking about for next gen... hehehe :)

I don't think AA makes things blurry at all... except for text. Text does get blurred... but a decent AA algo like 3dfx's or ATi's should be programmable to allow for special cases. For example, Serious Sam has specific support for 3dfx's T-Buffer which tells the VSAs not to supersample text. :)
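To make that distinction concrete, here's a rough Python sketch of how one edge pixel resolves under the two schemes (toy colours and a made-up 4-sample pattern of my own, not any particular chip's hardware):

[code]
# Toy resolve of a single pixel straddling a polygon edge.
# MSAA: the polygon is shaded ONCE for the pixel, and that one colour is
#       weighted by how many coverage samples it owns, so texture detail
#       inside polygons is untouched and only the edge gets blended.
# SSAA: every sample is shaded independently, so textures are supersampled too.

def msaa_resolve(fg_colour, bg_colour, covered, total):
    """Blend one shaded foreground colour against the background by coverage."""
    w = covered / total
    return tuple(w * f + (1 - w) * b for f, b in zip(fg_colour, bg_colour))

def ssaa_resolve(sample_colours):
    """Average independently shaded samples."""
    n = len(sample_colours)
    return tuple(sum(c[i] for c in sample_colours) / n for i in range(3))

red, blue = (1.0, 0.0, 0.0), (0.0, 0.0, 1.0)
print(msaa_resolve(red, blue, covered=3, total=4))   # 4x MSAA, 3 of 4 samples covered
print(ssaa_resolve([red, red, red, blue]))           # 4x SSAA, same geometric coverage
[/code]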
 
Don't confuse implementation with principle... if you really think that, say, X-Men 2 (a random movie pick with lots of CGI) would have looked better without anti-aliasing, you should consider informing yourself better.
 
MfA said:
Don't confuse implementation with principle... if you really think that, say, X-Men 2 (a random movie pick with lots of CGI) would have looked better without anti-aliasing, you should consider informing yourself better.


Talking to me? Of course it would have looked worse, assuming the resolution wasn't high enough... I think that at a high enough resolution, AA becomes irrelevant... that's why big movies like Lord of the Rings or the new Star Wars movies are filmed at an insane resolution (can't remember exactly what res it is)...
 
CG in movies is done at an insanely high resolution with 16x stochastic SSAA. You'd be crazy to think that Lord of the Rings or Star Wars would render scenes without using any antialiasing at all.
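For anyone wondering what "stochastic" buys you, here's a minimal Python sketch of jittered supersampling of one pixel on a hard edge (a toy scene of my own, nothing to do with any studio's actual renderer):

[code]
import random

# 16x stochastic supersampling of one pixel that straddles an edge at x = 0.3.
# Randomly jittered sample positions turn regular stair-step aliasing into
# noise, which the eye tolerates far better than an ordered-grid pattern.

def shade(x, y):
    """Toy scene: white to the left of the edge, black to the right."""
    return 1.0 if x < 0.3 else 0.0

def stochastic_pixel(px, py, samples=16):
    total = 0.0
    for _ in range(samples):
        sx = px + random.random()   # jittered position inside the pixel
        sy = py + random.random()
        total += shade(sx, sy)
    return total / samples

print(stochastic_pixel(0, 0))   # roughly 0.3: a 30%-covered grey instead of a hard 0 or 1
[/code]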

Don't tell me that AA is irrelevant at high resolutions, especially given the sizes and resolutions available for TVs. You can see aliasing (albeit small) at 1280*960 on a 17" monitor; don't fool yourself into thinking aliasing will be completely invisible on a 55", 1280*720 HDTV set.

Moreover, what about all the people like me who aren't willing to pay thousands of bucks for an HDTV, especially since most shows and movies aren't HD anyway? I have a rather old but very big TV at home, and the jaggies on the Pillar of Autumn are the size of a small cockroach. Antialiasing is critically important for next-gen consoles.
 
BoddoZerg said:
CG in movies is done at an insanely high resolution with 16x stochastic SSAA. You'd be crazy to think that Lord of the Rings or Star Wars would render scenes without using any antialiasing at all.

Don't tell me that AA is irrelevant at high resolutions, especially given the sizes and resolutions available for TVs. You can see aliasing (albeit small) at 1280*960 on a 17" monitor; don't fool yourself into thinking aliasing will be completely invisible on a 55", 1280*720 HDTV set.

Moreover, what about all the people like me who aren't willing to pay thousands of bucks for an HDTV, especially since most shows and movies aren't HD anyway? I have a rather old but very big TV at home, and the jaggies on the Pillar of Autumn are the size of a small cockroach. Antialiasing is critically important for next-gen consoles.

Hey, cool it, sweetie, no need to go all "don't fool yourself" on me :LOL:
Of course on a 55" TV aliasing will always be a problem... I was saying that on my little 17" monitor I won't see it...
And anyway, as you said, you're going to be using a standard interlaced TV, and even with AA enabled the output will still be a crappy 480i. You can put in all the antialiasing you want, but it will still look like crap. Actually, at 480i it will only get blurrier and blurrier until it looks like an N64 game...
J/K
I'm exaggerating a little, but the principle is the same. 720p with no AA will ALWAYS look MUCH MUCH better than 480i with AA...
So your point doesn't make much sense... the jagged nature of 480i comes from the fact that it is a low-resolution INTERLACED signal, and AA won't change that... hell, like FAF said some time ago, you might as well just use a flicker filter, which will do more for 480i than plain AA.
 
I'm sure that at least 4x AA at 720p won't be a problem for next-gen hardware in terms of performance...

That's not good enough...

The PS2 didn't accomplish what was needed for Hollywood-esque CG... Sony said they'd deliver with PS3... to me, Hollywood-esque CG is characterized by high image quality more so than by effects...
 
I'm not so sure flicker filtering is a good idea on high-quality 100/120 Hz TVs... these should be able to reconstruct most of the vertical high-frequency content without introducing flicker (though if the flicker filter removes that content, it's gone for good).

Good AA, however, is always beneficial, and the more subpixel samples the better... in CGI they use as high a sampling ratio as they can afford, even with the high resolution of the display; that is why I brought up movies.
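For reference, a typical flicker filter is just a vertical blend of adjacent scanlines; here is a minimal Python sketch (assuming the common 1/4-1/2-1/4 weighting, though other weightings exist):

[code]
# Toy vertical flicker filter of the [1/4, 1/2, 1/4] kind used for interlaced
# output: each scanline is blended with its neighbours, which suppresses the
# one-line-high detail that would otherwise flicker at field rate, and, as
# noted above, that vertical detail is gone for good afterwards.

def flicker_filter(column):
    """Apply the 3-tap vertical blend to one column of luminance values."""
    out = []
    for i, v in enumerate(column):
        above = column[i - 1] if i > 0 else v
        below = column[i + 1] if i < len(column) - 1 else v
        out.append(0.25 * above + 0.5 * v + 0.25 * below)
    return out

scanlines = [0.0, 0.0, 1.0, 0.0, 0.0]   # a single bright scanline
print(flicker_filter(scanlines))         # the spike spreads over three lines:
                                         # no flicker, but the detail is lost
[/code]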
 
london-boy said:
Hey, cool it, sweetie, no need to go all "don't fool yourself" on me :LOL:
Of course on a 55" TV aliasing will always be a problem... I was saying that on my little 17" monitor I won't see it...
And anyway, as you said, you're going to be using a standard interlaced TV, and even with AA enabled the output will still be a crappy 480i. You can put in all the antialiasing you want, but it will still look like crap. Actually, at 480i it will only get blurrier and blurrier until it looks like an N64 game...
J/K
I'm exaggerating a little, but the principle is the same. 720p with no AA will ALWAYS look MUCH MUCH better than 480i with AA...
So your point doesn't make much sense... the jagged nature of 480i comes from the fact that it is a low-resolution INTERLACED signal, and AA won't change that... hell, like FAF said some time ago, you might as well just use a flicker filter, which will do more for 480i than plain AA.


You said: "i think that at a high enough resolution, AA becomes irrelevant... that's why big movies like Lord of the rings or the new star wars movies are filmed at an insane resolution" That's just stupidity. Unless you can get to a resolution higher than the human eye (you would need a 4000*3000 19" monitor) antialiasing will always be important.

Also, you keep saying that "AA makes things blurry." That's totally false. Multisampling does not have any effect on textures, and supersampling actually helps sharpen them. The only times AA is blurry are with nVidia's Quincunx AA, which uses a blur filter to smooth out jaggies, and with the old 3dfx Voodoo4/5 AA, due to driver bugs (improper LOD bias). If there is no blur filter and the drivers select the correct mipmap levels, MSAA should not affect "blurriness" at all, and SSAA will actually sharpen the image.
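On the LOD-bias point, one common rule of thumb (my sketch of the principle, not any vendor's actual driver code) is that supersampling lets you bias mipmap selection sharper by half of log2 of the sample count:

[code]
import math

# Matching mipmap LOD bias to supersampling: with N samples per pixel the
# texture footprint per sample shrinks, so mipmap selection can be biased
# sharper by 0.5 * log2(N). Get this wrong and the supersampled image ends up
# blurrier than it should be, which is the driver-bug case mentioned above.

def ssaa_lod_bias(total_samples):
    """Negative bias = sharper mipmaps. 4x SSAA -> -1.0, 16x SSAA -> -2.0."""
    return -0.5 * math.log2(total_samples)

for n in (2, 4, 16):
    print(n, "samples:", ssaa_lod_bias(n))
[/code]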

Of course a 480i image will never look very good; that's what HDTV is for. But for those of us who have old televisions, a high degree of FSAA is overwhelmingly preferable to no AA at all.


PS: AA can also blur text in some PC games. But that's an issue caused by PC games that aren't programmed to handle AA correctly, or by nVidia's Quincunx AA (which is just crap). Console games should not have this problem.
 
BoddoZerg said:
You said: "i think that at a high enough resolution, AA becomes irrelevant... that's why big movies like Lord of the rings or the new star wars movies are filmed at an insane resolution" That's just stupidity. Unless you can get to a resolution higher than the human eye (you would need a 4000*3000 19" monitor) antialiasing will always be important.

Also, you keep saying that "AA makes things blurry." That's totally false. Multisampling does not have any effect on textures, and supersampling actually helps sharpen them. The only times AA is blurry are with nVidia's Quincunx AA, which uses a blur filter to smooth out jaggies, and with the old 3dfx Voodoo4/5 AA, due to driver bugs (improper LOD bias). If there is no blur filter and the drivers select the correct mipmap levels, MSAA should not affect "blurriness" at all, and SSAA will actually sharpen the image.

Gosh, you're tough... :LOL:

I said (pretty much): AS A PRINCIPLE, at high enough resolutions AA becomes somewhat irrelevant. And 4000*3000 is exactly that principle. It ain't gonna happen, but that doesn't make my statement untrue.

Then I said: I don't really care, because 720p without AA on my monitor will ALWAYS look MUCH better than your 480i, even with 210393248x AA, on a big screen.

Okie dokies?
 
Some quick points:
- HDTVs can nearly be considered "mainstream" now. 32" 16:9 HDTVs can be had at Best Buy for less than $1000, and mid-sized projection HDTVs go for very fair prices as well (the $2000 range).

- HDTV content is readily available FOR FREE in nearly every US market. Federal law mandates that over-the-air networks broadcast digital signals at some level of strength this year, and this is being done everywhere. More than 3 out of 4 television shows in the "primetime" slot (evening hours) are broadcast FOR FREE, OVER THE AIR, in HDTV at this time. I watch them every night.

- As these two items (HDTV hardware and programming) come together, prices will drop like a rock and availability will spread even faster than they are now.

My point with the above? Don't write HDTV off as some eccentric, elitist technology that isn't available to the masses; it costs barely more than a nice standard television anyway... The idea that it's out of reach simply isn't true.

Second, I play most of my computer games at 1024x768 on a 17" monitor. There are WORLDS of difference between no AA and 4x AA on my Radeon 9500 PRO at this resolution and on this small screen. There are WORLDS of difference between simple texture filtering and maxed-out aniso filtering at this resolution.

The next step for console gaming is more lifelike rendering. It will NEVER happen without AA and very high-quality, intensive texture filtering. It's like putting an 800 HP engine in a car with 13" tires. What's the sense of being able to push tons of polygons and use super-pretty textures if the tiny polygons are "popping" and the edges of the big polys are jagged?

You may not notice it now because you know you're playing a game, and you expect game-type graphics. However, this is the area that next-gen consoles HAVE to get right!

I have my Xbox hooked up by component cables (so it's always 480p minimum) to my projector, which lights up a 120" diagonal 16:9 screen. I find the pixel and poly popping and jagged edges almost unplayable after playing games on my HTPC on the same screen with 4x and 6x FSAA, high aniso filtering, etc...

-Chris
 
I'd be curious to know to what extent the lens of our eye acts as a low-pass filter... if it effectively acts as such, you would have to go quite a bit beyond the resolution of our eyes before AA stopped being necessary (and even if it doesn't, you would of course have to use Poisson-disc sampling on the monitor, just like our eyes use, or take it into account and use even higher-resolution uniform sampling... I don't know whether the 4K*3K figure mentioned does this).
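For what it's worth, a Poisson-disc pattern is easy to sketch with naive dart throwing; here's a toy Python version (a rough illustration of the irregular-but-evenly-spread layout, not a model of the actual photoreceptor mosaic):

[code]
import random

# Naive "dart throwing" Poisson-disc sampler: accept a random point only if it
# keeps a minimum distance from every point accepted so far. The result is
# irregular (so aliasing turns into noise) but still evenly spread.

def poisson_disc(n_points, min_dist, tries=10000, seed=1):
    rng = random.Random(seed)
    points = []
    for _ in range(tries):
        if len(points) >= n_points:
            break
        x, y = rng.random(), rng.random()
        if all((x - px) ** 2 + (y - py) ** 2 >= min_dist ** 2 for px, py in points):
            points.append((x, y))
    return points

samples = poisson_disc(n_points=16, min_dist=0.18)
print(len(samples), samples[:3])
[/code]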

london-boy, you might simply not remember what you said... but you said something quite a bit different from that.
 
Then I said: I don't really care, because 720p without AA on my monitor will ALWAYS look MUCH better than your 480i, even with 210393248x AA, on a big screen.

Well, it's hard not to agree with that. It's just common sense, man. The point is, though, that if a next-gen console doesn't perform some serious AA on at least a 720p image, it'll be an embarrassment to image quality IMO, and I won't be wasting my time.

2005 is a couple of years away, and most people will have enjoyed some form of AA on their computers for many years by then. Playing only 1280x720 on a bigscreen TV with no AA will look terrible compared to the AVERAGE computer's ability to perform AA at that time...
 
covermye said:
Then I said: I don't really care, because 720p without AA on my monitor will ALWAYS look MUCH better than your 480i, even with 210393248x AA, on a big screen.

Well, it's hard not to agree with that. It's just common sense, man. The point is, though, that if a next-gen console doesn't perform some serious AA on at least a 720p image, it'll be an embarrassment to image quality IMO, and I won't be wasting my time.

2005 is a couple of years away, and most people will have enjoyed some form of AA on their computers for many years by then. Playing only 1280x720 on a bigscreen TV with no AA will look terrible compared to the AVERAGE computer's ability to perform AA at that time...


Yeah, of course... still, remember that there is NO HDTV in Europe, even now. I can't even say *it's too expensive*... it's just not here.
We have plasma displays, but they do PC resolutions... there is no 480p or 720p or whatever... You can only play progressive-scan games from the PS2 (at 480p, or 525p as they like to call it here), since progressive support was completely cut from the Xbox and GC... I wonder how they are going to fix that in a mere two years...
Wanna bet that I'll be playing games at an embarrassing 480p with AA while you people play at up to 1080i (or p)... how depressing is that... That's it, I'm moving to New York... :LOL:
 
covermye said:
Playing only 1280x720 on a bigscreen TV with no AA will look terrible compared to the AVERAGE computer's ability to perform AA at that time...

This entirely depends on the assumption that consumer-grade HDTVs will be built to the same performance standards as a computer monitor (thus all the comparisons between what you say you can see on your computer monitor and what you think you will see on an HDTV may not hold up). I don't see that as such a sure thing; otherwise we would be seeing people replacing their computer monitors with gigantic (relatively speaking) HDTV screens. I can almost assure you that an HDTV will not be adequate for high-resolution text the way a better computer monitor is, which implies the standards are different. For that reason, AA is quite effective on a computer monitor, but not invariably, equally effective on a larger HDTV screen.

Don't let HDTV format specs fool you. Just because an HDTV display is compatible with a particular HDTV format (of a particular resolution) doesn't guarantee that that will be the effective resolution you see on the output (for example, compare a Sony Trinitron monitor to some bargain-basement, no-name one: even though the supported resolutions are quite comparable, if not equivalent, you may find AA makes a difference on the former and is completely imperceptible on the latter). Depending on the quality and performance potential of the critical display components in a particular HDTV, you may get anything from lousy high-resolution output to outstanding high-resolution output (essentially a $40k computer monitor at TV size). Either way, both can display something when fed a 1024x768 signal.

I think the future of "filtering" enhancements lies not in doing the same ole AA techniques at higher resolutions. It will probably involve some sort of multisampling for textures and selective filtering specifically on polygon edges where overlapping foreground and background polys have high color contrast. This sort of adaptive blending should be somewhat "child's play" for the hardware resources available in next-gen designs. That would represent the "no expense spared" development project, of course. If none of that is done, and the output is just a direct manifestation of the game engine (at HDTV resolutions), I won't lose much sleep over it, either.
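A crude Python illustration of that "selective edge blending" idea (entirely my own toy version, not a description of any shipping or announced hardware): blend a pixel with its neighbours only where local contrast is high, and leave low-contrast texture interiors alone.

[code]
# Toy adaptive edge blend over a small greyscale framebuffer: pixels whose
# neighbours differ strongly (likely polygon edges) get blended; flat or
# gently varying areas (texture interiors) are left untouched.

def adaptive_blend(img, threshold=0.5):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neigh = [img[y - 1][x], img[y + 1][x], img[y][x - 1], img[y][x + 1]]
            if max(neigh) - min(neigh) > threshold:        # likely an edge
                out[y][x] = 0.5 * img[y][x] + 0.5 * sum(neigh) / 4
    return out

stairstep = [[1.0, 1.0, 1.0, 1.0],
             [1.0, 1.0, 0.0, 0.0],
             [1.0, 0.0, 0.0, 0.0],
             [0.0, 0.0, 0.0, 0.0]]
print(adaptive_blend(stairstep))   # the stair-step corners get softened
[/code]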
 
london-boy said:
You can only play progressive-scan games from the PS2 (at 480p, or 525p as they like to call it here), since progressive support was completely cut from the Xbox and GC...

Hey, just mod your PAL Xbox and you'll have it outputting 480/720/1080 without a problem.
 
Don't let HDTV format specs fool you. Just because an HDTV display is compatible with a particular HDTV format (of a particular resolution) doesn't guarantee that that will be the effective resolution you see on the output (for example, compare a Sony Trinitron monitor to some bargain-basement, no-name one: even though the supported resolutions are quite comparable, if not equivalent, you may find AA makes a difference on the former and is completely imperceptible on the latter).

I'm fully aware of the technology (or lack thereof) involved in displays of different quality. I'm also fully aware that NO MATTER WHAT, a next-generation console will be absolutely worthless in my opinion if its output isn't capable of AT LEAST 480p with high AA levels, period, with a preferred minimum of 720p with AA enabled.

I can't believe you'd argue otherwise, but you (and others) seem to be saying that this concern isn't that big of a deal to you...

I'll tell you what: the blocky, jagged lines in games like Halo, NFL 2K3, Mech Assault, etc. on Xbox are obvious whether I'm playing on one of my 32" interlaced standard-def TVs, my 32" HDTV, or my 120" 16:9 projector system. Limited resolution with no AA is obvious no matter what the display technology.

Granted, it looks worse on the standard TVs due to the interlacing. However, it's almost gross to go from playing some average-looking computer game like Ghost Recon at 1024x768 with AA on my projection system to then playing Halo. Blurry, blocky, jagged, popping. It's just depressing...
 
I am in complete agreement that the next consoles need tremendously better FSAA than the present generation of consoles,

both in the FSAA method and in the number of samples.

I am hoping for 16x FSAA, or at least 4x4 as a minimum (not plain 4x). If you are going to have CGI-quality visuals you need good anti-aliasing; otherwise you will have to wait for the next next gen, in 2010-2011.

480p needs 16x (4x4) FSAA; I suppose with 720p you could get away with less...
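One way to see why the sample count matters: the number of grey steps an edge can take within a pixel is tied to the sample grid. A quick Python count for the ordered-grid case (worst-case near-vertical edge only; sparse and rotated grids do better per sample):

[code]
# On an N x N ordered sample grid, a nearly vertical edge can only produce
# N + 1 distinct coverage levels per pixel, so the stair-step is divided into
# that many grey steps: 3 levels for 2x2 (4x), 5 levels for 4x4 (16x).

def ordered_grid_coverages(n):
    """Coverage fractions a vertical edge can hit on an n x n ordered grid."""
    return [cols / n for cols in range(n + 1)]

print("2x2 (4x): ", ordered_grid_coverages(2))
print("4x4 (16x):", ordered_grid_coverages(4))
[/code]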
 
covermye said:
I'll tell you what: the blocky, jagged lines in games like Halo, NFL 2K3, Mech Assault, etc. on Xbox are obvious whether I'm playing on one of my 32" interlaced standard-def TVs, my 32" HDTV, or my 120" 16:9 projector system. Limited resolution with no AA is obvious no matter what the display technology.

It remains to be seen how "truly horrible" this really is on a 32" standard TV or HDTV vs. how much of an issue you personally are making out of it. As for blowing up a 640x480 image to 120", what did you honestly expect??? It's a wonder "everybody" doesn't complain about this, since 120" projection systems are so prevalent these days. :p The notion of "keeping things reasonable" comes to mind.
 
Ug Lee said:
london-boy said:
You can only play progressive-scan games from the PS2 (at 480p, or 525p as they like to call it here), since progressive support was completely cut from the Xbox and GC...

Hey, just mod your PAL Xbox and you'll have it outputting 480/720/1080 without a problem.



Yeah. Well, that's not the point, is it? It's not like Sony or MS or Nintendo will release a next-gen console and then say *oh well, you can't use HDTV resolutions, but you can still mod it if you want*.
We're talking about MAINSTREAM here. Modding consoles, apart from being illegal, has got nothing to do with this.
 
randycat99 said:
It remains to be seen how "truly horrible" this really is on a 32" standard TV or HDTV vs. how much of an issue you personally are making out of it. As for blowing up a 640x480 image to 120", what did you honestly expect??? It's a wonder "everybody" doesn't complain about this, since 120" projection systems are so prevalent these days. :p The notion of "keeping things reasonable" comes to mind.

First, I'd ask you to define "reasonable." 55" 16:9 HDTVs can be had for a very reasonable price, IMO, at very mainstream establishments nowadays (Best Buy, Circuit City, etc.). The size of the average television is currently growing, at least in the US. Medium-sized projection televisions are half the price they were just a couple of years ago, and most are now HD capable.

Personally making an issue out of this? I really don't think so. The next-gen consoles are coming out in a year or two, and will be designed to be played for probably 5+ years after that... that's 7 or more years into the future that these consoles will have to remain serviceable. Now, Europe aside, any "reasonable" person would tell you that HDTVs will be pretty darned common and mainstream by that timeframe. Do you want to be limited to 480p, with limited AA and texture filtering, at that point? I know I don't...

We're not talking about big $$$ to make this (higher res, good AA) happen, people. What will memory cost by the time these new consoles ship? The graphics chip will have hellacious fillrate, I can only assume, compared to the output resolution required. All we're lacking is a little more memory and decent (but not outrageous) bandwidth.
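For a rough sense of scale, here's a back-of-envelope in Python for the framebuffer cost of 720p with 4x multisampling (my assumptions: 32-bit colour, 32-bit Z/stencil, multisampled colour and depth plus two resolved display buffers):

[code]
# Back-of-envelope framebuffer budget for 1280x720 with 4x multisampling.
width, height, samples = 1280, 720, 4
bytes_per_sample = 4 + 4                       # 32-bit colour + 32-bit Z/stencil

multisample_buffer = width * height * samples * bytes_per_sample
display_buffers    = 2 * width * height * 4    # resolved front + back buffer
total_mb = (multisample_buffer + display_buffers) / (1024 * 1024)
print(round(total_mb, 1), "MB")
# Roughly 35 MB of storage, which is small change next to the RAM budgets being
# talked about for 2005-era consoles; the harder part is the bandwidth needed
# to touch all those samples every frame.
[/code]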

I just see the law of diminishing returns kicking in on any other "eye candy" improvements until this obvious bottleneck is alleviated.
 