X360 and Anti Aliasing?

Keep your beloved Xbox and stop your marginalizing game; why act surprised that ATI will base their marketing claims off the best lap result?
 
I know, seriously :cry:

I've said it before - on multiple forums - and I'll say it again: Xenos is an awesome design, and the way ATI and Microsoft worked in the eDRAM w/the tiling - very very smart.

I'm not trying to denigrate or marginalize Xenos or its eDRAM in the least.

I am, however, consistently upset at MS through the years for making claims that allow for only one possibility when in fact they should have been more flexible from the outset (and it's primarily not the console space I'm talking about). So this whole thing with mandated AA (and the hubris associated with its being free), and then the exemptions from the mandate: to me, it just should have been optional all along. Let devs do what they will with it, and in the end that's where the innovation that will sell games will come from, not from a universal AA marketing push.
 
scooby_dooby said:
Ya, but a lot of your comments were pretty ridiculous, like saying minor upscaling would reduce jaggies

I'd say upscaling will make it worse.
 
More polygons in nextgen will not be spent only on more detailed models - I expect a serious increase in general scene complexity, including the number of objects. Lots and lots more polygon edges to antialias.
Also, in my experience, the more realistic an image looks (shading, lighting, complexity) the more disturbing aliasing artifacts will be. I want my AA for nextgen games.

Also, I expect several 2nd - 3rd gen games to use 4x AA on the Xbox 360.


And Shifty, don't base your judgement on a GF4's AA. Xenos has 1. rotated grid sampling patterns and 2. gamma correction as well, which make a huge difference.
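For a rough feel for why the rotated grid matters, here's a small Python sketch (the sample offsets below are the commonly quoted 4x ordered/rotated patterns, not necessarily Xenos' exact ones): sweep a vertical edge across one pixel and count how many distinct coverage levels each pattern can report.

Code:
# Hypothetical 4x sample offsets within one pixel: ordered grid vs. rotated grid.
# These are the commonly quoted patterns, not confirmed Xenos positions.
ordered = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
rotated = [(0.125, 0.625), (0.375, 0.125), (0.625, 0.875), (0.875, 0.375)]

def coverage_levels(pattern):
    # Sweep a vertical edge across the pixel and collect the distinct
    # sample counts (coverage levels) the pattern can produce.
    edges = [i / 100 for i in range(101)]
    return sorted({sum(x < e for x, _ in pattern) for e in edges})

print(coverage_levels(ordered))  # [0, 2, 4]        - only 3 steps on a near-vertical edge
print(coverage_levels(rotated))  # [0, 1, 2, 3, 4]  - 5 steps, so smoother gradients

The gamma correction then makes those intermediate steps blend the way your eye expects, instead of the edge pixels coming out too dark.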
 
Laa-Yosh said:
I'd say upscaling will make it worse.
Try it for yourself and you will see that this isn't always the case. A decent upscale can smooth the aliasing out a bit; you can see this effect simply by resizing a screenshot with an image editing program like Photoshop.
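If you'd rather script it than do it by hand, a quick Python/Pillow sketch does the same thing (the filename and crop coordinates are just placeholders):

Code:
# Minimal sketch: mildly upscale a 720p screenshot and compare a high-contrast
# crop before and after. Filename and crop box are made up for illustration.
from PIL import Image

src = Image.open("screenshot_720p.png")            # a native 1280x720 render
scaled = src.resize((1366, 768), Image.BICUBIC)    # the mild upscale a 1366x768 panel would do

box = (600, 300, 680, 345)                         # some high-contrast region (left, top, right, bottom)
sx, sy = 1366 / 1280, 768 / 720
box2 = tuple(int(round(v * s)) for v, s in zip(box, (sx, sy, sx, sy)))

# Blow both crops up with nearest-neighbour so the pixel structure stays visible.
src.crop(box).resize((320, 180), Image.NEAREST).save("crop_native.png")
scaled.crop(box2).resize((320, 180), Image.NEAREST).save("crop_upscaled.png")

The upscaled crop should come out a touch softer along the stairsteps, which is the smoothing I mean.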
 
I've been working in CG professionally for more than 5 years, and in my experience, no simple upscaling can help aliasing. Even an AAed image will usually look bad when upscaled.
 
Well obviously upscaling isn't any way to make nicely AA'ed CG, but it can produce a bit of an AA effect. I'm just installing Fear now so my first priority is tearing into that, but I'll put together a pic to illustrate my point and post it here later tonight.
 
Look, AA requires more data than what's present in a simple bitmap image. If you don't have that data, then it's not antialiasing, because you can't create it out of thin air. So it's just some sort of post filtering/blurring, and that can only blur out the jaggies while losing the sharpness and detail of the original image as well. Such are the 'laws' of signal processing...
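To put a number on that, here's a tiny sketch (all the values are made up): one row of pixels crossed by a polygon edge, rendered with one sample per pixel, then with 8 subsamples per pixel, and then the one-sample version upscaled after the fact.

Code:
# Illustrative only: AA needs extra samples taken *before* quantising to pixels;
# upscaling the finished row can only smear the step that is already there.
import numpy as np

W = 16
cutoff = 5.37    # the polygon edge crosses this row at x = 5.37 (pixel units)

# Aliased render: one sample per pixel centre, so the boundary pixel is all-or-nothing.
centres = np.arange(W) + 0.5
aliased = (centres < cutoff).astype(float)

# Real AA: 8 subsamples per pixel, averaged - the boundary pixel ends up with a
# value proportional to how much of it the polygon actually covers (~0.37 here).
sub = (np.arange(W * 8) + 0.5) / 8
aa = (sub < cutoff).astype(float).reshape(W, 8).mean(axis=1)

# Upscale the finished aliased row (2x, linear interpolation): it only spreads
# the existing 0/1 step around; the 0.37 coverage value cannot be recovered.
upscaled = np.interp(np.linspace(0, W - 1, 2 * W), np.arange(W), aliased)

print(aliased[4:7])   # [1. 0. 0.]            hard step
print(aa[4:7])        # [1.    0.375 0.   ]   coverage recovered from the extra samples
print(upscaled[8:14]) # a soft ramp around the step, but no new information

The subsampled row knows the boundary pixel is ~37% covered; the upscaled row can only spread the hard step around, which is exactly the blur-versus-sharpness trade-off.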
 
Hold on a sec, when did I ever say that upscaling gives an AA look?! :???:
I was actually re-reading my own posts to see where I wrote such a stupid thing, but... I didn't...
 
london-boy said:
Most HDTVs in Europe don't have a native resolution of 1280x720; they're more like 1368x768, which means that we won't get 1:1 pixel mapping, so the image we see will be slightly scaled and therefore "smoother" because of that.

I thought you were saying that the tiny upscale from 1280 to 1368 would somehow reduce Aliasing. Is that not what you meant?

While it might be true 'technically' (not sure), in my experience your eyes will still see the same level of jagginess.

For example, on my HDTV, the 480p image from GTA: SA has the same amount of jaggies whether I upscale it slightly to 540p or all the way to 1080i; the upscaling doesn't reduce any jaggies. If anything it's worse at 1080i (although the difference is negligible).

And I know my TV has decent upscaling software, because it works absolute wonders with SDTV broadcasts.
 
scooby_dooby said:
I thought you were saying that the tiny upscale from 1280 to 1368 would somehow reduce Aliasing. Is that not what you meant?

While it might be true 'technically' (not sure), in my experience your eyes will still see the same level of jagginess.

For example, on my HDTV, the 480p image from GTA: SA has the same amount of jaggies whether I upscale it slightly to 540p or all the way to 1080i; the upscaling doesn't reduce any jaggies. If anything it's worse at 1080i (although the difference is negligible).

And I know my TV has decent upscaling software, because it works absolute wonders with SDTV broadcasts.

Oooh, that! No, I meant something else, but I'm literally already in bed (uhm, yes, I post here even when I'm in bed, errrr...) and I'll continue this tomorrow.
 
I find it amusing how some will judge a console's hardware capabilities based on a few launch titles. I guess the PS2's hardware prowess was showcased by games like Ridge Racer 5, Kessen, and The Bouncer.
 
Laa-Yosh said:
Look, AA requires more data than what's present in a simple bitmap image. If you don't have that data, then it's not antialiasing, because you can't create it out of thin air. So it's just some sort of post filtering/blurring, and that can only blur out the jaggies while losing the sharpness and detail of the original image as well. Such are the 'laws' of signal processing...
I wasn't meaning to come off as breaking any laws of signal processing in what I said; my point is simply that a little up-sampling winds up blurring out the aliasing a bit. So, for instance, aliasing on a 1280x720 render viewed on a native 1280x720 display will be a bit more pronounced than it would be on a 1366x768 display. Here is a gif to illustrate the effect:

upsampleaaish.gif


Obviously it isn't nearly as impressive an effect as supersampling or even multisampling, and up-sampling has its negative points as well. I probably would never have mentioned the AA effect myself, but LB brought it up (or at least I got the impression he did), and I figured I'd back the point, as it does do a bit to blend out the jaggies.
 
Rockster said:
I find it amusing how some will judge a console's hardware capabilities based on a few launch titles. I guess the PS2's hardware prowess was showcased by games like Ridge Racer 5, Kessen, and The Bouncer.


It must be the attention span or just a lack of foresight or something... *shrug* I guess ADD is something gamers are prone to.


;)
 
kyleb said:
I wasn't meaning to come off as breaking any laws of signal processing in what I said; my point is simply that a little up-sampling winds up blurring out the aliasing a bit. So, for instance, aliasing on a 1280x720 render viewed on a native 1280x720 display will be a bit more pronounced than it would be on a 1366x768 display. Here is a gif to illustrate the effect:

upsampleaaish.gif


Obviously it isn't nearly as impressive an effect as supersampling or even multisampling, and up-sampling has its negative points as well. I probably would never have mentioned the AA effect myself, but LB brought it up (or at least I got the impression he did), and I figured I'd back the point, as it does do a bit to blend out the jaggies.

Nice example! :) Essentially, it's a method to decorrelate high-contrast transitions from the grid structure of the pixels. That worst-case alignment [the one that most emphasizes aliasing artifacts] is highly probable with CG, because graphics are inherently drawn in perfect alignment to the pixel grid. So the decorrelation attempts to make it so things never line up exactly, and high-contrast details can "blend" across 2 or more pixels rather than one (there's a quick numeric sketch of this at the end of this post).

Isn't this much the same effect as when you set your PC to output a non-native resolution to an LCD monitor (aside from the obvious scaling function)? That's where the scaling/blending modes kick in, to map a pixel image that no longer matches the final display's pixel grid by an easy multiple. So ultimately you get some "AA-ish" style blending, but the image quality does seem to take a step back.

HDTV LCDs are not much different if they are in "zoom" mode rather than "direct". That's when you have all those blending functions adding their own video signature.
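Here's that quick numeric sketch of the decorrelation (the values are purely illustrative): take a hard 0/1 edge in a 1280-sample row and linearly resample it to 1366 samples, the way a simple panel scaler might. How much the edge blends depends entirely on where it happens to fall relative to the new grid.

Code:
# Illustrative only: resample a hard edge from a 1280-wide row onto a 1366-wide
# row and look at what happens around the transition.
import numpy as np

src_w, dst_w = 1280, 1366
x_dst = np.linspace(0, src_w - 1, dst_w)     # new sample positions drift against the old grid

def resampled_edge(edge_x):
    # One row with a hard 1 -> 0 transition at edge_x, linearly resampled to
    # dst_w samples; return just the values around the transition.
    row = (np.arange(src_w) < edge_x).astype(float)
    out = np.interp(x_dst, np.arange(src_w), row)
    step = np.flatnonzero(np.diff(out))
    return out[step[0]:step[-1] + 2]

print(resampled_edge(500))   # roughly [1.  0.58  0.]       - this edge picks up a nice in-between value
print(resampled_edge(640))   # roughly [1.  0.97  0.03  0.] - this one stays almost as hard as before

Which is also why it looks inconsistent next to real AA: some edges blend nicely, others barely change.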
 
Rockster said:
I find it amusing how some will judge a console's hardware capabilities based on a few launch titles. I guess the PS2's hardware prowess was showcased by games like Ridge Racer 5, Kessen, and The Bouncer.
The Xbox 360's prowess should appear at a much earlier stage than the PS2's or PS3's (if not earlier than the original Xbox's), if you believe what Carmack and others have had to say about the Xbox 360 development environment, as developers seem able to invest more time in optimization.
 
kyleb said:
I wasn't meaning to come off as breaking any laws of signal processing in what I said; my point is simply that a little up-sampling winds up blurring out the aliasing a bit. So, for instance, aliasing on a 1280x720 render viewed on a native 1280x720 display will be a bit more pronounced than it would be on a 1366x768 display. Here is a gif to illustrate the effect:
That's not any degree of AA and nothing I'd be happy with. AA should blend between steps whereas that scaling shows a uniform extra pixel layer being added onto some steps, actually making them larger than one pixel deep :oops:
 