What rendering tricks might RSX employ?

Why can't there be a separate antialiasing chip that would only do the filtering to remove the aliasing at the final stage? It could even be on the display device, something like Philips' PixelPlus tech.
Would such a chip be just impossible on a console, a chip that analyses the image just before it is displayed and filters the jaggies away?
Wasn't it said in some interview that with PS3 you could "upconvert" existing video signals to something close to HDTV? Would that also be possible with AA?
Doesn't Sony have some tech in their high-end Qualia TVs that enables you to zoom into the picture and artificially adds detail as you zoom in? Would it be possible to customise that tech for antialiasing too?

Why does the antialiasing have to be part of the rendering pipeline? (Or is it? I'm no expert :) )
 
rabidrabbit said:
Why can't there be a separate antialiasing chip that would only do the filtering to remove the aliasing at the final stage? It could even be on the display device, something like Philips' PixelPlus tech.
Would such a chip be just impossible on a console, a chip that analyses the image just before it is displayed and filters the jaggies away?
Wasn't it said in some interview that with PS3 you could "upconvert" existing video signals to something close to HDTV? Would that also be possible with AA?
Doesn't Sony have some tech in their high-end Qualia TVs that enables you to zoom into the picture and artificially adds detail as you zoom in? Would it be possible to customise that tech for antialiasing too?

A separate chip would add to the cost.

Besides, all PixelPlus (and the WegaEngine and all those fancy names you get on TVs) does is upconvert the image at whatever resolution, do some mojo, and downconvert it again. IF (big IF) PS3 renders internally at 1080p all the time, then downconverts it to whatever resolution, it's almost the same effect. Almost.
 
There aren't proper filters that can create true AA. AA is a case of averaging the information from two different surfaces that occupy the same pixel, instead of drawing only one surface. The best you could manage is a blurring, creating an average of adjacent pixels, but then you're losing the clarity of HD!

There is no substitute for multisampling of some form or other: rendering all the surfaces present in a pixel and averaging the output according to the amount of each surface present.

What could be possible is edge-only AA. I'm thinking you consider changes in depth, normal, and so forth to determine where jaggies appear and only supersample those pixels. That would be but a tiny fraction of the whole scene and wouldn't need stupid amounts of bandwidth. That, coupled with aniso filtering, would give smooth edges and smooth textures.

I believe edge-AA technology exists and has been tried in the past. I don't know what its strengths and weaknesses are or why it hasn't become mainstream.
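The averaging idea above can be sketched in a few lines of Python. This is a toy illustration of supersampled AA, not how any GPU actually implements it: render a hard edge at twice the target resolution, then box-filter each 2x2 block down to one pixel.

```python
# Toy supersampled AA: render at 2x the target resolution, then
# average each 2x2 block down to one pixel. Pixels straddling the
# edge end up with intermediate values instead of a hard 0/1 step;
# that averaging is all "true" AA really is.

def render_hires(w, h):
    # A diagonal edge: a subpixel is white (1.0) below the line y = x.
    return [[1.0 if y > x else 0.0 for x in range(w)] for y in range(h)]

def downsample_2x2(img):
    # Box filter: each output pixel is the mean of a 2x2 block.
    h, w = len(img), len(img[0])
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
             for x in range(w // 2)] for y in range(h // 2)]

hires = render_hires(8, 8)      # 8x8 "internal" render
final = downsample_2x2(hires)   # 4x4 displayed image

# Pixels along the diagonal now carry fractional coverage (0.25)
# rather than jumping straight from 0.0 to 1.0.
print(final)
```

The same principle, at much larger scale, is why a 1080p render shown on a lower-resolution screen looks smoothed: the downconversion is a big box filter.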
 
Shifty Geezer said:
There aren't proper filters that can create true AA. AA is a case of averaging the information from two different surfaces that occupy the same pixel, instead of drawing only one surface. The best you could manage is a blurring, creating an average of adjacent pixels, but then you're losing the clarity of HD!

There is no substitute for multisampling of some form or other: rendering all the surfaces present in a pixel and averaging the output according to the amount of each surface present.

What could be possible is edge-only AA. I'm thinking you consider changes in depth, normal, and so forth to determine where jaggies appear and only supersample those pixels. That would be but a tiny fraction of the whole scene and wouldn't need stupid amounts of bandwidth. That, coupled with aniso filtering, would give smooth edges and smooth textures.

I believe edge-AA technology exists and has been tried in the past. I don't know what its strengths and weaknesses are or why it hasn't become mainstream.

It has been done before, yes, and it has its advantages (performance, obviously) but a lot of troublesome disadvantages too (edge detection can be tricky, resulting in jaggy lines as if AA weren't there at all)...
So the IHVs have been doing it the rough way, so to speak: just AA the whole screen and try to get that as fast as they can.
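A rough sketch of the edge-detection first pass being discussed. Everything here is hypothetical and purely for illustration (the threshold constant especially): flag pixels whose depth differs sharply from a neighbour, so only those get extra samples. The threshold is exactly the tricky part mentioned above: too high and real edges slip through, too low and you end up resampling half the screen.

```python
# Hypothetical edge-only AA detection pass: compare each pixel's depth
# with its right and down neighbours, and flag both pixels of any pair
# whose depth difference exceeds a threshold. Flagged pixels would then
# be supersampled; everything else is left alone.

DEPTH_EDGE_THRESHOLD = 0.1   # assumed tuning constant, not from any real API

def find_edge_pixels(depth, threshold=DEPTH_EDGE_THRESHOLD):
    h, w = len(depth), len(depth[0])
    edges = set()
    for y in range(h):
        for x in range(w):
            for dy, dx in ((0, 1), (1, 0)):   # right and down neighbours
                ny, nx = y + dy, x + dx
                if ny < h and nx < w and abs(depth[y][x] - depth[ny][nx]) > threshold:
                    edges.add((y, x))
                    edges.add((ny, nx))
    return edges

# A flat background (depth 1.0) with a nearer quad (depth 0.5) in the middle:
depth = [[0.5 if 1 <= x <= 2 and 1 <= y <= 2 else 1.0 for x in range(4)]
         for y in range(4)]
edges = find_edge_pixels(depth)
# Only pixels on the quad's silhouette get flagged; background pixels
# far from the edge, and the quad's interior, are untouched.
```

A real implementation would also compare normals (as Shifty suggests), since two surfaces can meet at nearly the same depth and still form a visible edge.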
 
rabidrabbit said:
Why can't there be a separate antialiasing chip that would only do the filtering to remove the aliasing at the final stage? It could even be on the display device, something like Philips' PixelPlus tech.
Would such a chip be just impossible on a console, a chip that analyses the image just before it is displayed and filters the jaggies away?
Wasn't it said in some interview that with PS3 you could "upconvert" existing video signals to something close to HDTV? Would that also be possible with AA?
Doesn't Sony have some tech in their high-end Qualia TVs that enables you to zoom into the picture and artificially adds detail as you zoom in? Would it be possible to customise that tech for antialiasing too?

Why does the antialiasing have to be part of the rendering pipeline? (Or is it? I'm no expert :) )

I think Kutaragi implied that Cell would be used to supersample an SD DVD input stream, upconverting it and encoding it as quasi-HD onto the hard drive for playback later. He said the same about photos, for instance: using bicubic or another sampling method to enlarge a photo while attempting to avoid the artifacts that normally occur when you blow an image up. The former was said to take place while you weren't actually using the machine.
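The interpolation idea behind that kind of upconversion can be sketched like this. For clarity it uses linear interpolation on a single 1-D scanline; a real upscaler would use 2-D bicubic or fancier resampling, and nothing here is Cell-specific.

```python
# Sketch of interpolation-based upscaling: instead of just repeating
# source pixels (which preserves blockiness), new samples are blended
# from their nearest source neighbours.

def upscale_1d(samples, factor):
    # Linearly interpolate between neighbouring source samples.
    out = []
    n = len(samples)
    for i in range((n - 1) * factor + 1):
        pos = i / factor            # position in source coordinates
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        t = pos - lo                # blend weight between the two neighbours
        out.append(samples[lo] * (1 - t) + samples[hi] * t)
    return out

scanline = [0.0, 1.0, 0.0]          # a bright pixel between dark ones
print(upscale_1d(scanline, 2))      # -> [0.0, 0.5, 1.0, 0.5, 0.0]
```

Bicubic works the same way but blends four neighbours with a cubic weighting, which keeps edges less mushy than the linear blend shown here.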

Edited for new content.

Surprising how obvious the polys are on the creature. I'm sure we all remember her from E3.

 
Shifty Geezer said:
Yes, I'd take downsampled hires to lowres screens too. But ideally I'd like to see both techs in full effect :D

What does the PS2 render at internally? Something like 640x240, which is part of the reason for its jaggies?

If the new consoles could render at 1080p, that's a huge jump in pixels. But big if.
 
wco81 said:
Shifty Geezer said:
Yes, I'd take downsampled hires to lowres screens too. But ideally I'd like to see both techs in full effect :D

What does the PS2 render at internally? Something like 640x240, which is part of the reason for its jaggies?

If the new consoles could render at 1080p, that's a huge jump in pixels. But big if.

Early PS2 games used that internal resolution. Games released in the last four years or so are all at a full 640x480, which allows progressive scan output.
 
Early PS2 games used that internal resolution. Games released in the last four years or so are all at a full 640x480, which allows progressive scan output.
No, only a few games have a full front buffer. A full front buffer is really a waste if 99% of the consoles are hooked up to SDTVs; you lose either 600 KB or 300 KB depending on the bit depth of the buffer.
Some games get progressive scan by having a 16-bit full-height FB or a 32-bit interlaced buffer for SDTV.
 
Higher resolution can't hurt.

All I have is a Pentium II 350 MHz PC with a GeForce2 MX card, and Madden looks better than the console version just using 1024x768: no AA, no fancy filtering.

Edges are smoother and sharper, the detail itself giving a "3D" kind of look.
 
It makes sense. Aliasing is simply too few samples. Higher resolution = more samples. No AA at 1080p certainly looks a lot better than no AA at 640x480 (nearly 7 times the samples) ;)

SD owners will indeed be in an interesting position as far as IQ is concerned, regardless of AA used, really.
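The "nearly 7 times" figure above checks out with a quick bit of arithmetic:

```python
# Pixel (sample) counts behind the claim: 1080p vs 640x480.
hd = 1920 * 1080    # 2,073,600 pixels per frame
sd = 640 * 480      #   307,200 pixels per frame

ratio = hd / sd
print(ratio)        # -> 6.75, i.e. "nearly 7 times the samples"
```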
 
It makes sense. Aliasing is simply too few samples. Higher resolution = more samples. No AA at 1080p certainly looks a lot better than no AA at 640x480 (nearly 7 times the samples)

SD owners will indeed be in an interesting position as far as IQ is concerned, regardless of AA used, really


Hold on, hold on, Titanio. I can't seem to understand what you just said. OK, so you are saying that if I have an HDTV that can display 720p and 1080i, even if the PS3 doesn't have AA it would look better on my TV than on some regular 25-inch CRT?

If so then great and cheers to next-gen. :D
 
mckmas8808 said:
Hold on, hold on, Titanio. I can't seem to understand what you just said. OK, so you are saying that if I have an HDTV that can display 720p and 1080i, even if the PS3 doesn't have AA it would look better on my TV than on some regular 25-inch CRT?

If so then great and cheers to next-gen. :D

No, what I'm saying is that if you had an SDTV, you'd essentially be getting AA for free: the image is being scaled down from a high resolution (720p or, better yet, 1080p) and that's effectively anti-aliasing. Unless I'm misunderstanding something along the way...

On your HDTV, a 720p or 1080p image with no AA would, I think, look a lot better from an aliasing perspective than a 640x480 image with no AA; that would hold no matter what TV it was on.

A more interesting situation, perhaps, for HDTV owners is with 1080p games on a 720p display: you're still in hidef, but the picture would also be scaled down, introducing some AA too (there's 2.25x the resolution between 720p and 1080p).

edit - I suppose TV size plays a part too, at least when you're talking about hidef: more resolution squeezed onto a smaller screen = higher pixel density = better-looking picture(?). Of course, there is the tradeoff of the smaller screen...
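Both numbers in that post can be checked quickly. The 2.25x ratio is straight pixel counting, and the pixel-density point falls out of resolution plus screen diagonal (the 32" and 50" sizes below are just hypothetical examples, and a 16:9 panel is assumed):

```python
import math

# 1080p carries 2.25x the pixels of 720p.
res_1080p = 1920 * 1080
res_720p = 1280 * 720
print(res_1080p / res_720p)    # -> 2.25

def pixels_per_inch(width_px, height_px, diagonal_inches):
    # Pixel density from resolution and screen diagonal.
    diag_px = math.hypot(width_px, height_px)
    return diag_px / diagonal_inches

# The same 1080p picture on two (hypothetical) screen sizes:
# the smaller screen packs noticeably more pixels per inch.
print(pixels_per_inch(1920, 1080, 32))
print(pixels_per_inch(1920, 1080, 50))
```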
 
rabidrabbit said:
Doesn't Sony have some tech in their high-end Qualia TVs that enables you to zoom into the picture and artificially adds detail as you zoom in? Would it be possible to customise that tech for antialiasing too?


Yes, the chip made for their WEGA line of TVs. I brought that up in another post, but I guess London-boy clarified what the issue would be with those chips when it comes to a game image instead of real video on a TV.
It would have been really nice if the tech were good for in-game use, though.
 
Why all this talk of AA? Maybe it's just me, but have ye all watched the high-res PS3 vids and still come away thinking AA was needed? As I've said, the high-res PS3 vids of stuff like Heavenly Sword have IQ that, to me, in motion seemed akin to that of high-quality CGI. It's only in stills that you can discern small jaggies. Deano said there was no AA, yet it seemed like it had a significant amount of it; what was used, then, to virtually eliminate almost all aliasing? Is 1080p resolution alone enough to achieve such a result?
 
zidane1strife said:
Why all this talk of AA? Maybe it's just me, but have ye all watched the high-res PS3 vids and still come away thinking AA was needed? As I've said, the high-res PS3 vids of stuff like Heavenly Sword have IQ that, to me, in motion seemed akin to that of high-quality CGI. It's only in stills that you can discern small jaggies. Deano said there was no AA, yet it seemed like it had a significant amount of it; what was used, then, to virtually eliminate almost all aliasing? Is 1080p resolution alone enough to achieve such a result?

The problem is... the PS3 is going to be sold to people that just have regular plain old TVs, no HDTV functions whatsoever. Sony can't leave them out in the rain! I have two HDTVs in my house, and I think the price to pay to play games like Halo and GT4 is worth the visuals you get from the games.

Other people don't have the same mindset. If Sony releases the PS3 without the same level of AA as its competitors and just relies on higher resolutions to solve the problem, then they're in for a world of disappointment. They can't force HDTV on people; you have to go with the flow.
 
BlueTsunami said:
The problem is... the PS3 is going to be sold to people that just have regular plain old TVs, no HDTV functions whatsoever. Sony can't leave them out in the rain! I have two HDTVs in my house, and I think the price to pay to play games like Halo and GT4 is worth the visuals you get from the games.

Other people don't have the same mindset. If Sony releases the PS3 without the same level of AA as its competitors and just relies on higher resolutions to solve the problem, then they're in for a world of disappointment. They can't force HDTV on people; you have to go with the flow.

The thing is, people without HDTV will actually benefit more from an aliasing perspective than even those with HDTV (I think?). The hi-res images being output by PS3 at 720p or 1080p will have to be rescaled for an SDTV, and what happens when you scale down a hi-res image? Anti-aliasing. That's the principle of supersampled AA.

So people without HDTV will actually benefit more from the higher res in terms of AA than those with HDTV... but of course, they lose the resolution vs HDTV.
 
The problem is... the PS3 is going to be sold to people that just have regular plain old TVs, no HDTV functions whatsoever. Sony can't leave them out in the rain! I have two HDTVs in my house, and I think the price to pay to play games like Halo and GT4 is worth the visuals you get from the games.

Other people don't have the same mindset. If Sony releases the PS3 without the same level of AA as its competitors and just relies on higher resolutions to solve the problem, then they're in for a world of disappointment. They can't force HDTV on people; you have to go with the flow.

HDTV won't be forced. SDTV users, like me, will be able to use the 360 and PS3 just as well as we can the current-gen systems. The argument is that if an image were created at 1080p and downsampled to 480p/480i, it would be 'free AA' for SDTV users.
 
zidane1strife said:
Why all this talk of AA? Maybe it's just me, but have ye all watched high-rez ps3 vids and still came away thinking AA was needed?
You watched what, though? A 720p, lossily compressed version of the videos? It's not exactly a fair conclusion. And also, jaggies aren't the only thing AA helps eliminate.

Really, if AA were such a non-issue, why do you think Pixar spends so much effort on massively demanding AA algorithms?
 
Inane_Dork said:
Really, if AA were such a non-issue, why do you think Pixar spends so much effort on massively demanding AA algorithms?

I agree that AA is still needed, but at higher resolutions aliasing is less of a problem. Higher resolution is in fact the most direct solution to aliasing (though of course, a hell of a lot more resolution would be needed before it became a non-issue, much more than we have now!).

For those with SDTVs, I'm not sure if most will notice whether AA was used at all. For those with HDTVs, resolution will get us "so far", but whether "so far" is enough is debatable. I'd certainly take AA, but I'm not sure if as much is needed at higher resolutions as at lower ones. You might still notice the difference between higher and lower levels of AA, but the difference might be a lot smaller than it is between the same levels of AA at lower resolutions. A lot of people may not even notice (certainly the word "jaggie" wasn't really on people's lips after seeing Heavenly Sword at E3, despite it having no AA). A combination of a higher resolution and other post-processing besides AA (DOF, for example) would appear to go a long way. That said, I'd like to see AA in the final game, just to be sure! ;)
 