Antialiasing VS HD

The use of an active matrix backplane does not mean the hold time is equivalent to the duration of the refresh cycle. For LCDs with a strobing backlight, the hold time is set by the duty cycle of the backlight. OLEDs of course don't have a backlight, but you can still use black frame insertion.
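To put some rough numbers on that (just a quick Python sketch, with illustrative figures rather than anything measured from a real panel): the hold time is the fraction of the refresh period the image is actually lit, whether that's set by a strobing backlight's duty cycle or by BFI.

```python
# Rough sketch with made-up but plausible numbers: effective hold time for
# plain sample-and-hold vs. a strobed backlight / black frame insertion.

def hold_time_ms(refresh_hz, lit_fraction=1.0):
    """Hold time = frame period x fraction of the frame the image is lit.

    lit_fraction = 1.0  -> plain sample-and-hold (lit for the whole frame)
    lit_fraction = 0.25 -> e.g. a backlight strobed at 25% duty cycle, or
                           BFI showing the frame for a quarter of the period
    """
    frame_period_ms = 1000.0 / refresh_hz
    return frame_period_ms * lit_fraction

print(hold_time_ms(60))        # ~16.7 ms at 60 Hz, no strobing/BFI
print(hold_time_ms(60, 0.25))  # ~4.2 ms with a 25% strobe or equivalent BFI
```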

Passive matrix OLEDs do exist, but I don't think they're what you think they are.
 
The question posed was ridiculously broad and made no mention of theoretical limits.
Well, 'thank you'...

Now if the question is to be presented in a fashion that's technically answerable, with some hardware reference 'given this system and this context', then we can go into the pros and cons of different IQ techniques and performance impacts and the best overall compromises. But that's not what the OP was asking.
Since we are here in the console forum, my question was addressed mainly to PS3/Xbox 360 users. You know, HD consoles, etc... :rolleyes:

By the way, the irony in this reply wasn't meant to be arrogant, just friendly. ;)
 
By the way, the irony in this reply wasn't meant to be arrogant, just friendly. ;)
Well your example of SDTV looking better than aliased HD games isn't a very useful reference point :p.

If you're saying 'do you want 1080p no AA, 720p 2xAA, or SDTV 8xAA?', that's still pretty different to comparing a TV image. I played Warhawk downscaled to SDTV. WH has 4xAA and is a very clean game, and on SDTV that means loads of AA (I think it was downscaled; regardless, there weren't jaggies), but it was terrible. Couldn't make out a thing! Whereas something like Heavy Rain at SD, if it were rendered at 640x480 with almost realistic quality and AA, would look better than the current HD-but-computer-game look. However the difference isn't that great - we haven't got the choice of photorealism at SD res. There are also other issues like framerate and stability. Sacred 2 looks pretty clean in stills rendered at 1080p, hardly a jaggie to be seen, but the framerate is all over the shop and there's crawling tearing. I'd rather that game were 720p and stable.

I can only answer this question on a game-by-game basis.
 
I guess I'll take whatever is easier for the devs to pull off. My layman's understanding is that it's harder to render at higher resolutions (and keep the fps up) than it is to render with some AA.
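For what it's worth, here's a crude way to see why (totally back-of-the-envelope, my own numbers, not any dev's benchmark): with MSAA only the coverage/depth samples multiply while shading stays per pixel, so 720p plus some AA tends to move less work than a straight jump to 1080p.

```python
# Back-of-the-envelope sketch (illustrative only): shaded pixels vs. raw
# coverage samples for the options being discussed. With MSAA, shading cost
# scales with pixels, not samples, which is roughly why 720p + AA is usually
# cheaper than 1080p with no AA.

modes = {
    "1080p, no AA":  (1920 * 1080, 1),
    "720p, 2x MSAA": (1280 * 720,  2),
    "720p, 4x MSAA": (1280 * 720,  4),
}

for name, (pixels, samples_per_pixel) in modes.items():
    print(f"{name:14s}  shaded pixels: {pixels / 1e6:.2f} M,"
          f"  coverage samples: {pixels * samples_per_pixel / 1e6:.2f} M")
```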

To force this thread just slightly off topic, I'll pose a similar image-quality-related question since we have a few devs in here. WHAT THE HELL IS UP WITH THE LACK OF AF IN GAMES? :)
 
The question posed was ridiculously broad and made no mention of theoretical limits. So given a choice between any amount of AA and any amount of resolution, I'd pick resolution. It is better, as it provides AA at a distance and clarity up close, where low resolution is a blur.

Now if the question is to be presented in a fashion that's technically answerable, with some hardware reference 'given this system and this context', then we can go into the pros and cons of different IQ techniques and performance impacts and the best overall compromises. But that's not what the OP was asking.

Well, if we're going to keep with the ridiculous, then obviously playing something with "infinite" x "infinite" resolution trumps everything.

However, back in the real world, there just is no case on any machine existing today where you can play a detailed 3D game at playable resolutions on a display that lets you actually resolve key details (is that my character, or just a black dot?) without aliasing getting in the way. In any real-world case, AA still trumps max resolution. Although I suppose a 2560x1600 resolution 5" (yes, five inch) display may mask most aliasing artifacts. :p

And to the person above who mentioned CRTs: yes, I have used high-end 17" CRTs at 1800x1440 and 20" CRTs at 2048x1536, and aliasing and aliasing artifacts were still a problem.

I even had the pleasure of using the IBM 24" LCD with 3840x2400 resolution. And guess what? Aliasing and aliasing artifacts were still an issue.

Added to that, now performance was an even worse issue (ignoring the horrible pixel response of the IBM display).

And yes, I realize that if I sat a football field away from my 46" TV that I wouldn't notice aliasing even at 720p, however I also wouldn't be able to discern much beyond a blob of light that changed in intensity every once in a while. :)
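If anyone wants to play with the geometry, here's a quick sketch (Python, with assumed screen sizes and viewing distances, not anyone's actual setup) of pixels per degree, which is what really decides whether you can see the jaggies: the 46" set at couch distance versus the hypothetical 5" 2560x1600 panel I mentioned above.

```python
# Quick sketch with assumed geometry (not anyone's actual setup): angular
# resolution in pixels per degree. The higher it is, the harder it is to
# see individual pixels and the stair-stepping on edges.
import math

def pixels_per_degree(diag_inches, horiz_pixels, distance_inches, aspect=16 / 9):
    # screen width from the diagonal and aspect ratio
    width_in = diag_inches * aspect / math.hypot(aspect, 1.0)
    pixel_pitch_in = width_in / horiz_pixels
    # visual angle subtended by a single pixel, in degrees
    deg_per_pixel = math.degrees(2 * math.atan(pixel_pitch_in / (2 * distance_inches)))
    return 1.0 / deg_per_pixel

print(pixels_per_degree(46, 1280, 96))          # 46" 720p TV from roughly 8 feet
print(pixels_per_degree(5, 2560, 16, 16 / 10))  # 5" 2560x1600 held ~16 inches away
```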

Anyways, as I said before, some people aren't bothered by it, don't notice it, or just aren't sensitive to it.

But it drives others of us absolutely batty.

Regards,
SB
 
The question posed was ridiculously broad and made no mention of theoretical limits.

Really? Then how do you interpret "technical struggle" in the OP? As SB said, in any real-world situation AA will always trump resolution in both IQ and performance.
 
Added to that, now performance was an even worse issue (ignoring the horrible pixel response of the IBM display).

You could have an LCD display with 10,000 x 10,000 resolution, but if the hold time is long enough it will blur on your retina during fast-moving scenes.

Pixel response isn't the main culprit. OLED has a pixel response time measured in microseconds while LCD response is measured in milliseconds. The problem for both is that they "hold" each frame for milliseconds, so each frame gets smeared across the eye.
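Quick illustration of that hold-type blur point (numbers are just for illustration): if your eye is tracking something moving across the screen, each frame gets smeared across the retina by roughly tracking speed times hold time, no matter how fast the pixel itself switches.

```python
# Sketch of hold-type motion blur (illustrative numbers only):
# smear on the retina ~= eye-tracking speed x how long each frame is held lit.

def smear_pixels(tracking_speed_px_per_s, hold_time_ms):
    return tracking_speed_px_per_s * hold_time_ms / 1000.0

pan = 1920  # assume a pan that crosses a 1080p-wide screen in one second
print(smear_pixels(pan, 16.7))  # ~32 px smear: 60 Hz sample-and-hold LCD or OLED
print(smear_pixels(pan, 4.0))   # ~8 px: strobed backlight / BFI at ~25% duty
print(smear_pixels(pan, 0.1))   # ~0.2 px: what you'd get if only pixel response mattered
```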
 