Image Quality and Framebuffer Speculations for WIP/alpha/beta/E3 games *Read the first post*

So screenshots (from Digital Foundry for example) would always look the same, regardless if 'Flicker reduction' or 'Normal' image quality mode was activated?

Will they always look as if 'Flicker reduction' image quality mode (same image quality mode as in Gran Turismo 5 Prologue as you said) was used, even if 'Normal' image quality mode was selected?

yes
 

Is that true? In the interlaced AA you describe, wouldn't it be a matter what frame you grab as to what AA is applied to that frame? And you'd never get the full effect of the interlace unless you blended frames together, no? Also, doesn't the "flicker" come from the pixel shift and thus you'd never see that in a single frame?
 

Whenever you pause it you will still get X number of samples for AA.

So, for example, suppose their temporal AA variant used 2 different patterns of 4 samples each.

In motion it would appear to be roughly 8x AA at 60 fps or over, with an increasing amount of flickering on edges the further below 60 fps you drop.

When paused, you'd get 4xAA of one pattern or the other, but not both.

BTW - this is a hypothetical case, and not saying that this game is using 4 sample temporal AA. :p
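To make the hypothetical concrete, here's a small sketch of the idea: two jitter patterns of 4 sample positions each, alternated per frame. The pattern coordinates are made up for illustration; the point is just that blending consecutive frames covers the union of both patterns, while a single paused frame only ever contains one of them.

```python
# Hypothetical temporal AA sketch (NOT any game's actual implementation):
# two 4-sample jitter patterns, alternated each frame. Sample positions
# are arbitrary sub-pixel offsets chosen for illustration.

PATTERN_A = [(0.125, 0.375), (0.375, 0.875), (0.625, 0.125), (0.875, 0.625)]
PATTERN_B = [(0.125, 0.875), (0.375, 0.375), (0.625, 0.625), (0.875, 0.125)]

def samples_for_frame(frame_index):
    """Return the jitter pattern used on a given frame (alternating A/B)."""
    return PATTERN_A if frame_index % 2 == 0 else PATTERN_B

def effective_samples(frame_indices):
    """Union of sample positions the eye integrates over the given frames."""
    seen = set()
    for i in frame_indices:
        seen.update(samples_for_frame(i))
    return sorted(seen)

# Paused (one frame): only one pattern's 4 samples are present.
assert len(effective_samples([0])) == 4
# In motion at full frame rate, two consecutive frames blend into ~8x.
assert len(effective_samples([0, 1])) == 8
```

This also shows why a screenshot grab can't capture the full effect: any single frame holds 4 samples of one pattern, never the combined 8.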

Regards,
SB
 
No performance cost; it's just a flickering-versus-aliasing choice.

For comparison you can use the game's pause mode (temporal AA stays enabled in pause mode) and then use the TV's pause/freeze function to disable the temporal AA.

Thanks. I can clearly see flickering on vertical edges when the game is paused. It's most noticeable in cockpit view. Therefore I will stick with the "flicker reduction" option. I didn't buy an LCD to see any kind of flicker.
 
Even if they have SPU resources to spare, can they change it this late in development?
I doubt any PS3 exclusive will be able to throw it in this late in development; I'm sure they would already be using the SPUs to capacity, or to the best of their ability, so there wouldn't be any SPU time left.
yes

and in 720p mode too
Interesting, so does that mean 720p mode has something akin to 6x or 8x aa effectively?
 
Question: why do games on the PS3 choose to use quincunx instead of plain 2xMSAA, when in some cases the 360 counterpart uses 2xMSAA without the blurring? Is there a performance advantage to quincunx? It would have been interesting to see how FFXIII would look with quincunx (for laughs).

btw isn't this game out already? :-s
 
We could speculate all day about Quincunx vs 2xAA, but performance-wise it's effectively identical. The sticky post has more information. Perhaps they prefer the higher perceived edge blurring as opposed to texture sharpness. Who knows.

We've had the debate before, and preference discussion isn't part of this thread. *shrug*
 
We could speculate all day about Quincunx vs 2xAA, but performance-wise it's effectively identical.
Both use 2 samples, but quincunx uses 5 taps (versus 2 taps for 2xMSAA), so in theory quincunx should be a bit slower.
I can code up a test case if desired
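As a rough illustration of the tap-count difference: the commonly described quincunx resolve weights the pixel's own centre sample by 1/2 and the four corner samples it shares with its neighbours by 1/8 each, i.e. 5 taps per pixel, while a plain 2xMSAA resolve just averages the pixel's own 2 samples. The sketch below uses plain lists of floats and assumed weights; it's an illustration of the filter shape, not of how NV hardware actually implements it.

```python
# Sketch of a quincunx-style resolve: 5 taps per pixel with assumed
# weights 1/2 (centre) + 4 * 1/8 (shared corners). Illustrative only.

def quincunx_resolve(center, corner):
    """center[y][x] holds each pixel's centre sample; corner[y][x] holds
    the sample at the top-left corner of pixel (x, y), so `corner` has one
    extra row and column. Corner samples are shared between 4 pixels."""
    h, w = len(center), len(center[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # the four corner taps this pixel shares with its neighbours
            corners = (corner[y][x] + corner[y][x + 1] +
                       corner[y + 1][x] + corner[y + 1][x + 1])
            row.append(0.5 * center[y][x] + 0.125 * corners)
        out.append(row)
    return out

def msaa2x_resolve(sample0, sample1):
    """Plain 2xMSAA resolve: average each pixel's own two samples (2 taps)."""
    return [[0.5 * (a + b) for a, b in zip(r0, r1)]
            for r0, r1 in zip(sample0, sample1)]
```

Since the quincunx weights sum to 1, a flat region stays unchanged; the extra cost the post above speculates about would come purely from reading 3 additional (shared) taps per pixel.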
 
Well, how much is "a bit slower"? Also, would you be calculating the 5 taps in software, versus it being done in hardware on NV hardware? As I understood it, NV implements the filter in the RAMDAC, at the tail end of the output.
 
Scaling to 1080p isn't going to enhance the image and will likely reduce the quality due to blurring it. You want to stay as close to the native res of the game as possible.
 
It's a trade-off between a blurry mess and a blocky one, basically. You've got in effect the same thing as 2D textures. A 2048x2048 texture will look far nicer filling a 1920x1080 screen than a 512x512 texture. If you have a lower resolution texture, you can either render it point-sampled and maintain sharpness but render little squares, or interpolate it and introduce blurring but break up the low-resolution look.

In this case, we're talking about a 1280x720 'texture' (frontbuffer) stretched across a certain sized screen. Let's pick 32" as there are 1080p and 720p sets of that size. You either have larger pixels at 720p, or a 1280x720 pixel image rendered at 1080p with or without some 'texture filtering' in effect. Which looks better really depends on how far you are from the set. If you're really close (or have a huge screen), the visible pixels may be annoying like pixelated textures are. But you may also prefer the sharpness over the blurring of an upscaled image.

Still, considering the popularity of texture filtering, I think it goes to show that people prefer some upscaling method over pixel resizing. Which means if you're going to stretch the image over a huge screen, you'd probably want some upscaling method rather than just rendering larger pixels.
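The trade-off above can be shown with a toy 1-D example: upscaling a low-res row of pixel values by nearest-neighbour (sharp but blocky) versus linear interpolation (smooth but blurred). This is a deliberately minimal sketch, not how any particular TV or scaler actually resamples.

```python
# Toy 1-D upscaling comparison: nearest-neighbour vs linear interpolation.

def upscale_nearest(row, factor):
    """Point-sampled upscale: each output pixel copies its nearest source
    pixel, so hard edges stay hard (the 'blocky' option)."""
    return [row[int(i / factor)] for i in range(int(len(row) * factor))]

def upscale_linear(row, factor):
    """Linearly interpolated upscale: output pixels blend their two
    neighbouring source pixels, softening edges (the 'blurry' option)."""
    out = []
    for i in range(int(len(row) * factor)):
        pos = i / factor
        lo = int(pos)
        hi = min(lo + 1, len(row) - 1)
        t = pos - lo
        out.append(row[lo] * (1 - t) + row[hi] * t)
    return out

edge = [0.0, 0.0, 1.0, 1.0]  # a hard edge in the low-res source
print(upscale_nearest(edge, 1.5))  # only 0.0s and 1.0s: edge stays hard
print(upscale_linear(edge, 1.5))   # intermediate values smear the edge
```

Nearest-neighbour preserves the exact source values (big sharp pixels), while the linear version introduces in-between values at the edge, which is exactly the blur people notice when a 720p frontbuffer is scaled to 1080p.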
 