Image Quality and Framebuffer Speculations for WIP/alpha/beta/E3 games *Read the first post*

This demo uses 1280x1080 like Prologue, but with 2x MSAA combined with 2x temporal SSAA (to simulate 4xAA at 1080p).
They shift the pixel rasterization between odd and even frames to simulate super-sampling, like DMC or MGS, but they don't blend the frames like DMC and MGS do (no blurry effect here). They just rely on display persistence and retinal persistence. This causes a flickering effect (variable depending on the type of display) but a more accurate image (the temporal AA works on alpha-coverage aliasing and other aliasing where MSAA doesn't help).

You can disable this temporal AA by choosing "flicker reduction" in the display options (that's the GT5 Prologue display mode).

I don't know what effect is enabled in "sharpen" mode; it looks the same as the normal mode.
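For anyone curious how that could work in practice, here is a minimal sketch of the odd/even jitter idea. The half-pixel offset and all the names are my own assumptions for illustration, not PD's actual code:

```python
# Minimal sketch of alternating-frame jitter; offsets and names are
# assumptions for illustration, not Polyphony's implementation.

FRAME_WIDTH, FRAME_HEIGHT = 1280, 1080  # internal render target, per the post

def jitter_offset(frame_index):
    """Sub-pixel offset that alternates every frame.

    Even frames rasterize at the pixel centre, odd frames half a pixel
    away, so over two consecutive frames the scene is sampled at twice
    the density. The frames are never blended; display and retinal
    persistence do the averaging, at the cost of some flicker."""
    return (0.0, 0.0) if frame_index % 2 == 0 else (0.5, 0.5)

def jitter_in_ndc(frame_index):
    """The same offset expressed in normalized device coordinates,
    which is how it would typically be folded into the projection."""
    dx, dy = jitter_offset(frame_index)
    return (2.0 * dx / FRAME_WIDTH, 2.0 * dy / FRAME_HEIGHT)
```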
 
If this is covered elsewhere, my apologies; I could not find it. Take a look at these two GT5 Academy shots:

720p
http://images.eurogamer.net/assets/articles//a/8/7/9/9/6/7/720p_000.jpg.jpg

1080p
http://images.eurogamer.net/assets/articles//a/8/7/9/9/6/7/1080p_000.jpg.jpg


Now my knowledge of graphics programming is almost non-existent, but I thought that 1080p would be sharper because it is firstly a higher resolution and secondly there is less anti-aliasing. But looking at those images the 720p image is clearly the sharper one; the dash in the 1080p shot looks blurry.

Are they using some cheap and rubbish AA technique in 1080p that is blurring everything? I bought a 1080p monitor and was looking forward to playing GT5 on it, but looking at those shots it seems like 720p will look better.
 
Really, you don't see how blurred the gauges are in the dash? The speedo numbers, tacho writing...? They look very different in the two shots to me.
 
If this is covered elsewhere, my apologies; I could not find it. Take a look at these two GT5 Academy shots:

720p
http://images.eurogamer.net/assets/articles//a/8/7/9/9/6/7/720p_000.jpg.jpg

1080p
http://images.eurogamer.net/assets/articles//a/8/7/9/9/6/7/1080p_000.jpg.jpg


Now my knowledge of graphics programming is almost non-existent, but I thought that 1080p would be sharper because it is firstly a higher resolution and secondly there is less anti-aliasing. But looking at those images the 720p image is clearly the sharper one; the dash in the 1080p shot looks blurry.

Are they using some cheap and rubbish AA technique in 1080p that is blurring everything? I bought a 1080p monitor and was looking forward to playing GT5 on it, but looking at those shots it seems like 720p will look better.

It is blurry because it is upscaled. In 1080p mode GT5 renders at 1280x1080, which is upscaled to 1920x1080.
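If that's right, only the horizontal axis is being stretched. A quick back-of-the-envelope check (just the ratio, not the actual hardware scaler):

```python
# Ratio check for the 1080p mode described above.
internal = (1280, 1080)   # render resolution
output   = (1920, 1080)   # display resolution

scale_x = output[0] / internal[0]   # 1.5 -> horizontal upscale, hence the blur
scale_y = output[1] / internal[1]   # 1.0 -> vertical axis is already native
print(scale_x, scale_y)
```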
 
This demo uses 1280x1080 like Prologue, but with 2x MSAA combined with 2x temporal SSAA (to simulate 4xAA at 1080p).
They shift the pixel rasterization between odd and even frames to simulate super-sampling, like DMC or MGS, but they don't blend the frames like DMC and MGS do (no blurry effect here). They just rely on display persistence and retinal persistence. This causes a flickering effect (variable depending on the type of display) but a more accurate image (the temporal AA works on alpha-coverage aliasing and other aliasing where MSAA doesn't help).

You can disable this temporal AA by choosing "flicker reduction" in the display options (that's the GT5 Prologue display mode).

I don't know what effect is enabled in "sharpen" mode; it looks the same as the normal mode.

Good job. Can you further investigate the difference between the "normal" and "sharpen" modes, maybe some difference in scaling? And is the 2x temporal SSAA still present in "sharpen" mode?
 
It's a bit disappointing to see that, even with an ample amount of MSAA available, PD couldn't find a way to smooth out the shadows. :|
 
It's a bit disappointing to see that, even with an ample amount of MSAA available, PD couldn't find a way to smooth out the shadows. :|

Well, I'm surely wrong, but the demo is only 200 MB; the graphics could just be more "essential" compared to the final build.
 
If this is covered elsewhere, my apologies; I could not find it. Take a look at these two GT5 Academy shots:

720p
http://images.eurogamer.net/assets/articles//a/8/7/9/9/6/7/720p_000.jpg.jpg

1080p
http://images.eurogamer.net/assets/articles//a/8/7/9/9/6/7/1080p_000.jpg.jpg


Now my knowledge of graphics programming is almost non-existent, but I thought that 1080p would be sharper because it is firstly a higher resolution and secondly there is less anti-aliasing. But looking at those images the 720p image is clearly the sharper one; the dash in the 1080p shot looks blurry.

Are they using some cheap and rubbish AA technique in 1080p that is blurring everything? I bought a 1080p monitor and was looking forward to playing GT5 on it, but looking at those shots it seems like 720p will look better.


If you compare both screenshots at 1:1 it's a very biased comparison; the difference in pixel density doesn't reflect reality.
For an honest comparison you must upscale the 720p screenshot to 1080p (or downscale the 1080p one to 720p), so you compare them at the same display size, like on your TV.
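Something like this would level the comparison, e.g. with Pillow. The file names are placeholders for the two Eurogamer shots, and Lanczos is just one reasonable filter choice:

```python
# Scale both shots to the same display size before comparing them,
# as suggested above. File names are placeholders.
from PIL import Image

shot_720 = Image.open("720p_000.jpg")     # 1280x720 capture
shot_1080 = Image.open("1080p_000.jpg")   # 1920x1080 capture

# Upscale the 720p capture to the 1080p display size so both images
# cover the same on-screen area, as they would on the same TV.
shot_720_scaled = shot_720.resize((1920, 1080), Image.LANCZOS)
shot_720_scaled.save("720p_scaled_to_1080p.png")
```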
 
And temporal AA is inevitably disabled in screenshots, of course.

So screenshots (from Digital Foundry, for example) would always look the same, regardless of whether the "Flicker reduction" or "Normal" image quality mode was activated?

Would they always look as if the "Flicker reduction" image quality mode (the same mode as in Gran Turismo 5 Prologue, as you said) was used, even if "Normal" was selected?
 
The one drawback to Temporal AA (ATI had it for many years before finally dropping it) is that the game must maintain a minimum of 60 FPS, or you'll start to notice the different sample patterns as a bit of flickering or other artifacts.

And even at 60 FPS, some people were able to notice this. Once it dropped below 60 fps, the effect became really noticeable. I find it odd they've chosen to go with that method of AA.

On PC, although it had the potential to greatly increase AA quality, the fact that LCDs were gaining in popularity (capped at 60 Hz), plus the fact that many games didn't maintain minimum framerates of at least 60, meant that it was rarely if ever used unless you had a CRT monitor and a game that spent most or all of its time above 60 FPS.
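For illustration, the guard a renderer would need looks roughly like this. It's purely a sketch of the idea, not ATI's or anyone's actual driver logic:

```python
# Illustrative only: fall back to a static pattern when frames miss the
# refresh interval, since that's when the alternation becomes visible.

REFRESH_HZ = 60.0
FRAME_BUDGET = 1.0 / REFRESH_HZ   # ~16.7 ms per refresh

def choose_aa_mode(recent_frame_times_s):
    """Use temporal AA only while every recent frame fits the budget."""
    worst = max(recent_frame_times_s)
    return "temporal_aa" if worst <= FRAME_BUDGET else "static_msaa"

# One slow frame (20 ms) is enough to make the flicker noticeable.
print(choose_aa_mode([0.016, 0.016, 0.016]))  # -> temporal_aa
print(choose_aa_mode([0.016, 0.020, 0.016]))  # -> static_msaa
```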

Regards,
SB
 
The one drawback to Temporal AA (ATI had it for many years before finally dropping it) is that the game must maintain a minimum of 60 FPS, or you'll start to notice the different sample patterns as a bit of flickering or other artifacts.

And even at 60 FPS, some people were able to notice this. Once it dropped below 60 fps, the effect became really noticeable. I find it odd they've chosen to go with that method of AA.

Regards,
SB


Is there any performance penalty?
 
Well, I'm surely wrong, but the demo is only 200 MB; the graphics could just be more "essential" compared to the final build.

I don't think the size of the download affects the shadows. Aren't they a product of math operations? There are no "shadow textures".
 
Is there any performance penalty?

The performance penalty is the same as a standard method using the same number of samples. However, the perceived quality would be that of roughly double the sample density, depending on whether you can maintain a high enough framerate that your eye doesn't notice the alternating patterns.

So if you had for example alternating patterns of 4 samples, you'd have the performance hit of 4xAA but the potential perceived quality of 8xAA.

It looked and performed fantastic in games as long as framerate never dropped below 60 FPS (for most people). But again, that was not maintainable in the majority of new games.

Potentially you could up the framerate to 120 FPS and alternate 3 different sample patterns for even greater perceived quality.
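To make that trade-off concrete, here's a small sketch; the sample positions are made up, since real MSAA patterns differ per GPU:

```python
# Each frame pays for only one pattern (4 samples = 4xAA cost), but the
# eye integrates the union of the patterns it sees across frames.

PATTERNS = [
    [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)],      # frame A
    [(0.50, 0.125), (0.875, 0.50), (0.50, 0.875), (0.125, 0.50)],  # frame B
]

def pattern_for_frame(frame_index):
    return PATTERNS[frame_index % len(PATTERNS)]

paid = len(pattern_for_frame(0))
perceived = len({pos for f in range(len(PATTERNS)) for pos in pattern_for_frame(f)})
print(paid, "samples paid per frame,", perceived, "distinct positions perceived")
# -> 4 samples paid per frame, 8 distinct positions perceived.
# At 120 FPS with three distinct patterns the same idea would give 12.
```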

It's a really fascinating avenue for AA that I love, but it's not particularly practical for games where you may have performance dips that suddenly make it noticeable.

Regards,
SB
 