Image Quality and Framebuffer Speculations for Unreleased Games - Part 1

Why this thread:
The purpose of this thread is to keep the speculation talk, based on dubious materials, out of the main image quality thread (which can be found here).

Basic rules and guidelines:

I'll start right off with the TL;DR version:

TL;DR:
- Discuss the image quality of renders, based on promotional screenshots and other dubious sources.
- Don't share your opinion on blurred or shimmery surfaces. "It's obvious for all to see" is not a technical argument.
- Don't waste your time, and especially ours, telling us nobody cares about this topic. If you don't care, just don't read the thread and move along.
- Talk should remain technical at all times. What you personally prefer is not the topic at all. Start a new thread, or try to fit that into an existing thread. Just not this one.
- Yes, you're obvious when you stealth-troll. Keep that stuff out of this thread, or don't act surprised when your posting rights get removed.

Full Version:
This thread is dedicated to WIP builds, preview builds and marketing screenshots. The talk must stay on a technical level at all times. Spare us the "I actually prefer blurred-out edges" lines. Nobody is calling you a liar, or disagreeing; it's just that it has nothing to do with technology matters. At the very least, you could point to a paper or ongoing scientific discussion on how the human eye and brain perceive computer-generated graphics... but even then, it would be off-topic and would deserve its own thread. You surely have an opinion on the end goal of scrutinising the image quality of games, just as I'm sure you've got an opinion on global warming or the relevance of reality TV shows. It's just that nobody in this thread wants to read about it. I'd suggest starting a blog on some web 2.0 social site and feeding intelligence agency databases with your personal information. Or just go to the General Discussion forum to vent your frustration, if you must.
To the fan persons on a mission to prove that one really can have nothing better to do than get internet-angry over some video-gaming comment made by some stranger: no, your cunning plan to thinly veil your pathetic love for one expensive consumer-electronics toy over another behind tech talk you barely care about fools no one. Add to that the fact that B3D is not a democracy, so don't expect the benefit of the doubt when you stealth-troll in this thread.

How-To and other useful explanations:

How to tell the resolution of a backbuffer based on the outputted image:
Someone has yet to take the time to compile all the relevant information in a single comprehensive post. So bear with us and track down some of that information in the following posts:
http://forum.beyond3d.com/showpost.php?p=1070774&postcount=273
http://forum.beyond3d.com/showpost.php?p=1070972&postcount=282
http://forum.beyond3d.com/showpost.php?p=1071006&postcount=284
http://forum.beyond3d.com/showpost.php?p=1071084&postcount=292
http://forum.beyond3d.com/showpost.php?p=1065791&postcount=225
http://forum.beyond3d.com/showpost.php?p=1065280&postcount=222
http://forum.beyond3d.com/showpost.php?p=1167507&postcount=29

Frame Buffer Calculations & Memory Consumption
Back-Buffer(s) = Pixels * FSAA Depth * Rendering Colour Depth (may include multiple render targets for deferred rendering techniques)
Z-Buffer = Pixels * FSAA Depth * Z Depth (usually 32-bit depth)
Front-Buffer(s) = Pixels * Output Colour Depth (this is what you see, almost always resolved to 8 bits per component rather than 10 or 16)
Total = Back-Buffer(s) + Z-Buffer + Front-Buffer(s)

Note: For Xenos, the back buffer and z-buffer must fit within the 10 MiB of eDRAM (10*1024*1024 bytes) to avoid tiling.
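The formulas above are easy to mis-multiply by hand, so here is a minimal sketch of them in Python (used purely as a calculator; the function names are illustrative, not from any SDK, and all bit depths are assumed to be whole bytes):

```python
MIB = 1024 * 1024

def framebuffer_bytes(width, height, msaa=1, colour_bits=32, z_bits=32, out_bits=32):
    """Total = back buffer + Z-buffer + front buffer, in bytes (one render target)."""
    pixels = width * height
    back  = pixels * msaa * colour_bits // 8   # back buffer, multisampled
    z     = pixels * msaa * z_bits // 8        # Z-buffer, multisampled
    front = pixels * out_bits // 8             # resolved front buffer
    return back + z + front

def fits_xenos_edram(width, height, msaa, colour_bits=32, z_bits=32):
    """On Xenos, back buffer + Z must fit in 10 MiB of eDRAM to avoid tiling."""
    pixels = width * height
    return pixels * msaa * (colour_bits + z_bits) // 8 <= 10 * MIB

print(fits_xenos_edram(1120, 585, 2))   # 1120x585 with 2xMSAA squeaks in
print(fits_xenos_edram(1280, 720, 2))   # full 720p with 2xMSAA does not
print(framebuffer_bytes(1280, 720))     # 720p, no AA, 32-bit colour and Z
```

Note that 1280x720 with 2xMSAA and 32-bit colour + Z needs about 14 MiB for back buffer plus Z, which is why tiling comes up so often in these discussions.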
Frame buffer formats
RGBA8 = 8 bits each for red, green, blue and alpha = 32 bits per pixel
FP10 = 10 bits for each RGB component, and 2 bits for alpha = 32 bits per pixel
FP16 = 16 bits for each component = 64 bits per pixel (no support for hardware alpha blending on Xenos)
NAO32 = LogLuv conversion (in-depth explanation) = 32 bits per pixel (no hardware alpha blending)

Example: 1120x585, 2xMSAA, FP10, 32-bit Z-buffer
back buffer + Z = 1120*585*2*(32/8 + 32/8) = 10483200 bytes = 9.9975... MiB

Aspect Ratio
Some of you might be wondering how games like Call of Duty 4 (1.71:1), Halo 3 (1.8:1) or Metal Gear Solid 4 (4:3) can have rendering resolutions that are not in a 16:9 aspect ratio. All you need to learn about is anamorphic widescreen: the image is squeezed into the rendered resolution but is then stretched to the proper 16:9 presentation. An example of this squeezing can easily be seen in any Doom 3 engine game (Quake 4/Prey/Quake Wars). If you have one of them handy on your PC (the latest version will do), try setting your resolution to 960x720 and, in the console, typing r_aspectratio 1 for 16:9 or 2 for 16:10. All you'll see is the in-game view being squeezed/stretched horizontally. On the flip side, if you render the game at 1280x720 while still in 4:3 mode, the Mancubus just might be the fattest enemy you'll ever see. You can help it lose some weight by setting the game to 16:9. And of course, a native (non-anamorphic) 1280x720 rendition will offer more image clarity than an anamorphic 960x720 one.

Multisample AA
Multiple geometry/sub-sample points (reddish squares in the images below), with particular weightings, surround the texture sample point (green square in the images below) and are used to determine the colour of the pixel being rendered. Sample positions can differ between AMD/nVidia hardware. As RSX is based on the G70 architecture, the following sample patterns should apply.
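Before getting into the specific sample patterns, the resolve step itself is worth pinning down: with equal weightings, a multisampled pixel's final colour is just the average of its sub-sample colours. A minimal sketch (colour values are made up for illustration):

```python
# Equal-weight MSAA resolve for a single pixel.
# Each colour is an (R, G, B) tuple in the 0-255 range.
def resolve(sub_samples):
    n = len(sub_samples)
    return tuple(sum(c[i] for c in sub_samples) // n for i in range(3))

# 2xMSAA on a black/white polygon edge: one sample lands in each polygon,
# giving the single intermediary shade (50% A, 50% B).
edge_2x = resolve([(0, 0, 0), (255, 255, 255)])

# 4xMSAA with three of four samples in the white polygon: one of the three
# possible intermediary shades, hence the smoother edge gradient.
edge_4x = resolve([(0, 0, 0), (255, 255, 255), (255, 255, 255), (255, 255, 255)])

print(edge_2x)  # mid grey
print(edge_4x)  # lighter grey
```

The sub-sample *positions* (which this sketch ignores) only decide how often each of those shades occurs along an edge; that is where the G70/R520 pattern differences below come in.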
In the case of Xenos, it would not be unreasonable to assume that it uses the same patterns as ATI has used in the past (R300+).

The result for 2xMSAA is that there may be one intermediary shade between polygon edge steps: one sample falls within one polygon (e.g. colour A), and the second sample falls in another polygon (e.g. colour B). If both sample points have equal weightings, the resultant pixel will be 50% colour A, 50% colour B. The clearest results are obtained when a polygon edge bisects the shortest imaginary line connecting the two geometry sample points. Hence, 2xMSAA on G70 will look slightly different to 2xMSAA on R520 (see the sample positions below).

For 4xMSAA, there may be more shades between polygon edge steps due to the higher number of geometry samples, resulting in a smoother transition between steps. With equal weightings for each sub-sample, there will be three intermediary shades.

The easiest way to see the MSAA level is to find a straight-edged object overlapping another object or the background with a high colour contrast between the two, e.g. a black object against a white background. Beware of JPG-compressed screenshots, where pixels near high-frequency components (edges) can be distorted.

G70 sample patterns
2xMSAA http://www.beyond3d.com/images/reviews/g70/2x.png
4xMSAA http://www.beyond3d.com/images/reviews/g70/4x.png

R520 sample patterns
2xMSAA http://www.beyond3d.com/images/reviews/R520/aa_msaa_samp_2x.gif
4xMSAA http://www.beyond3d.com/images/reviews/R520/aa_msaa_samp_4x.gif

Quincunx AA on PS3 - two geometry sample points are used just like in 2xMSAA (so the same storage cost), but it also uses three samples belonging to neighbouring pixels (regardless of any polygon edge) to the right of and below the original texture sample point (see the sample pattern image for clarity). The result is a blurring of the entire image, but higher perceived polygon AA.
Consider a texture with lots of high-frequency components, i.e. many different colour patterns. The current pixel's two geometry sample points may indicate that the pixel is entirely within one polygon; however, the three neighbouring sub-samples are still accounted for in the final pixel, hence the overall image blur.

Quincunx sample pattern
http://www.beyond3d.com/images/reviews/GF4/gf4samplepattern.jpg
Comparison between 2xMSAA/QAA & blur filters
http://upsilandre.free.fr/images/Quincunx.jpg

Temporal AA on PS3 (à la Quaz51) - odd and even frames are rendered with a half-pixel shift. The current frame is blended with the previous frame to achieve an effect similar to super-sample AA for static scenes. In a moving scene, the blending of the odd and even frames produces a persistent blurring of the image. However, this is also advantageous for the edges of alpha-to-coverage primitives, because traditional multisampling does not work on them*; only super sampling does.

*see the transparency AA or adaptive AA settings on appropriate PC hardware.

Black Levels & Output
Xbox 360: Standard = 16-235, Intermediate, Expanded = 0-255
PS3: Limited Range = 16-235, Full Range = 0-255
Wii: undocumented at the moment (feel free to address the issue)

Current list of game rendering resolutions and IQ
List for Playstation 3 Games
List for Xbox 360 Games

Archive of the older threads on the same topic:
Thread 1: The Neverending Upscale Discussion Thread * Summary=#457
http://forum.beyond3d.com/showthread.php?t=43330
Thread 2: Neverending Upscaling/Resolutions/AA etc Thread #2 *Rules: post #616*
http://forum.beyond3d.com/showthread.php?t=46242
Thread 3: Neverending Upscaling/Resolutions/AA etc Thread #3 (Rules Post #1!)
http://forum.beyond3d.com/showthread.php?t=48252

Footnotes
Thanks to all the people who contributed to the list, took captures and wrote large parts of the explanations: AlStrong, Quaz51/Upsilandre, grandmaster, dot50cal, etc.
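A closing numeric note on the Black Levels & Output section above: the limited (16-235) to full (0-255) mapping is a simple linear remap. A sketch of the arithmetic only (how each console's video output actually applies it is not documented here):

```python
# Linear remap between limited-range (16-235) and full-range (0-255) video levels.
def expand_range(y):
    """Limited (16-235) -> full (0-255); values outside 16-235 are clamped."""
    y = min(max(y, 16), 235)
    return round((y - 16) * 255 / (235 - 16))

def compress_range(v):
    """Full (0-255) -> limited (16-235)."""
    return round(16 + v * (235 - 16) / 255)

print(expand_range(16), expand_range(235))   # black and white endpoints
```

This is why a limited-range signal shown on a full-range display looks washed out (black sits at 16, not 0), and a full-range signal on a limited-range display crushes shadows and clips highlights.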