Image Quality and Framebuffer Analysis for Available/release build Games *Read the first post*

Are all PS3 shots 540p or just that particular one? Can't believe dynamic res changing is happening in COD games.

Not sure how you got dynamic res out of that statement. The problem with resolution identification at such high amounts of upscaling is that slight changes in step:pixel ratios can give different figures. I've covered this before, but as an example, take 540p, which is easily checked over, say, up to 40 pixels (30 steps). But if you then check 48 pixels, you might count 37 steps instead of 36, so it isn't really 540p, is it?

Or to take a more obvious example, 5/6 vs 4/5 ratio. You'd need a minimum of 30 pixels to discern between the two. Out of 30 pixels, you'd expect 25 steps for the former, and 24 steps for the latter. Scaled up to 720p, that's the difference between 600p and 576p, respectively.
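The counting logic above can be sketched in a few lines of Python (my own illustration of the idea, not anyone's actual analysis tool): over a span of upscaled output pixels, the number of distinct source rows ("steps") follows the native:output ratio, and two candidate ratios only separate cleanly over their full common period.

```python
from math import lcm  # Python 3.9+

def steps(native, output, span):
    # distinct source rows covered by `span` upscaled output pixels
    return span * native // output

# 540p -> 720p is a 3/4 step:pixel ratio, so 40 pixels show 30 steps
print(steps(540, 720, 40))                       # 30
# miscounting one step over a longer span changes the answer:
# 37 steps over 48 pixels implies a 720 * 37/48 = 555-line buffer, not 540
print(720 * 37 // 48)                            # 555
# 600p (5/6) vs 576p (4/5) only separate cleanly over a full common
# period of lcm(6, 5) = 30 pixels: 25 steps vs 24 steps
print(lcm(6, 5))                                 # 30
print(steps(600, 720, 30), steps(576, 720, 30))  # 25 24
```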

But anyway, 960x544 and 1040x608* for Black Ops seem close enough. In the end, I have to wonder if memory played a part, given that the difference between the two is about 100K pixels (is that much saved in ms per frame?), but then there's how Xenos resolves its MSAA'd render targets to main memory versus any other GPU. Also keep in mind that the 3D mode renders two separate frames as well.

*4/5 x 5/6 did not seem to hold consistently for the 360 version this time.
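For reference, the raw pixel arithmetic behind these figures (the 1024x600 baseline is my assumption, taken from the series' previous framebuffers; the 960x544 and 1040x608 figures are from the posts above):

```python
# pixel budgets quoted in the thread
ps3  = 960 * 544     # Black Ops PS3
x360 = 1040 * 608    # Black Ops 360
prev = 1024 * 600    # assumed prior COD framebuffer (1280*4/5 x 720*5/6)

print(ps3, x360, x360 - ps3)   # 522240 632320 110080  (~100K apart)
print(x360 - prev)             # 17920  (~18K gained over 1024x600)
print(prev - ps3)              # 92160  (~92K lost vs 1024x600)
```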
 
So the 360 gains ~18,000 pixels while the PS3 loses ~92,000 pixels and gets lower-res textures/transparencies? (God forbid it even gets lower framerates.)

Anyway, that's quite a drop in resolution; not sure what they were trying to pull off. If it's due to the lighting engine, then I don't think I'd call it a fair trade-off considering the immense loss of detail. Besides, WaW's lighting was close enough to Black Ops' (judging by the PC version). Have to wait for the DF article to get a better perspective.

Also, any idea regarding the resolution while running in 3D?
 
Surprised DF didn't mention anything about the file sizes... big difference in disc size. Gonna guess it's for the load-time cinematics, and the PS3 version probably inflated the file sizes to hamper piracy (redistribution of large files is more difficult).
 
Surprised DF didn't mention anything about the file sizes... big difference in disc size. Gonna guess it's for the load-time cinematics, and the PS3 version probably inflated the file sizes to hamper piracy (redistribution of large files is more difficult).
1. I believe the extra size is due to audio, since the PS3 version has 7.1 audio
2. Inflating file size to hamper piracy isn't a very effective tactic, since the pirates are experts at ripping :)
 
Not sure how you got dynamic res out of that statement. The problem with resolution identification at such high amounts of upscaling is that slight changes in step:pixel ratios can give different figures. I've covered this before, but as an example, take 540p, which is easily checked over, say, up to 40 pixels (30 steps). But if you then check 48 pixels, you might count 37 steps instead of 36, so it isn't really 540p, is it?

Or to take a more obvious example, 5/6 vs 4/5 ratio. You'd need a minimum of 30 pixels to discern between the two. Out of 30 pixels, you'd expect 25 steps for the former, and 24 steps for the latter. Scaled up to 720p, that's the difference between 600p and 576p, respectively.

But anyway, 960x544 and 1040x608* for Black Ops seem close enough. In the end, I have to wonder if memory played a part, given that the difference between the two is about 100K pixels (is that much saved in ms per frame?), but then there's how Xenos resolves its MSAA'd render targets to main memory versus any other GPU. Also keep in mind that the 3D mode renders two separate frames as well.

*4/5 x 5/6 did not seem to hold consistently for the 360 version this time.
That's quite a detailed explanation, Al, thanks. I just automatically assumed the same situation as Riddick's AA, but it turns out it's actually rendering at a lower resolution. Didn't think the res would drop this low, though.
 
I looked through this thread and the unreleased game thread but didn't see any posts about Naruto Ninja Storm 2. There doesn't seem to be much interest here for the game, but it's quite beautiful, and I was wondering what the res was for both versions. I thought I read somewhere that the PS3 version was 720p with no AA, while the 360 version was a bit lower res and again no AA.

Anyone know if this is correct?

If this is the case, I'm curious why, since a 720p framebuffer should fit inside the eDRAM when AA isn't being used, correct? I was hoping DF would do a face-off on the two versions, but I didn't see one on the site. Would like to know where the bottleneck was for the 360 version.
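A rough back-of-the-envelope check of that eDRAM claim, assuming a single render target at 4 bytes each of colour and depth/stencil against Xenos's 10 MB (real engines may use more targets or wider formats, which is exactly the unknown here):

```python
EDRAM = 10 * 1024 * 1024   # Xenos eDRAM, 10 MB

def rt_bytes(w, h, msaa=1, bpp_color=4, bpp_depth=4):
    # colour + depth/stencil, multiplied out per MSAA sample
    return w * h * msaa * (bpp_color + bpp_depth)

for label, w, h, msaa in [("720p no AA", 1280, 720, 1),
                          ("720p 2xMSAA", 1280, 720, 2),
                          ("1280x640 no AA", 1280, 640, 1)]:
    size = rt_bytes(w, h, msaa)
    print(f"{label}: {size / 2**20:.2f} MB, fits: {size <= EDRAM}")
```

So a plain 720p target does fit (~7.03 MB), which is why the 1280x640 choice suggests something else in the engine, not the no-AA framebuffer itself, was the constraint.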

It's a gorgeous game, but some of the details on the characters are lost when pulled back a bit.
 
I thought I read somewhere that the PS3 version was 720p no AA while the 360 version was a bit lower res and again no AA.

IIRC, it was 1280x720 on PS3, 1280x640 on 360.

If this is the case, I'm curious why, since a 720p framebuffer should fit inside the eDRAM when AA isn't being used, correct?
Hard to say without knowing their engine techniques.
 
Hey, thanks Al, much appreciated. Wish we had more info on the engine, since cel-shaded games rarely impress me, but this game looks great.
 
But GT5 doesn't use QAA ... it uses regular MSAA. Are you sure you've set up your TV correctly? And what kind of filter settings do you have enabled? Because GT5 supports four (I have it on default, haven't messed with it yet). Some people seem to prefer one of the other settings.

I thought the 1080p mode used QAA and the 720p mode used MSAA? At least I thought that's what AlStrong suggested somewhere, but I couldn't find the post.
 
According to Quaz51, 1080p mode uses 2xMSAA + temporal SSAA (in Normal mode) for a 4xAA effect. If you select flicker reduction, it disables the temporal SSAA, making it look more like GT5P. I find the temporal SSAA makes the image a little blurry.

Here's what he said
this demo uses 1280x1080 like Prologue, but with 2x MSAA mixed with 2x temporal SSAA (to simulate 4xAA in 1080p)
they shift the pixel rasterization between odd and even frames to simulate super-sampling like DMC or MGS, but they don't blend frames like DMC and MGS do (no blurry effect here). they just make use of display persistence and retinal persistence. this causes a flickering effect (variable according to the type of display) but a more accurate image (the temporal AA works on alpha-coverage aliasing and other aliasing where MSAA doesn't)

you can disable this temporal AA if you choose "flicker reduction" in the display options (it's the GT5 Prologue display mode)

i don't know what effect is enabled with the "sharpen" mode. it seems like the normal mode
This is for the Academy TT.
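Quaz51's description can be sketched numerically: point-sample a hard edge with the raster shifted half a pixel on alternating frames, and the persistence-blended result gains intermediate coverage levels, while the per-frame difference is the flicker he mentions. A toy 1-D illustration of the idea, not the actual GT5 implementation:

```python
EDGE = 3.4   # sub-pixel position of a hard vertical edge

def sample(x):
    # point-sample the edge: 1 to the right of it, 0 to the left
    return 1 if x >= EDGE else 0

width = 7
single = [sample(i + 0.5) for i in range(width)]    # no AA: binary steps
even   = [sample(i + 0.25) for i in range(width)]   # even frames, raster shifted
odd    = [sample(i + 0.75) for i in range(width)]   # odd frames, shifted the other way
fused  = [(a + b) / 2 for a, b in zip(even, odd)]   # what persistence averages out

print(single)  # [0, 0, 0, 1, 1, 1, 1]
print(fused)   # [0.0, 0.0, 0.0, 0.5, 1.0, 1.0, 1.0] -- an extra coverage level
print(even != odd)  # True: the two frames differ, hence the flicker
```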
 
Has there been a change to the framebuffer setup?
Also, is the framerate reduced to 30 fps while running in 3D?

The QAA appears to be a new decision from what I recall, but I could be wrong. I wasn't looking for it at the time. I haven't seen a framerate analysis run, but FWIW, the 3D mode is rendering two full frames.

edit:

Actually, looking back at the time trial shots, QAA may have already been there. Only reason I checked this time was because it showed up more readily in the 3D mode. With the higher pixel:texel ratio in 1080p mode, the blurring of finer texture details is mitigated somewhat since it's only looking at neighbouring pixels.
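A toy 1-D analogue of why QAA's neighbour blending hurts fine texture detail less at a higher pixel:texel ratio (weights here follow the Quincunx pattern collapsed to 1-D: centre at 1/2, the two shared neighbour samples at 1/4 each; my own sketch, not RSX's actual resolve):

```python
def tex(x, period):
    # 1-D "texture": alternating black/white stripes of the given period
    return 1.0 if int(x / period) % 2 == 0 else 0.0

def qaa_1d(i, period):
    # centre sample (1/2) plus two half-pixel samples shared with
    # neighbouring pixels (1/4 each)
    c = tex(i + 0.5, period)
    l = tex(i, period)
    r = tex(i + 1.0, period)
    return 0.5 * c + 0.25 * (l + r)

# one-texel-per-pixel stripes (fine detail): contrast gets halved
print([qaa_1d(i, 1.0) for i in range(4)])  # [0.75, 0.25, 0.75, 0.25]
# magnified texture, four pixels per texel: mostly untouched
print([qaa_1d(i, 4.0) for i in range(4)])  # [1.0, 1.0, 1.0, 0.75]
```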
 