Image Quality and Framebuffer Speculations for WIP/alpha/beta/E3 games *Read the first post*

Update: Insomniac confirms 720p for R&C. OT: my question is, confirming that R&C is indeed 720p by pointing to its 1280x720 framebuffer and so on isn't properly a lie, but just a simpler way of saying the same thing, right? :smile: Or is that unfair?

Yes, R&C uses a 1280x720 framebuffer... but with 960x704 rendering. They use all of the MSAA samples with a proprietary merge algorithm to reconstruct a true 1280x720 framebuffer (with small black borders) and add a proper 720p HUD.
I'm talking about OD and QB, and that's just what I see and my interpretation. For A Crack in Time I don't know.
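To illustrate what such a merge could look like (Insomniac's actual algorithm isn't public, so this is only a guessed sketch): with 2x horizontal MSAA on a 960-wide buffer you effectively have 1920 sample columns per row, which can be filtered down to 1280 output columns, and the 704 rows are then centred in the 720-line output behind the small black borders. A minimal C sketch of one such horizontal resolve, assuming a plain linear filter:

/* Hypothetical sketch only, not Insomniac's algorithm: merge one row of
   2x horizontal MSAA samples (960 pixels x 2 samples = 1920 columns)
   into a 1280-wide output row with a plain linear filter. */
#define SRC_SAMPLES (960 * 2)   /* 1920 horizontal samples per row */
#define DST_W 1280

void merge_row(const float src[SRC_SAMPLES], float dst[DST_W])
{
    for (int x = 0; x < DST_W; ++x) {
        /* map the output pixel centre back into sample space (ratio 1.5) */
        float pos = (x + 0.5f) * (float)SRC_SAMPLES / DST_W - 0.5f;
        int   i0  = (int)pos;
        int   i1  = i0 + 1 < SRC_SAMPLES ? i0 + 1 : SRC_SAMPLES - 1;
        float t   = pos - (float)i0;
        dst[x]    = src[i0] * (1.0f - t) + src[i1] * t;
    }
}

The real resolve presumably does something smarter per edge, but the sample budget (1920x704 samples feeding 1280x704 output pixels) is why the result can hold up reasonably well against plain 720p with no AA.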
 
Yes, R&C uses a 1280x720 framebuffer... but with 960x704 rendering. They use all of the MSAA samples with a proprietary merge algorithm to reconstruct a true 1280x720 framebuffer (with small black borders) and add a proper 720p HUD.
I'm talking about OD and QB, and that's just what I see and my interpretation. For A Crack in Time I don't know.
OK, but what do you think about the statement? Is it correct or not to call it 1280x720 after all, obtained with a 'particular proprietary algorithm'? :cry:
 
I'd say 'no'. You're still taking a smaller number of pixel samples. The end result may look as good and so be a worthwhile optimisation, but on paper I wouldn't call it 720p
 
I'd say 'no'. You're still taking a smaller number of pixel samples. The end result may look as good and so be a worthwhile optimisation, but on paper I wouldn't call it 720p
Yes, I agree (though the end result will probably look slightly worse), but to call it 720p is misrepresentation, i.e. bullshit.

just reread again -
"merge method of 960x704x2 MSAA samples into a 1280x720 "

I assume that's 1280x720 + no AA.
I'm not 100% sure what they're doing, but perhaps the resulting image is in fact better quality than doing straight 1280x720 + no AA, i.e. it's a bit of a strange example. I'm assuming their method is slower than doing straight 1280x720 + no AA, thus I assume they believe it looks better.
 
Not sure if this is the right place to ask, but what is the technical, qualitative assessment of the PS3 upscaler? Where does it stand for games?

I tried the Fifa10 demo on my Dell 2407FPW, letting the PS3 upscale and then letting the monitor upscale to 1080p. I found the PS3 upscaling a little more blurry. If I let the PS3 upscale, will I take some performance hit? I would think letting the monitor upscale would introduce a little more input lag.
 
Not sure if this is the right place to ask, but what is the technical, qualitative assessment of the PS3 upscaler? Where does it stand for games?

I tried the Fifa10 demo on my Dell 2407FPW, letting the PS3 upscale and then letting the monitor upscale to 1080p. I found the PS3 upscaling a little more blurry. If I let the PS3 upscale, will I take some performance hit? I would think letting the monitor upscale would introduce a little more input lag.

I think the real problem isn't the upscaling but the QAA. QAA + upscaling is a terrible combination.
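For anyone wondering why QAA softens the image so much: a quincunx resolve builds each pixel from five samples, its own centre sample plus four corner samples shared with neighbouring pixels, with the usual 1/2 and 4x1/8 weighting, so every pixel already contains a bit of its neighbours before any scaler touches it. A rough illustrative sketch (simplified sample layout, not the actual RSX resolve):

/* Illustrative quincunx-style resolve, not the actual RSX hardware path:
   out(x,y) = 1/2 * centre sample + 1/8 * each of the four corner samples.
   The corner grid is (w+1) x (h+1); corner (x,y) is shared by the four
   pixels that touch it, which is where the blur comes from. */
void quincunx_resolve(const float *centre, const float *corner,
                      float *out, int w, int h)
{
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x) {
            float c  = centre[y * w + x];
            float tl = corner[ y      * (w + 1) + x    ];
            float tr = corner[ y      * (w + 1) + x + 1];
            float bl = corner[(y + 1) * (w + 1) + x    ];
            float br = corner[(y + 1) * (w + 1) + x + 1];
            out[y * w + x] = 0.5f * c + 0.125f * (tl + tr + bl + br);
        }
}

Upscaling then interpolates those already-blended pixels a second time, which is why the combination looks so soft.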
 
I'd say 'no'. You're still taking a smaller number of pixel samples. The end result may look as good and so be a worthwhile optimisation, but on paper I wouldn't call it 720p

OK, thanks. OT: this algorithm is very interesting; it could be very intriguing in a game like Modern Warfare 2. It's a pity third parties don't have access to it.
 
Yes, I agree (though the end result will probably look slightly worse), but to call it 720p is misrepresentation, i.e. bullshit.

just reread again -
"merge method of 960x704x2 MSAA samples into a 1280x720 "

I assume that's 1280x720 + no AA.
I'm not 100% sure what they're doing, but perhaps the resulting image is in fact better quality than doing straight 1280x720 + no AA, i.e. it's a bit of a strange example. I'm assuming their method is slower than doing straight 1280x720 + no AA, thus I assume they believe it looks better.

Nah, you're exaggerating a bit; we're not talking about upscaling here. The game doesn't draw 1280x720 worth of pixels, OK, but the framebuffer and the HUD are 1280x720, so it isn't properly a misrepresentation. ND never talked about the rendering resolution, after all.
 
Yes, I agree (though the end result will probably look slightly worse), but to call it 720p is misrepresentation, i.e. bullshit.

just reread again -
"merge method of 960x704x2 MSAA samples into a 1280x720 "

I assume that's 1280x720 + no AA.
I'm not 100% sure what they're doing, but perhaps the resulting image is in fact better quality than doing straight 1280x720 + no AA, i.e. it's a bit of a strange example. I'm assuming their method is slower than doing straight 1280x720 + no AA, thus I assume they believe it looks better.

Why not read the Insomniac papers instead of assuming? The reason for the 704 is that it fits perfectly into 64x64 tiles, i.e. 704/64 = 11, so no waste, while 720/64 = 11.25. It's an optimisation, and with most TVs applying overscan, most of us will never see those black borders, 8 pixels tall at each end....
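The tile arithmetic is easy to sanity-check; a trivial helper (my own illustration, using the 64-pixel tile size from that post) rounds a dimension down to the nearest whole tile:

#include <stdio.h>

/* Round a dimension down to the nearest whole multiple of the tile size. */
static int tile_aligned(int dim, int tile) { return (dim / tile) * tile; }

int main(void)
{
    printf("%d\n", tile_aligned(720, 64));             /* 704: 11 full 64-pixel tiles */
    printf("%d\n", (720 - tile_aligned(720, 64)) / 2); /* 8: black border at each end */
    return 0;
}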
 
Not sure if this is the right place to ask, but what is the technical, qualitative assessment of the PS3 upscaler? Where does it stand for games?

I tried the Fifa10 demo on my Dell 2407FPW, letting the PS3 upscale and then letting the monitor upscale to 1080p. I found the PS3 upscaling a little more blurry. If I let the PS3 upscale, will I take some performance hit? I would think letting the monitor upscale would introduce a little more input lag.
You should let your monitor do the upscaling. Scaling on PS3 is mostly a software process and takes memory away from the game.

Many PS3 games that nominally support 1080p output are really only optimized to give owners of these weird only-in-America, only-1080i "HDTVs" a better option than 480p. The resulting 1080p output may look blurrier than 720p.

Just tick all input resolutions in display settings that your monitor can accept (or let it auto-detect, same result). That way any native 1080p games can still look their best, while the overwhelming majority of ~native 720p games can output their ideal resolution directly as well.

The game resolution summary thread has extensive info on how the PS3 scaling works.
 
You should let your monitor do the upscaling. Scaling on PS3 is mostly a software process and takes memory away from the game.

Many PS3 games that nominally support 1080p output are really only optimized to give owners of these weird only-in-America, only-1080i "HDTVs" a better option than 480p. The resulting 1080p output may look blurrier than 720p.

Just tick all input resolutions in display settings that your monitor can accept (or let it auto-detect, same result). That way any native 1080p games can still look their best, while the overwhelming majority of ~native 720p games can output their ideal resolution directly as well.

The game resolution summary thread has extensive info on how the PS3 scaling works.

From the summary, letting the PS3 upscale to 1080p means I get more pixel detail but at the cost of fillrate and memory? That sounds really good as it adds no additional display lag. The question, if anyone knows: how will that extra fillrate and memory cost be handled in a game, say Fifa10? Do I get more hiccups in the framerate, or will some textures/effects be turned down? If EA built the engine with 1080p scaling in mind, is it unlikely to lose graphics detail, because the extra fillrate/memory has already been reserved?

Hey B3D gurus, I think this is a nice area to investigate, to add to the above summary: native 720p vs. 1080p PS3 upscaling, what are the costs and benefits? :)
 
Many PS3 games that nominally support 1080p output are really only optimized to give owners of these weird only-in-America and Japan, only-1080i "HDTVs" a better option than 480p. The resulting 1080p output may look blurrier than 720p.

There, made it a bit more correct. :) And probably all NTSC-using countries. And yes, it's basically required so that NTSC TVs that are HDTV-ready (with no 720p support) won't have to default back down to 480p.

In most cases, setting the game to 720p and letting your TV do the upscaling is the better option for PS3.

Regards,
SB
 
From the summary, letting the PS3 upscale to 1080p means I get more pixel detail but at the cost of fillrate and memory? That sounds really good as it adds no additional display lag. The question, if anyone knows: how will that extra fillrate and memory cost be handled in a game, say Fifa10? Do I get more hiccups in the framerate, or will some textures/effects be turned down? If EA built the engine with 1080p scaling in mind, is it unlikely to lose graphics detail, because the extra fillrate/memory has already been reserved?

Hey B3D gurus, I think this is a nice area to investigate, to add to the above summary: native 720p vs. 1080p PS3 upscaling, what are the costs and benefits? :)
There's more at the link I gave you. We're retreading an old topic.

Software upscaling isn't about fillrate, it's not done on the GPU at all. Bandwidth usage is very light as well. It's about another chunk of memory to store the upscaled image for scanout.
With a fully general hardware scaler inside the scanout logic itself, the upscaled image would never be stored anywhere. The pixels would be interpolated on the fly, the moment they are sent over the HDMI/whatever output.
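To put a rough number on that memory cost (my own back-of-the-envelope figures, assuming a plain 32-bit colour buffer; the real scanout format may differ):

#include <stdio.h>

/* Back-of-the-envelope scanout image sizes at 4 bytes per pixel. */
int main(void)
{
    const double MiB = 1024.0 * 1024.0;
    printf("720p  image: %.1f MiB\n", 1280 *  720 * 4 / MiB);  /* ~3.5 MiB */
    printf("1080p image: %.1f MiB\n", 1920 * 1080 * 4 / MiB);  /* ~7.9 MiB */
    return 0;
}

So a software 1080p upscale needs roughly an extra 8 MiB resident for the scaled image, which is the "chunk of memory" referred to above, whereas a scaler in the scanout path would need none of it.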
 
Ehm, 480x272 no AA? :D


Heh, not quite: that permanent HUD reduces the rendered image by a few lines (not sure how many). It also looks to be 16-bit colour (like most PSP titles).

Edit: Just seen some shots with a transparent HUD; guess it's running at full native resolution now then.
 
From the summary, letting the PS3 upscale to 1080p means I get more pixel detail but at the cost of fillrate and memory? That sounds really good as it adds no additional display lag. The question, if anyone knows: how will that extra fillrate and memory cost be handled in a game, say Fifa10? Do I get more hiccups in the framerate, or will some textures/effects be turned down? If EA built the engine with 1080p scaling in mind, is it unlikely to lose graphics detail, because the extra fillrate/memory has already been reserved?

Hey B3D gurus, I think this is a nice area to investigate, to add to the above summary: native 720p vs. 1080p PS3 upscaling, what are the costs and benefits? :)

I would really be interested in a test to see how much display lag is caused by TV scaling 720p to 1080p.

Maybe a test of an entry-level TV like a 60Hz Vizio (which has also been the largest-selling brand in the US for a couple of years) and a mid-tier TV like a Samsung.

How would you go about testing that, though? I know there's a Digital Foundry article on input lag, but could the same methods be used to measure display lag?
 
Has anyone taken a look at Lost Planet 2 yet? There is a PS3 demo coming out next week as well and I'm interested to see if they managed to close the performance gap with the newer version of MT Framework.
 