Changes in display from switching to HDMI *moved*

Disembodied voice of the Mod: Moved from the IQ analysis thread.

I'm not sure if it's appropriate to post this here... (So you should have posted it elsewhere ;)) ...but I recently got the newer 360 model, mainly for the HDMI. I was using my old 360 with the component cable, which was actually great: other than some minor overscan issues, the color looked vibrant, the image was sharp, etc.

So I was expecting only the best from the HDMI...

And I'm really surprised to find that it's actually worse over HDMI. I have no idea what causes this; maybe it's just my TV, since it's a Sony Bravia and probably more optimized for the PS3's color space, but the 360 over HDMI just looked bad compared to the PS3.

http://www3.telus.net/public/dhwag/HDMI360vsPS3.jpg

This is BioShock at the exact same brightness setting; the TV settings are of course identical as well, and I even used the same HDMI cable (Monster).


What's even more surprising is Sonic.

http://d.hatena.ne.jp/yoda-dip-jp/

The screen captures from our friend in Japan show that it's the PS3 version that has the washed-out color and the blur, but when I was running the demo on my TV, it was exactly the opposite.

http://www3.telus.net/public/dhwag/HDMI360vsPS3A.jpg

(The PS3 is set to RGB 'Limited' and the 360's Reference Levels to Standard.)


http://www3.telus.net/public/dhwag/HDMI360vsPS3B.jpg

This one's Tomb Raider UW, and the overall IQ difference is simply staggering.


Now this makes me wonder how things are on HDTVs from other brands, like Samsung, LG, Sharp, etc.

At least on my Bravia, all these IQ comparisons based on screen captures seem completely irrelevant :???:

Mod: The analysis completely avoids these variations due to setup, letting us know exactly what the engine is outputting. In terms of colour intensity, contrast, etc., if we want we can use evaluation tools (histograms) to know exactly which has higher contrast, saturation, etc. on output, without the choice of display or cabling having any effect.
 
Not all devices use the same settings on your TV; doing a basic calibration (brightness, contrast, etc.) is recommended for every device.
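To put the histogram idea into practice, here's a minimal sketch in Python (assuming Pillow and NumPy are available; the two file names are placeholders for whatever lossless grabs you have) that compares the luma histogram, mean brightness, and spread of two captures, independent of whatever the display or cabling then does to them:

```python
# Minimal sketch: objectively comparing two captures' brightness/contrast.
from PIL import Image
import numpy as np

def luma_histogram(path):
    # Greyscale (luma) histogram over 0-255; the mean is a rough measure of
    # overall brightness, the standard deviation a rough proxy for contrast.
    img = np.asarray(Image.open(path).convert("L"))
    hist = np.bincount(img.ravel(), minlength=256)
    return hist, img.mean(), img.std()

for name in ("capture_360.png", "capture_ps3.png"):  # placeholder file names
    hist, mean, std = luma_histogram(name)
    print(f"{name}: mean luma {mean:.1f}, std dev {std:.1f}, "
          f"pixels at pure black: {hist[0]}, at pure white: {hist[255]}")
```

A higher mean roughly means a brighter output and a larger standard deviation a punchier, higher-contrast one, while big pile-ups at 0 or 255 hint at crushed blacks or clipped whites.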
 
I am very surprised by your BioShock results, not about the colors but by the texture detail in the PS3 version.

Unlike the comparison images we were getting some time ago, the detail here seems to be intact and identical to the 360 version.
 
I just figured out what happened:

The 360 was just running at an odd resolution of 1280 x 768 (duh).

So the image was output without any of the HDTV's image processing; that's why it didn't look as good as the PS3's.

Now that the problem's fixed, everything looks beautiful :cool:

However, it still leaves me wondering: how accurate is a capture of the PS3's frame buffer?

The standard HDMI RGB signal ranges from 16~235, which is usually why people get washed-out color when they're using a 0~255 RGB HDTV or monitor. There's of course the RGB Full option, which supposedly stretches the 16~235 signal to 0~255, and my question is whether there's any kind of loss in the process. Because even with RGB Full, many of the PS3 captures look plain washed out, and it's definitely far from the color the game displays on the HDTV.
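For what it's worth, here's a rough numeric sketch (my own illustration, not the PS3's actual code) of what stretching 16~235 to 0~255 does: every limited-range level maps to its own full-range code, so no levels are merged, but 36 of the 256 output codes are simply never produced, which would show up as slight banding rather than washed-out color.

```python
# Rough sketch of expanding limited-range RGB (16-235) to full range (0-255).
def limited_to_full(v):
    v = min(max(v, 16), 235)            # clamp to the legal limited range
    return round((v - 16) * 255 / 219)  # maps 16 -> 0 and 235 -> 255

outputs = {limited_to_full(v) for v in range(16, 236)}
print(len(outputs), "of 256 output codes are used")  # 220 of 256
print(limited_to_full(16), limited_to_full(235))     # 0 255
```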
 
I am very surprised by your BioShock results, not about the colors but by the texture detail in the PS3 version.

Unlike the comparison images we were getting some time ago, the detail here seems to be intact and identical to the 360 version.

With the 1.1 patch, the blur filter's gone. However, the PS3 version is still running at sub-720p (680p, to be exact), so the overall IQ is still lower than the 360 version.
 
AFAIK, the PS3 properly maps 16->0 and 235->255, but you should only enable this if your display is expecting that range; otherwise you will be clipping detail below 16 and above 235.
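As a quick sketch of the clipping case being described (made-up values, not any console's real pipeline): if a full-range 0~255 picture ends up treated as limited range somewhere downstream, all the near-black and near-white codes collapse and that detail is gone.

```python
# If a full-range picture is interpreted as limited range (16-235),
# everything below 16 shows as black and everything above 235 as white.
def treat_as_limited(v):
    return min(max(v, 16), 235)

shadow_detail = range(0, 16)  # 16 distinct near-black codes in a full-range image
print(sorted({treat_as_limited(v) for v in shadow_detail}))  # all collapse to [16]
```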
 
I was going to ask if you were running the correct output resolution on the X360 when I saw those shots of TRU. It seems my gut feeling was correct.
 