Will next gen consoles focus on improving IQ at current HD resolutions?

It was not about display quality, just about whether chroma subsampling occurs or not. The latter doesn't require a high-end display to be noticed at all. Of course that only applies if the source is full 4:4:4 (games and the PC desktop, for example).

Maybe you just haven't seen a direct comparison of the difference?

When you have displays which are backlit (or worse, edge-lit) there just isn't that much variance possible between neighbouring pixels. Most displays are crap if you want individual, decoupled pixels with good colour reproduction. There are displays that are good, but they are also expensive. Look at that Sony Crystal LED display shown at CES: it would be perfect, as each pixel can be set individually (though I doubt we will ever see that set sold, and if we do, the price will be five digits plus).
 
And the list of TVs supporting it in post #2 is rather short.

And what are you trying to say with that?

You don't seem to get that this is not due to incapable panels or something like that in certain HDTVs, but due to the set internally processing the picture with chroma subsampling.
 
But you also have to realise that most TVs don't have control over individual pixels. When you put a black and a white pixel next to each other, you are not going to get black and white, due to the way the pixels are lit. You will get either white and grey or grey and black. This is largely the reason why newer sets with better backlighting can keep improving image quality. Because we don't have good enough control over individual pixels' colour intensities, the difference between YUV 4:2:0 and 4:4:4 becomes largely academic unless you have a high-end TV with good colour reproduction. This is also the reason there was so much raving at CES about the colour reproduction of some new expensive sets like the Sony Crystal LED and the Samsung I linked earlier.
 
To me the more interesting variant of YUV 4:4:4 is the one with 10 bits per channel as opposed to 8. 10 bits per channel combined with a good display would be really, really nice. It's not so much about chroma, even though higher chroma resolution would be nice too.

The only place I see 10-bit support available is in very high-end monitors designed for Photoshop work and other image manipulation.
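
For anyone wondering what those extra two bits actually buy you, the arithmetic is simple. A trivial Python sketch (nothing display-specific here, it just counts representable values):

# Distinct levels per channel and total RGB colours at 8 vs. 10 bits.
for bits in (8, 10):
    levels = 2 ** bits       # values a single channel can take
    colours = levels ** 3    # distinct R/G/B triplets overall
    print(f"{bits}-bit: {levels} levels per channel, {colours:,} colours")

# Prints:
# 8-bit: 256 levels per channel, 16,777,216 colours
# 10-bit: 1024 levels per channel, 1,073,741,824 colours

Four times finer gradation per channel is where the smoother gradients on a 10-bit panel would come from.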
 
Well, in some cases the new consoles are too removed from the old to handle direct renders, e.g. any game exploiting PS2's incredible overdraw and transparency is going to hit a wall on PS3. A beast with a zillion fur polygons just can't be rendered the same way on PS3.

OT: It's quite funny to only realise something like this now, because as I remember it, PS2 hardware at the time was constantly denigrated in every aspect by most people compared to the competition... I'm pretty curious to see, once the PS4 is mature and the PS3 definitely dead, whether the Cell CPU will suffer the same fate... the AMD CPU in the PS4 scares me enough.
 
Even at full 1080p, with overscan and all user-accessible post-processing disabled, many HDTVs today unfortunately apply chroma subsampling to 4:4:4 input signals, thus not reproducing full color resolution, and far too often this cannot be bypassed. In the few cases where it can be, bypassing it far too often introduces other disadvantages.

As PS3, Xbox 360 and PC games for example output full 4:4:4 (RGB is always 4:4:4), unfortunately many HDTVs today do not reproduce them pixel-perfectly because they subsample from 4:4:4 down to 4:2:2 (or maybe even 4:2:0), regardless of whether the output is 720p or 1080p and regardless of whether overscan is enabled or disabled.
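
To make concrete what that subsampling throws away, here is a minimal numpy sketch of a 4:2:0 round trip on a full 4:4:4 RGB source. It assumes the standard full-range BT.601 matrices and plain 2x2 chroma averaging; what any given TV actually does internally is undocumented, so treat this purely as an illustration:

import numpy as np

def rgb_to_ycbcr(rgb):
    # Full-range BT.601 forward matrix.
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128.0
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128.0
    return np.stack([y, cb, cr], axis=-1)

def ycbcr_to_rgb(ycc):
    # Inverse of the matrix above, clipped back to the valid 8-bit range.
    y, cb, cr = ycc[..., 0], ycc[..., 1] - 128.0, ycc[..., 2] - 128.0
    r = y + 1.402 * cr
    g = y - 0.344136 * cb - 0.714136 * cr
    b = y + 1.772 * cb
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 255.0)

def simulate_420(rgb):
    # Convert to YCbCr, average each chroma plane over 2x2 blocks and
    # replicate it back: half the chroma resolution both ways, luma intact.
    ycc = rgb_to_ycbcr(rgb.astype(np.float64))
    h, w = ycc.shape[:2]
    for c in (1, 2):
        avg = ycc[..., c].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        ycc[..., c] = np.repeat(np.repeat(avg, 2, axis=0), 2, axis=1)
    return ycbcr_to_rgb(ycc).astype(np.uint8)

# Worst case for chroma subsampling: alternating pure red and blue columns.
img = np.zeros((4, 4, 3), np.uint8)
img[:, 0::2] = (255, 0, 0)   # red columns
img[:, 1::2] = (0, 0, 255)   # blue columns
print(simulate_420(img)[0])  # both columns come back as muddy purples

The luma plane survives untouched, which is why subsampled text stays sharp in brightness but smears in colour, the effect the comparison screenshots below try to show.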
I wonder if people can really notice 4:4:4 vs. 4:2:2 that easily. Most HD video cameras in the sub-$5000 range record 4:2:2 or worse, and I don't think people can tell much of a difference.

Nobody was talking about video footage, which is stored at 4:2:0 almost always anyway (even on Blu-ray Disc) ;).

It's about gaming output (PS3/Xbox 360/PC for example) and PC desktop output as another example, which is full 4:4:4 RGB ;).
Yeah, I understand that; I just question whether the average Joe would notice the difference when playing a PC game on a 4:4:4 display compared to a 4:2:0 display. I can spot it mostly on reds, where it looks a bit mucked up, but only if I'm actively looking for it. When video editing I can clearly see the issue of a 4:2:0 source on the reds, but when watching the footage more casually it's harder to spot.

By the way, just came across the following illustration:


4:4:4:

rgbk3olr.png


4:2:0:

yuv420amq5v.png


:eek:

You still think it wouldn't make a big difference :???:?
 
Joker didn't say it doesn't make a difference. He said it makes a difference that he doubts anyone will really notice. Your example is an unrealistic worst-case situation, rarely present in real games, and hence chroma subsampling won't be noticeable to many people on a lot of occasions in actual use. That's why these solutions are used: because Joe Public doesn't massively notice. Clearly, if monitors were full of blurry text, people would be shopping elsewhere. ;)
 