Digital Foundry Article Technical Discussion Archive [2015]

I don't understand how that relates to what I said. To clarify and expand on my comment: while CRTs don't have a fixed resolution, and therefore don't always require scaling of alternate resolutions the way fixed pixel displays do, to avoid scaling you would still have to render and output at a supported resolution. Given that the next standard resolution below 1920x1080 is 1280x720, you would be making a big reduction in overall resolution in order to avoid scaling. So not having fixed pixel displays wouldn't really make this less of an issue.
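
For anyone who wants the raw numbers behind that 'big reduction', here's a quick back-of-envelope calc in Python; the resolutions are just the ones discussed above.

```python
# Rough numbers behind the 'big reduction': dropping from 1080p to 720p just to
# avoid scaling gives up more than half of the rendered pixels.
p1080 = 1920 * 1080   # 2,073,600 pixels
p720 = 1280 * 720     #   921,600 pixels
print(p720 / p1080)   # ~0.44, i.e. a ~56% cut in pixel count
```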

Without fixed pixel displays, the likelihood of additional resolutions being supported would be greater. There are actually lots of different "standard" digital resolutions on PC, and supporting them on TVs would be trivial.

1600 x 900 is actually a standard PC res, and for a while 1600 x 900 panels were fairly commonplace. 1366 x 768 was another one. So was 1024 x 768, actually. A CRT with only the standard VGA and DVI resolutions supported as input would be hugely better than any LCD TV in terms of resolution support.

The choice-limited 720p and 1080i/p were standards created without anyone giving even the merest hint of a shit about the demands of realtime graphics and native resolutions. :(
 
I don't understand how that relates to what I said.
You said that the presence of scaling artifacts wouldn't be a difference that we'd be discussing. But if we had a very small number of supported signal resolutions (like we do today), developers would potentially still sometimes pick in-between resolutions (like 900p), which would have scaling artifacts since the image would have to be scaled to the signal.

Without fixed pixel displays, the likelihood of additional resolutions being supported would be greater.
Maybe. The HD CRTs that were actually made don't inspire much confidence, though.
 
You said that the presence of scaling artifacts wouldn't be a difference that we'd be discussing. But if we had a very small number of supported signal resolutions (like we do today), developers would potentially still sometimes pick in-between resolutions (like 900p), which would have scaling artifacts since the image would have to be scaled to the signal.

I just thought it was odd that you addressed that to me, given that the post I was responding to was where the idea was actually put forward, but whatever. I did know this, as should be clear from my expanded response. The point I was trying to make is that *if* scaling never happened, then your only remaining option for increasing performance would be to dramatically reduce resolution, and we'd be having a discussion about that instead.

Maybe. The HD CRTs that were actually made don't inspire much confidence, though.

I agree. I expect TVs would still likely have supported only as many resolutions as they had to in order to be compatible with the primary media formats.
 
You have to send them a resolution they support, though. Are you advocating sending out 720p and scaling that?

You can send a 1080 variant like 1440x1080, which many of the original HD video cameras recorded at, and let the set scale it horizontally. I suspect that would be extremely difficult for most people to distinguish from 1920x1080, and it would free up a healthy amount of console power that could be used for other, more visible things.
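
A quick sanity check on that 1440x1080 suggestion (plain Python, nothing hardware-specific):

```python
# 1440x1080 keeps the full vertical resolution but shades 25% fewer pixels than
# 1920x1080; the display/scanout then stretches it ~1.33x horizontally.
full = 1920 * 1080         # 2,073,600 pixels
anamorphic = 1440 * 1080   # 1,555,200 pixels
print(anamorphic / full)   # 0.75 -> a quarter of the pixel cost saved
print(1920 / 1440)         # ~1.33x horizontal stretch on output
```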
 
Yes, that's what I was getting at. There was no scaling as such; the scanout simply converted the colour across that element of the image to the appropriate period of the signal output. Horizontal scaling - or rather stretching - was automatic.

So the Megadrive's 256 and 320 pixel horizontal modes both appeared perfect. No scaling artefacts. Lovely!
I don't see anything preventing the RGB signal being mathematically transformed into an analogue one and processed into an upscaled signal exactly as it worked on a CRT. But the result is blur! 320x256 worked because the pixels were frickin massive. We could certainly upscale chunky pixels nicely - see modern pixel-art games. An analogue 1600x900 image stretched to a 1920x1080 panel is going to look blurrier than native no matter how you upscale it.
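
If anyone wants to eyeball the difference being described, here's a minimal sketch using Pillow (my choice of library; any image library would do, and the file names are placeholders):

```python
# 1600x900 -> 1920x1080 is a 1.2x stretch in both axes, i.e. a non-integer ratio:
# nearest-neighbour duplicates pixels unevenly, while bilinear blends neighbours
# and softens the image. Requires Pillow 9.1+ for the Resampling enum.
from PIL import Image

src = Image.open("frame_1600x900.png")  # placeholder input image
src.resize((1920, 1080), Image.Resampling.NEAREST).save("stretched_nearest.png")
src.resize((1920, 1080), Image.Resampling.BILINEAR).save("stretched_bilinear.png")

print(1920 / 1600, 1080 / 900)  # 1.2 and 1.2 -- not integers, hence artefacts or blur
```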
 
Maybe. The HD CRTs that were actually made don't inspire much confidence, though.

True, but they were frikkin old and almost all of them didn't even support native 720. I think 1440 x 1080i was about the level most of them topped out at (575 vertical lines displayable). There were a couple of full 1080 CRTs, iirc, but they were huge and very expensive.

PC-style resolutions should have been supported by all TVs IMO. HD consoles, laptops, and screencasts weren't so important when HD was defined, though.

I don't see anything preventing the RGB signal being mathematically transformed into an analogue one and processed into an upscaled signal exactly as it worked on a CRT. But the result is blur

Well, on a properly set up CRT, the blur is only towards the edge of the frikkin massive pixels. And on something like a Trinitron (especially a monitor) with its mask, the blurred area at its optimally supported resolutions was hidden.

320x256 worked because the pixels were frickin massive. We could certainly upscale chunky pixels nicely - see modern pixel-art games. An analogue 1600x900 image stretched to a 1920x1080 panel is going to look blurrier than native no matter how you upscale it.

But that's the point we were discussing: fixed resolution displays have inherent weaknesses when you move away from the fixed resolution! A large CRT with a tightly focused beam and high density phosphors would be vastly superior at handling common, PC-spec resolutions like 1600 x 900 (or 1440 x 1080, or 1366 x 768, or 1024 x 768, etc.).
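
Putting numbers on that: none of those resolutions map cleanly onto 1920x1080 in both axes, which is exactly where a fixed pixel grid struggles (a trivial Python check):

```python
# Scale factors from common PC resolutions up to a 1920x1080 panel; none are clean
# integer ratios in both axes.
for w, h in [(1600, 900), (1440, 1080), (1366, 768), (1024, 768)]:
    print(f"{w}x{h} -> 1920x1080: {1920 / w:.3f}x horizontal, {1080 / h:.3f}x vertical")
# 1600x900: 1.200/1.200, 1440x1080: 1.333/1.000, 1366x768: 1.406/1.406, 1024x768: 1.875/1.406
```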
 
Sorry, but I used to have what, at the time, was a 'large' CRT. 32" or more, I can't remember; tiny by today's standards. The thing was absolutely humongous and took up half of my old lounge, it weighed a ton and could not be carried by one person, and it sucked as much electricity as you would need to run a small village.

The world is a much, much better place without CRTs, whatever the benefits in image quality were - benefits which have by now been surpassed, and for many years, by good flat panels.
 
I didn't say I wanted to go back to having TVs that were too big to wrestle!

Some kind of non-fixed-res display would be nice though. Or failing that, one so high res it no longer matters, I guess ...
 
I didn't say I wanted to go back to having TVs that were too big to wrestle!

Some kind of non-fixed-res display would be nice though. Or failing that, one so high res it no longer matters, I guess ...
You did!!! :yep2:

I think good 4K sets will alleviate this 'problem'. It's not really a problem on current good 1080p panels, but surely 4k sets with a decent scaler will be great, which is why I think native 4K in next gen will not be the norm; well-upscaled 1440p and 1080p with great IQ will be just fine for a little while.
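
For what it's worth, the scale factors back that up (quick Python, just arithmetic):

```python
# 1080p maps onto a UHD panel by an exact 2x per axis (clean pixel quadrupling),
# whereas 1440p needs a 1.5x stretch and so relies much more on the TV's scaler.
for w, h in [(1920, 1080), (2560, 1440)]:
    print(f"{w}x{h} -> 3840x2160: {3840 / w}x, {2160 / h}x")
# 1920x1080: 2.0x / 2.0x    2560x1440: 1.5x / 1.5x
```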
 
You did!!! :yep2:

I think good 4K sets will alleviate this 'problem'. It's not really a problem on current good 1080p panels, but surely 4k sets with a decent scaler will be great, which is why I think native 4K in next gen will not be the norm; well-upscaled 1440p and 1080p with great IQ will be just fine for a little while.
With what happened this gen regarding the whole resolutiongate since 2013 (it's still not over), I would tend to think that both companies (particularly Microsoft) will actively and specifically aim for decent 4K gaming with their next hardware, and they'll probably be very vocal about it (marketing and other PR).
 
You can't aim for a resolution in hardware. If you can produce x quality graphics at 4K, and that's your target, you can produce 4x graphics at a quarter of that res. And if Joe Bloggs can't even see the difference in resolution, why not? Bestest graphics with imperceptible blur versus less nice pin-sharp graphics which look worse...
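
The arithmetic behind the '4x graphics at a quarter of that res' claim is simply the pixel counts:

```python
# UHD ('4K') has exactly four times the pixels of 1080p, so rendering at 1080p
# instead of 4K leaves roughly 4x the per-pixel budget (ignoring non-resolution costs).
print((3840 * 2160) / (1920 * 1080))  # 4.0
```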
 
With what happened this gen regarding the whole resolutiongate since 2013 (it's still not over), I would tend to think that both companies (particularly Microsoft) will actively and specifically aim for decent 4K gaming with their next hardware, and they'll probably be very vocal about it (marketing and other PR).
Nah, 4k as standard in 5 or 6 years will not happen. And I daresay it is not needed. Look at what The Order is doing at 1080p (black bars and all). We'll get a lot of uber clean 1080p graphics, a lot of higher res or maybe supersampled, and some less demanding games at 4k. Maybe. My crystal ball showed me.
 
I think good 4K sets will alleviate this 'problem'. It's not really a problem on current good 1080p panels, but surely 4k sets with a decent scaler will be great
There are 4K sets with decent scalers (I have one), but you're not going to get one at bargain basement prices. As time has permitted, I've been paying close attention to how the TV handles upscaling, and I can confirm it's not just a basic linear upscale: on frozen images there are no detectable blocks of pixels as you would expect, and indeed the 'X-Reality PRO engine' (XRP) seems to be going for a smooth upscale, because very few pixels appear to share the colours of their immediate neighbours - admittedly I'm not checking all 8 million pixels ;)

Sony's explanation seems to confirm this is what it's doing. The XRP processor only appears on Sony's higher-end sets. I'm sure Samsung (and others) have equivalent technologies on their higher-end sets.
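
For anyone wanting to repeat that frozen-frame check on a captured screenshot, here's a rough sketch of the idea (assuming numpy and Pillow; the file name is a placeholder, and this obviously isn't Sony's code):

```python
# If a 1080p source were upscaled to 4K by plain pixel doubling, the frame would be
# made of uniform 2x2 blocks. Counting how many 2x2 blocks are perfectly uniform is
# a crude way to tell that kind of upscale apart from a smoother one.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("frozen_4k_frame.png").convert("RGB"))
h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2   # trim to even dimensions
blocks = img[:h, :w].reshape(h // 2, 2, w // 2, 2, 3)

# A block is 'uniform' when all four of its pixels share the top-left pixel's colour.
uniform = (blocks == blocks[:, :1, :, :1, :]).all(axis=(1, 3, 4))
print(f"uniform 2x2 blocks: {uniform.mean():.1%}")    # near 100% would mean simple doubling
```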
 
I want one.

help-me-im-poor.jpg
 
Sony's explanation seems to confirm this is what it's doing. The XRP processor only appears on Sony's higher-end sets. I'm sure Samsung (and others) have equivalent technologies on their higher-end sets.

I would say so... I got one of those Samsung ones^^

Just putting Samsung's top-end 4K TVs next to the low-end ones shows a visible difference, even at just 1080p (which in theory should just be a pixel quadrupling).

I've bought a new AVR as well... which has a 30Hz 4K upscaler. That one upscales MUCH sharper from 1080p than my TV does... which seems strange. But since it's limited to 30Hz, it's useless anyway (I didn't buy it for the upscaler).
 
I would say so... I got one of those Samsung ones^^

Just putting Samsung's top-end 4K TVs next to the low-end ones shows a visible difference, even at just 1080p (which in theory should just be a pixel quadrupling).

What these scalers and image processors are doing almost sounds like witchcraft. This is what Sony's page (linked above in the previous post) states:

The X-Reality PRO Engine analyzes each pixel of the video signal, comparing it to neighboring pixels across the picture height, picture width and also pixels at the same screen location from preceding and following video frames. The system analyzes the picture across three dimensions: Width and Height (including diagonal) and Time. Because it analyzes vertical, horizontal and even diagonal motion from frame to frame, it knows that a white blob in the foreground of a building is actually a seagull flying across.

The X-Reality PRO Engine compares patterns in the image against patterns in the internal database to determine the best possible hue, saturation and brightness for each pixel. In this way, the system restores detail lost in compression and transmission. So fine texture in plants and clothes is optimized. The system also reduces noise and reproduces optimum contrast, color and sharpness.

Impressive, even if only half true! But yeah, the difference between my old 1080p Sony and this 4K Sony is astonishing. You expect fundamentally better screen technology, i.e. better contrast and colour reproduction, because technology marches on, but it's just that the additional detail is perceptible. I'm not claiming I can pick out a pixel from my sofa and, hand on heart, swear it wasn't there on the earlier TV, but everything just looks better. The appropriate technical language fails me, I'm afraid. :yep2:
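
To be clear about what that marketing copy is gesturing at: the underlying idea (look at spatial neighbours plus the same pixel in adjacent frames, and treat static and moving areas differently) is standard motion-adaptive filtering. A toy sketch of that general idea, which is absolutely not X-Reality PRO itself, might look like this (numpy, greyscale frames assumed):

```python
import numpy as np

def spatio_temporal_filter(prev, cur, nxt, motion_thresh=12.0):
    """Blend over time where nothing moves; fall back to spatial smoothing where it does."""
    prev = prev.astype(np.float32)
    cur = cur.astype(np.float32)
    nxt = nxt.astype(np.float32)

    # Per-pixel motion estimate: how much this pixel changes versus adjacent frames.
    temporal_diff = np.maximum(np.abs(cur - prev), np.abs(cur - nxt))
    static = temporal_diff < motion_thresh

    # Static areas: average across frames (cheap, detail-preserving noise reduction).
    temporal_avg = (prev + cur + nxt) / 3.0

    # Moving areas (the 'seagull'): trust the current frame, with a 3x3 box blur as a
    # stand-in for whatever spatial analysis a real scaler would actually do.
    padded = np.pad(cur, 1, mode="edge")
    spatial = sum(padded[dy:dy + cur.shape[0], dx:dx + cur.shape[1]]
                  for dy in range(3) for dx in range(3)) / 9.0

    return np.where(static, temporal_avg, spatial).astype(np.uint8)
```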
 
What these scalers and image processors are doing almost sounds like witchcraft. This is what Sony's page (linked above in the previous post) states:

The X-Reality PRO Engine analyzes each pixel of the video signal, comparing it to neighboring pixels across the picture height, picture width and also pixels at the same screen location from preceding and following video frames. The system analyzes the picture across three dimensions: Width and Height (including diagonal) and Time. Because it analyzes vertical, horizontal and even diagonal motion from frame to frame, it knows that a white blob in the foreground of a building is actually a seagull flying across.

The X-Reality PRO Engine compares patterns in the image against patterns in the internal database to determine the best possible hue, saturation and brightness for each pixel. In this way, the system restores detail lost in compression and transmission. So fine texture in plants and clothes is optimized. The system also reduces noise and reproduces optimum contrast, color and sharpness.

Impressive, even if only half true! But yeah, the difference between my old 1080p Sony and this 4K Sony is astonishing. You expect fundamentally better screen technology, i.e. better contrast and colour reproduction, because technology marches on, but it's just that the additional detail is perceptible. I'm not claiming I can pick out a pixel from my sofa and, hand on heart, swear it wasn't there on the earlier TV, but everything just looks better. The appropriate technical language fails me, I'm afraid. :yep2:

The question, though, is how much latency all this processing adds. When gaming, this is something I would almost assuredly be turning off.
 
You can't aim for a resolution in hardware. If you can produce x quality graphics at 4K, and that's your target, you can produce 4x graphics at a quarter of that res. And if Joe Bloggs can't even see the difference in resolution, why not? Bestest graphics with imperceptible blur versus less nice pin-sharp graphics which look worse...

It's the exact same argument as framerate. People out there claim they can't perceive framerate differences, so why not make prettier graphics that run at 10fps instead of 60fps? Because we don't base our choices around people who will be ignorant of them regardless.

Resolution is a guaranteed improvement to image quality and the CU differential vs xbone makes 1080 a no brainer on PS4 this generation. DF is fighting windmills.
 
It's the exact same argument as framerate. People out there claim they can't perceive framerate differences, so why not make prettier graphics that run at 10fps instead of 60fps? Because we don't base our choices around people who will be ignorant of them regardless.

There is a baseline after which most people are satisfied and/or can no longer appreciate the difference. Like with music: Super Audio CDs may sound better, but many people can't tell the difference between them and MP3-level quality. With TV, many can't tell the difference between a typical 1080p satellite signal and more highly compressed Netflix content. With framerate you can easily run tests and see that 10fps would be unacceptable to most. With resolution it's the same thing: run blind tests and see what happens. You find what works best for most people and/or what maximizes your visual bang for the buck given the nature of fixed hardware. Is it wiser to spend 25% of your computational resources on something only 5% of people will notice, or spend 25% on something 75% of people would notice?
 