Digital Foundry Article Technical Discussion Archive [2015]

Discussion in 'Console Technology' started by DSoup, Jan 2, 2015.

Thread Status:
Not open for further replies.
  1. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,759
    Likes Received:
    4,105
    Location:
    Wrong thread
    Without fixed-pixel displays, additional resolutions would be far more likely to be supported. There are actually lots of different "standard" digital resolutions on PC, and supporting them on TVs would be trivial.

    1600 x 900 is actually a standard PC res, and for a while 1600 x 900 panels were fairly commonplace. 1366 x 768 was another one. So was 1024 x 768, actually. A CRT with only the standard VGA and DVI resolutions supported as input would be hugely better than any LCD TV in terms of resolution support.

    The choice-limited 720p and 1080i/p were standards created without anyone giving even the merest hint of a shit about the demands of realtime graphics and native resolutions. :(
     
  2. HTupolev

    Regular

    Joined:
    Dec 8, 2012
    Messages:
    936
    Likes Received:
    564
    You said that the presence of scaling artifacts wouldn't be a difference that we'd be discussing. But if we had a very small number of supported signal resolutions (like we do today), developers would potentially still sometimes pick in-between resolutions (like 900p), which would have scaling artifacts since the image would have to be scaled to the signal.

    Maybe. The HD CRTs that were actually made don't inspire much confidence, though.
     
  3. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,997
    Likes Received:
    2,806
    I just thought it was odd that you addressed that to me, given that the post I was responding to was where the idea was actually put forward, but whatever. I did know this, as should be clear from my expanded response. The point I was trying to make is that *if* scaling never happened, then the only choice left for increasing performance would be to dramatically reduce resolution, and we'd be having a discussion about that instead.

    I agree. I expect TVs would still likely have supported only as many resolutions as they had to in order to be compatible with the primary media formats.
     
  4. joker454

    Veteran

    Joined:
    Dec 28, 2006
    Messages:
    3,819
    Likes Received:
    139
    Location:
    So. Cal.
    You can send a 1080 variant like 1440x1080, which many of the original HD video cameras recorded at, and let the TV scale it horizontally. I suspect that would be extremely difficult for most people to distinguish from 1920x1080, and it would free up a healthy amount of console power which could be used for other, more visible things.
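    A quick back-of-envelope sketch of the saving being described here, with a toy linear stretch standing in for the TV's horizontal scaler (the Python below is illustrative only; the function name and the choice of linear interpolation are my assumptions, not anything a real set documents):

    ```python
    # Pixel-count saving from rendering 1440x1080 and letting the display
    # stretch it horizontally to 1920x1080.
    native = 1920 * 1080        # 2,073,600 pixels per frame
    anamorphic = 1440 * 1080    # 1,555,200 pixels per frame
    print(f"pixels shaded per frame drop by {1 - anamorphic / native:.0%}")  # 25%

    def stretch_row(row, out_width):
        """Toy horizontal scaler: linearly interpolate one scanline."""
        in_width = len(row)
        out = []
        for x in range(out_width):
            src = x * (in_width - 1) / (out_width - 1)
            i, frac = int(src), src % 1
            j = min(i + 1, in_width - 1)
            out.append(row[i] * (1 - frac) + row[j] * frac)
        return out

    print(stretch_row([0, 100, 200], 4))  # [0.0, 66.7, 133.3, 200.0]
    ```

    Shading 25% fewer pixels is exactly the kind of headroom being talked about; whether the stretch is visible depends on the scaler's filter.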
     
  5. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,576
    Likes Received:
    16,034
    Location:
    Under my bridge
    I don't see anything preventing the RGB signal from being mathematically transformed into an analogue one and processed into an upscaled signal, exactly as worked on CRTs. But the result is blur! 320x256 worked because the pixels were frickin massive. We could certainly upscale chunky pixels nicely - see modern pixel-art games. An analogue 1600x900 image stretched to a 1920x1080 panel is going to look blurrier than native no matter how you upscale it.
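    A toy way to see why the 1600-to-1920 stretch can't stay sharp on a fixed grid (my sketch, not from the post): at a 1.2x ratio, nearest-neighbour mapping gives source pixels uneven widths, so a scaler must either accept that unevenness or interpolate - i.e. blur. Integer ratios, as with chunky pixel-art scaling, don't have the problem:

    ```python
    from collections import Counter

    def nn_coverage(src, dst):
        """Count how many output pixels each source pixel owns under a
        nearest-neighbour horizontal stretch from src to dst pixels wide."""
        owners = Counter(x * src // dst for x in range(dst))
        return Counter(owners.values())

    print(nn_coverage(320, 640))    # Counter({2: 320}) - every pixel exactly
                                    # doubled, so the image stays sharp
    print(nn_coverage(1600, 1920))  # Counter({1: 1280, 2: 320}) - uneven pixel
                                    # widths; the fix is interpolation, i.e. blur
    ```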
     
  6. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,759
    Likes Received:
    4,105
    Location:
    Wrong thread
    True, but they were frikkin old and almost all of them didn't even support native 720. I think 1440 x 1080i was about the level most of them topped out at (575 vertical lines displayable). There were a couple of full 1080 CRTs, IIRC, but they were huge and very expensive.

    PC-style resolutions should have been supported by all TVs, IMO. HD consoles, laptops, and screencasts weren't so important when HD was defined, though.

    Well, on a properly set up CRT, the blur is only towards the edge of the frikkin massive pixels. And on something like a Trinitron (especially a monitor) with its mask, the blurred area at its optimally supported resolutions was hidden.

    But that's the point we were discussing: fixed resolution displays have inherent weaknesses when you move away from the fixed resolution! A large CRT with a tightly focused beam and high density phosphors would be vastly superior at handling common PC-spec resolutions like 1600 x 900 (or 1440 x 1080, or 1366 x 768, or 1024 x 768, etc.)
     
    chris1515 likes this.
  7. London Geezer

    Legend Subscriber

    Joined:
    Apr 13, 2002
    Messages:
    23,824
    Likes Received:
    9,796
    Sorry, but I used to have what, at the time, was a 'large' CRT. 32" or more, can't remember; tiny by today's standards. The thing was absolutely humongous and took up half of my old lounge; it weighed a ton, couldn't be carried by one person, and sucked as much electricity as you'd need to run a small village.

    The world is a much, much better place without CRTs, whatever the benefits in image quality were - and those have by now been surpassed, and for many years, by good flat panels.
     
    Malo, DSoup, Billy Idol and 1 other person like this.
  8. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,759
    Likes Received:
    4,105
    Location:
    Wrong thread
    I didn't say I wanted to go back to having TVs that were too big to wrestle!

    Some kind of non-fixed-res display would be nice, though. Or failing that, one so high res it no longer matters, I guess ...
     
  9. London Geezer

    Legend Subscriber

    Joined:
    Apr 13, 2002
    Messages:
    23,824
    Likes Received:
    9,796
    You did!!! :yep2:

    I think good 4K sets will alleviate this 'problem'. It's not really a problem on current good 1080p panels, but 4K sets with a decent scaler will surely be great, which is why I think native 4K will not be the norm next gen: well-upscaled 1440p and 1080p with great IQ will be just fine for a little while.
     
  10. Globalisateur

    Globalisateur Globby
    Veteran Regular Subscriber

    Joined:
    Nov 6, 2013
    Messages:
    4,352
    Likes Received:
    3,218
    Location:
    France
    With what happened this gen regarding the whole resolutiongate since 2013 (it's still not over), I would tend to think that both companies (particularly Microsoft) will actively and specifically aim for decent 4K gaming with their next hardware, and they'll probably be very vocal about it (marketing and other PR).
     
  11. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,576
    Likes Received:
    16,034
    Location:
    Under my bridge
    You can't aim for a resolution in hardware. If you can produce x-quality graphics at 4K, and that's your target, you can produce 4x graphics at a quarter of that res. And if Joe Bloggs can't even see the difference in resolution, why not? Bestest graphics with imperceptible blur versus less nice pin-sharp graphics which look worse...
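    The arithmetic behind the quarter-res claim, for anyone who wants the numbers (standard pixel counts, nothing more):

    ```python
    uhd = 3840 * 2160  # 8,294,400 pixels
    fhd = 1920 * 1080  # 2,073,600 pixels
    print(uhd / fhd)   # 4.0 - at a fixed frame budget, rendering at 1080p
                       # leaves ~4x the GPU time available per pixel
    ```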
     
    shredenvain likes this.
  12. London Geezer

    Legend Subscriber

    Joined:
    Apr 13, 2002
    Messages:
    23,824
    Likes Received:
    9,796
    Nah, 4K as standard in 5 or 6 years will not happen. And I daresay it's not needed. Look at what The Order is doing at 1080p (black bars and all). We'll get a lot of uber-clean 1080p graphics, a lot at higher res or maybe supersampled, and some less demanding games at 4K. Maybe. My crystal ball showed me.
     
    shredenvain likes this.
  13. DSoup

    DSoup Series Soup
    Legend Veteran

    Joined:
    Nov 23, 2007
    Messages:
    15,474
    Likes Received:
    11,588
    Location:
    London, UK
    There are 4K sets with decent scalers (I have one), but you're not going to get one at bargain-basement prices. As time has permitted, I've been paying close attention to how the TV is handling upscales, and I can confirm it's not just a linear upscale: inspecting frozen images, there are no detectable blocks of pixels as you would expect. Indeed, it seems the 'X-Reality Pro engine' (XRP) is trying to provide a smooth upscale, because very few pixels appear to share the colours of their immediate neighbours - admittedly I'm not checking all 8 million pixels ;)

    Sony's explanation seems to confirm this is what it's doing. The XRP processor only appears on Sony's higher-end sets. I'm sure Samsung (and others) have equivalent technologies on their higher-end sets.
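    The check being done by eye above can be roughed out in code: a dumb nearest-neighbour 2x upscale leaves every pixel duplicated in a 2x2 block, so counting identical 2x2 blocks in a frozen frame separates 'linear' scaling from the smoother reconstruction described. A minimal sketch, assuming a NumPy image array (nothing here comes from Sony's documentation):

    ```python
    import numpy as np

    def duplicated_block_fraction(img):
        """Fraction of 2x2 blocks whose four pixels are identical. Near 1.0
        suggests a plain nearest-neighbour 2x upscale; a reconstructing
        scaler's output should score far lower."""
        h, w = img.shape[0] - img.shape[0] % 2, img.shape[1] - img.shape[1] % 2
        b = img[:h, :w].reshape(h // 2, 2, w // 2, 2, -1)
        b = b.transpose(0, 2, 1, 3, 4).reshape(h // 2, w // 2, 4, -1)
        return (b == b[:, :, :1]).all(axis=(2, 3)).mean()

    # usage: duplicated_block_fraction(np.asarray(frozen_frame_screenshot))
    ```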
     
    London Geezer likes this.
  14. London Geezer

    Legend Subscriber

    Joined:
    Apr 13, 2002
    Messages:
    23,824
    Likes Received:
    9,796
    I want one.

     
    Cyan likes this.
  15. TheWretched

    Regular

    Joined:
    Oct 7, 2008
    Messages:
    830
    Likes Received:
    23
    I would say so... I got one of those Samsung ones^^

    Just putting Samsung's top-end 4K TVs next to the low-end ones is... visibly different, even at just 1080p (which in theory should just be a pixel quadrupling).

    I've bought a new AVR as well... which has a 30Hz 4K upscaler. That one upscales 1080p MUCH sharper than my TV... but it looks strange. And since it's just 30Hz, it's useless anyway (I didn't buy it for the upscaler).
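    The 'pixel quadrupling' mentioned above is just an exact 2x nearest-neighbour blow-up: 1920x1080 divides evenly into 3840x2160, so every source pixel can become a 2x2 block with no filtering at all. A two-liner with NumPy (my sketch, and as the post notes, only what a set would do in theory):

    ```python
    import numpy as np

    def quadruple(img):
        """Exact 2x nearest-neighbour upscale: each pixel becomes a 2x2
        block, so a 1920x1080 frame fills 3840x2160 with no interpolation."""
        return img.repeat(2, axis=0).repeat(2, axis=1)

    frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
    print(quadruple(frame).shape)  # (2160, 3840, 3)
    ```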
     
    DSoup likes this.
  16. DSoup

    DSoup Series Soup
    Legend Veteran

    Joined:
    Nov 23, 2007
    Messages:
    15,474
    Likes Received:
    11,588
    Location:
    London, UK
    What these scalers and image processors are doing almost sounds like witchcraft. This is what Sony's page (linked in the previous post) states:

    The X-Reality PRO Engine analyzes each pixel of the video signal, comparing it to neighboring pixels across the picture height, picture width and also pixels at the same screen location from preceding and following video frames. The system analyzes the picture across three dimensions: Width and Height (including diagonal) and Time. Because it analyzes vertical, horizontal and even diagonal motion from frame to frame, it knows that a white blob in the foreground of a building is actually a seagull flying across.

    The X-Reality PRO Engine compares patterns in the image against patterns in the internal database to determine the best possible hue, saturation and brightness for each pixel. In this way, the system restores detail lost in compression and transmission. So fine texture in plants and clothes is optimized. The system also reduces noise and reproduces optimum contrast, color and sharpness.

    Impressive, even if only half true! But yeah, the difference between my old 1080p Sony and this 4K Sony is astonishing. You expect fundamentally better screen technology, i.e. better contrast and colour reproduction, because technology marches on, but beyond that the additional detail is genuinely perceptible. I'm not claiming I can pick out a pixel from my sofa and, hand on heart, swear it wasn't there on the earlier TV, but everything just looks better. The appropriate technical language fails me, I'm afraid. :yep2:
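    For the curious, the 'pattern database' idea in Sony's blurb reads like textbook example-based super-resolution: pair low-res patches with the high-res patches they came from, then upscale by looking up the closest match. The sketch below is a toy illustration of that general technique and emphatically NOT Sony's actual XRP pipeline (greyscale images, patch size, and every name in it are my assumptions):

    ```python
    import numpy as np

    def build_db(hi_images, scale=2, p=4):
        """Pair each low-res patch with the high-res patch it came from."""
        lo_db, hi_db = [], []
        for hi in hi_images:
            hi = hi[:hi.shape[0] - hi.shape[0] % scale,
                    :hi.shape[1] - hi.shape[1] % scale]
            lo = hi[::scale, ::scale]  # crude downsample
            for y in range(0, lo.shape[0] - p + 1, p):
                for x in range(0, lo.shape[1] - p + 1, p):
                    lo_db.append(lo[y:y+p, x:x+p].ravel())
                    hi_db.append(hi[y*scale:(y+p)*scale, x*scale:(x+p)*scale])
        return np.array(lo_db, dtype=float), hi_db

    def upscale(lo, lo_db, hi_db, scale=2, p=4):
        """Replace each low-res patch with the high-res partner of its
        nearest neighbour in the database."""
        out = np.zeros((lo.shape[0] * scale, lo.shape[1] * scale))
        for y in range(0, lo.shape[0] - p + 1, p):
            for x in range(0, lo.shape[1] - p + 1, p):
                q = lo[y:y+p, x:x+p].ravel().astype(float)
                best = np.argmin(((lo_db - q) ** 2).sum(axis=1))
                out[y*scale:(y+p)*scale, x*scale:(x+p)*scale] = hi_db[best]
        return out
    ```

    A real engine adds the temporal (frame-to-frame) analysis the blurb mentions and runs in dedicated silicon rather than a brute-force patch search.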
     
    Cyan likes this.
  17. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,997
    Likes Received:
    2,806
    The question, though, is how much latency all this processing adds. When gaming, this is something I would almost assuredly be turning off.
     
    Globalisateur likes this.
  18. forumaccount

    Newcomer

    Joined:
    Jan 30, 2009
    Messages:
    140
    Likes Received:
    86
    It's the exact same argument as framerate. People out there claim they can't perceive framerate differences, so why not make prettier graphics that run at 10fps instead of 60fps? Because we don't base our choices around people who will be ignorant of them regardless.

    Resolution is a guaranteed improvement to image quality, and the CU differential vs the Xbone makes 1080p a no-brainer on PS4 this generation. DF is tilting at windmills.
     
  19. joker454

    Veteran

    Joined:
    Dec 28, 2006
    Messages:
    3,819
    Likes Received:
    139
    Location:
    So. Cal.
    There is a baseline after which most people are satisfied and/or can no longer appreciate the difference. As with music: Super Audio CDs may sound better, but many people can't tell the difference between them and MP3-level quality. With TV, many can't tell the difference between a typical 1080p satellite signal and more highly compressed Netflix content. With framerate you can easily run tests and see that 10fps would be unacceptable to most. With resolution it's the same thing: run blind tests and see what happens. You find what works best for most people and/or what maximizes your visual bang for the buck given the nature of fixed hardware. Is it wiser to spend 25% of your computational resources on something only 5% of people will notice, or spend 25% on something 75% of people would notice?
     
    Cyan, iroboto and Shifty Geezer like this.
  20. Shortbread

    Shortbread Island Hopper
    Legend Veteran

    Joined:
    Jul 1, 2013
    Messages:
    5,399
    Likes Received:
    4,534
    Don't worry, all the current 4K sets are going to get pushed out (become cheaper) with the pending arrival of HDR (buzzword) 4K sets.
     
    Cyan likes this.