Best 4K HDR TVs for One X, PS4 Pro

Discussion in 'Console Industry' started by Rangers, Apr 29, 2017.

  1. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    12,509
    Likes Received:
    8,716
    Location:
    Cleveland
    If you are buying into 1080p, maybe, if you can find a hell of a bargain on a used plasma set. I highly doubt those exist, as prices on current everyday 1080p sets are insanely cheap, so it's almost not worth selling. I imagine most plasma sets get put to use in a second room or handed down to family/friends.
     
  2. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    40,734
    Likes Received:
    11,208
    Location:
    Under my bridge
    1080p won't look bad on a 4K set. In fact, at the same screen size it should look better than native 1080p, since each 1080p pixel maps cleanly onto a 2x2 block of 4K pixels. As others say, though, it's probably worth saving your money, because new tech like real HDR and decent-quality 4K will be available in a couple of years.
     
  3. RenegadeRocks

    Legend

    Joined:
    Oct 16, 2005
    Messages:
    10,102
    Likes Received:
    1,094
    I just played Andromeda at native 4K on my 1060 at High. Not 60 fps, but not unplayable either. The 1060 can do a lot, I would say.
     
    jayco likes this.
  4. jayco

    Regular

    Joined:
    Nov 18, 2006
    Messages:
    848
    Likes Received:
    81
    That is actually very good. I will try using DSR to see where the card can go. Thanks.

    The thing is, I may be able to get a TV (an HD Ready plasma) from my brother, as he is moving to another house and may not need it, but otherwise I have no TV. I was really thinking of spending quite some money on an LG OLED, but after reading about HDMI 2.1 I really don't want to make that kind of purchase now. Mmm... decisions. xD
     
    #324 jayco, Jul 2, 2017
    Last edited: Jul 2, 2017
  5. Nisaaru

    Regular

    Joined:
    Jan 19, 2013
    Messages:
    879
    Likes Received:
    198
    IMHO there's just no pressure to commit to a TV now. Even when the X1X is available, waiting for HDMI 2.1 products makes the most sense to me for long-term investments like a new AV receiver/TV; or buy a cheap 4K LCD if you really "believe" you "need" one. At least HDMI 2.1 finally defines a standard that should hold for at least another decade, whereas HDMI 1.4/2.0 were just short-term fixes.
     
    #325 Nisaaru, Jul 3, 2017
    Last edited: Jul 3, 2017
    BRiT likes this.
  6. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,322
    Likes Received:
    1,120

    Neither set is calibrated; they are both straight from the box. Also, the difference between HDR and SDR is similar on the same TV (the HDR one) without changing any setting but HDR (so obviously the same calibration settings, I guess).
     
  7. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,322
    Likes Received:
    1,120

    Rtings' 10% window brightness ratings should cover that, no? Vizio does fine. https://www.google.com/search?q=viz...0j69i57j0l4.2583j0j7&sourceid=chrome&ie=UTF-8

    The old 2016 P Series hit 487 cd/m^2 in a 10% window.

    http://www.rtings.com/tv/reviews/vizio/m-series-2017

    The 2017 Vizio M Series can hit nearly 800 cd/m^2 in a 10% window.

    Perhaps the author tested an old firmware. In many set reviews you see Rtings note that local dimming actually dims small highlights; maybe that's the same thing the article is talking about. But it's hardly an exclusively Vizio problem (if it's one at all?); it shows up across many brands (Samsung, LG) on their lower-end sets. I assume it's an algorithm issue rather than a hardware one, and could be "fixed" with a firmware update for sets that behave this way.
     
  8. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,578
    Likes Received:
    1,986
    Nope. This is a function of the number of local dimming zones. Algorithms could yield some improvement, but nothing significant.
     
    Silent_Buddha and BRiT like this.
  9. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,578
    Likes Received:
    1,986
    Just ordered the TCL 55P605 from Best Buy and it's scheduled to be available at my local store on Saturday. I'm going to move the Hisense 50H8C I'm currently using as a main TV to another room to replace a 42" Panasonic Viera Plasma.

    I could have continued to live with the Hisense, but that TV didn't support WCG and I wanted to have that part of the HDR experience.

    Come HDMI 2.1 (whenever there's a TV model that checks all of my boxes), I'll upgrade from the TCL TV in the main setup and also upgrade the Denon receiver to an HDMI 2.1 model, and both the TCL and my existing Denon will move to the second setup.
     
    BRiT likes this.
  10. brunogm

    Newcomer

    Joined:
    Apr 26, 2013
    Messages:
    138
    Likes Received:
    22
    This legacy stuff is the new "interlaced" versus "progressive":
    Rec.2020 is really limited; it doesn't even allow full range.

    BT.2100 allows full range, but again the HDR curve salad means HDR10, HLG, and Dolby's PQ each make different choices here.

    Narrow range was used to calibrate out noise in analog camera pipelines, with "blacker than black" and super-white test signals.

    Of course some kind of test signal is needed for calibration!
    My issue is why throw away so many codewords on this rather than simply keeping the 32 codes from 8-bit.
    (It quantizes linear sensor noise, of course, but that's not usable range.)

    Then PQ (ST 2084) would have had JND performance near 12-bit narrow range at only a 10-bit "cost". In a couple of years Dolby will "launch" 12-bit PQ to solve these very issues they've chosen to maintain...
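    For a sense of scale, here's a quick sketch (my own arithmetic, not from any spec text) of how many codewords the narrow "video" range reserves, assuming the usual limits of 16-235 for 8-bit luma scaled up at higher depths:

    ```python
    # Sketch: narrow ("video") range vs full range codeword budgets.
    # Assumes the usual luma limits: 16-235 at 8-bit, scaled by bit depth.

    def narrow_range_stats(bits):
        total = 2 ** bits                  # all codewords at this depth
        lo = 16 << (bits - 8)              # black level (footroom below)
        hi = 235 << (bits - 8)             # nominal white (headroom above)
        usable = hi - lo + 1
        return total, usable, total - usable

    for bits in (8, 10, 12):
        total, usable, reserved = narrow_range_stats(bits)
        print(f"{bits}-bit: {usable}/{total} usable, "
              f"{reserved} ({100 * reserved / total:.0f}%) reserved")

    # 8-bit:  220/256   usable,  36 (14%) reserved
    # 10-bit: 877/1024  usable, 147 (14%) reserved
    # 12-bit: 3505/4096 usable, 591 (14%) reserved
    ```

    So at 10-bit, narrow range spends 147 codes on foot/headroom where 8-bit spent 36, which is the "why not just keep the ~32 codes" complaint above.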

    Sent from my Moto G using Tapatalk
     
    Scott_Arm and BRiT like this.
  11. brunogm

    Newcomer

    Joined:
    Apr 26, 2013
    Messages:
    138
    Likes Received:
    22
    Dolby Vision uses PQ (ST 2084); the max is 10k nits, but the codes are very coarse up there, so it mostly helps highlights.

    HLG is 5000 nits max but has a nicer curve in the shadows.
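    To see the "coarse up top" point, here's a small sketch using the published ST 2084 constants; it shows where 10-bit PQ codes land on the nit scale (codes shown full-range 0-1023 for simplicity, though real video signals use narrow range):

    ```python
    # Sketch: distribution of 10-bit PQ codes over 0-10,000 nits,
    # using the ST 2084 inverse EOTF constants.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_code(nits):
        y = (nits / 10000.0) ** m1
        return round(1023 * ((c1 + c2 * y) / (1 + c3 * y)) ** m2)

    for nits in (0.1, 1, 10, 100, 1000, 4000, 10000):
        print(f"{nits:>7} nits -> code {pq_code(nits)}")

    # ~520 codes cover 0-100 nits and ~769 cover 0-1000 nits, so the
    # last 9000 nits share only ~250 codes: big steps per code up top.
    ```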


    Sent from my Moto G using Tapatalk
     
    Scott_Arm likes this.
  12. brunogm

    Newcomer

    Joined:
    Apr 26, 2013
    Messages:
    138
    Likes Received:
    22
    Maybe devs and gamers are trying to find the best hardware to calibrate for...

    There's a bit of history about how BT.1886 came to be standardized... IIRC:

    A famous director/color-grading person had a professional CRT, calibrated for the best filmic look, that had been used for many years in the industry across a bunch of films back in analog production: a de facto standard for color grading.

    Then an issue arose at the premiere of a film, which used a new digital projector assuming the sRGB 2.2 gamma. After many days of discussion and trying to please the creator at the venue, the solution was finally to measure the director's CRT.

    It turned out the device was actually at about gamma 2.37, which made for an easy fix at the venue, and after more measurements BT.1886 is now the "standard".

    So in a way sRGB needs revision too...

    Maybe some studio/dev with a color-grading chamber could test whether enabling full-range mode on the TV/console, combined with the now-common in-game brightness-pattern calibration, resolves more detail in shadows/highlights.
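    If anyone wants a feel for why that gamma mismatch mattered, here's a toy comparison (my own numbers, plain power laws only; BT.1886 adds black-level terms on top of this):

    ```python
    # Toy comparison: relative luminance of dark signal levels under
    # gamma 2.2 (what the projector assumed) vs ~2.37 (the measured CRT).
    for signal in (0.05, 0.10, 0.20, 0.50):
        g22, g237 = signal ** 2.2, signal ** 2.37
        print(f"signal {signal:.2f}: 2.2 -> {g22:.4f}, "
              f"2.37 -> {g237:.4f} ({g22 / g237:.2f}x brighter)")

    # At 5% signal the 2.2 display is ~1.7x brighter, so shadows graded
    # on the 2.37 CRT would look lifted/washed out on the projector.
    ```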

    Sent from my Moto G using Tapatalk
     
    Scott_Arm and BRiT like this.
  13. Scott_Arm

    Legend

    Joined:
    Jun 16, 2004
    Messages:
    13,277
    Likes Received:
    3,726

    I read something like movies being mastered at 4k nits for Dolby Vision and 1k nits for HDR10, with both formats allowing for screens up to 10k nits. I'm not sure what that means in practice. Future room for growth as TV capabilities improve?
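    In practice, I'd guess the TV has to tone-map anything mastered above its own peak. A minimal sketch of the kind of rolloff involved (the knee and curve here are invented for illustration, not any vendor's actual algorithm):

    ```python
    # Sketch: content mastered to 4000 nits shown on a 1000-nit panel.
    # Pass through below a knee, then compress the rest into the headroom.
    def tone_map(nits, display_peak=1000.0, knee_frac=0.75):
        knee = knee_frac * display_peak
        if nits <= knee:
            return nits                     # dark/midtones untouched
        excess = nits - knee
        headroom = display_peak - knee
        return knee + headroom * excess / (excess + headroom)

    for nits in (100, 500, 750, 1000, 2000, 4000):
        print(f"{nits:>5} mastered -> {tone_map(nits):6.1f} displayed")
    ```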
     
  14. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,322
    Likes Received:
    1,120
    I mean, I'm no doctor, but seeing as my 350-nit screen looks very bright to me (brighter than my old 250-nit SDR set, of course), 1k must be crazy, and 4k must literally threaten vision, I would think, heh. 10k? Are we going to be blinding people when the sun comes onscreen?

    Since few HDR TVs currently seem to even hit 1k, it seems we'd have a very long way to go. Not sure it would be worth the presumably extreme expense of pumping out laser-esque light beams from small clusters of LEDs, either.
     
    #334 Rangers, Jul 4, 2017
    Last edited: Jul 4, 2017
    RenegadeRocks likes this.
  15. Rangers

    Legend

    Joined:
    Aug 4, 2006
    Messages:
    12,322
    Likes Received:
    1,120

    Some very simple math says a 55" TV with 32 local dimming zones has a zone for roughly every 40 sq inches of screen, so about a 6.4" x 6.4" patch. That seems like a pretty small area compared to the van-in-a-tunnel example in the original article. The 72 zones on the TCL would be much better still.
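    The arithmetic, for anyone who wants to plug in other sets (assumes a 16:9 panel and square-ish zones, which real sets don't necessarily use):

    ```python
    import math

    # Per-zone patch size for a 16:9 panel of a given diagonal.
    def zone_patch(diagonal_in, zones):
        w = diagonal_in * 16 / math.hypot(16, 9)  # screen width, inches
        h = diagonal_in * 9 / math.hypot(16, 9)   # screen height, inches
        area = w * h / zones                      # sq inches per zone
        return area, math.sqrt(area)              # side of square patch

    for zones in (32, 72):
        area, side = zone_patch(55, zones)
        print(f'55" / {zones} zones: {area:.0f} sq in '
              f'(~{side:.1f}" square patch)')

    # 32 zones -> ~40 sq in (~6.4" patch); 72 -> ~18 sq in (~4.2" patch)
    ```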
     
  16. London-boy

    London-boy Shifty's daddy
    Legend Subscriber

    Joined:
    Apr 13, 2002
    Messages:
    21,511
    Likes Received:
    5,156
    It's really not that bad. 1800 nits can be quite dazzling, but in no way what I'd call damaging. In fact, I'm pretty sure perceived brightness doesn't increase in a straight line with luminance; it's logarithmic or thereabouts, I think. So 4000 nits does not look 10 times brighter than 400.
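    A rough back-of-envelope supporting this, using Stevens' power law with the often-quoted ~1/3 exponent for brightness (the exact exponent varies with viewing conditions, so treat it as illustrative only):

    ```python
    # Back-of-envelope: perceived brightness via Stevens' power law.
    def perceived(nits, exponent=1 / 3):   # ~1/3 is the textbook value
        return nits ** exponent

    ratio = perceived(4000) / perceived(400)
    print(f"4000 vs 400 nits: ~{ratio:.1f}x perceived, not 10x")  # ~2.2x
    ```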
     
    #336 London-boy, Jul 4, 2017
    Last edited: Jul 4, 2017
    RenegadeRocks, Scott_Arm and BRiT like this.
  17. brunogm

    Newcomer

    Joined:
    Apr 26, 2013
    Messages:
    138
    Likes Received:
    22
    Theoretically, but other aspects are not future-proof, like the change from 10-bit to 12-bit.
    The tradeoffs are not clear for the early adopter.
    Most likely a UHD Premium "2" will change everything, given the short window of firmware support TVs get.

    Effective long-term support is slim. As an example, x.v.Color and the 10/12/16-bit HDMI formats are badly supported on the desktop. Even with pro cards there isn't a battle-tested environment of tools that lets you get the most out of today's displays.

    A possible solution would be to enable the HDMI 1.3 features and extend Dolby Vision/HDR10 from linear 16/12-bit with a LUT, to provide compatibility for 1080p and early 4K displays.


    Sent from my Moto G using Tapatalk
     
  18. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,578
    Likes Received:
    1,986
    But within each dimming zone there only needs to be one high-contrast feature to break the TV's ability to represent it properly. You either represent the highlight properly by cranking up the backlight in that zone, which raises the brightness of any darker features there, or you darken the backlight in that zone to show the darker features properly, which dims the highlight, or you do some of both.
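    That tradeoff is easy to see in a toy model: one backlight level per zone, and the LCD cells can only attenuate it (the numbers below are invented, assuming an idealized 5000:1 native panel contrast):

    ```python
    # Toy model of one dimming zone. The backlight picks a single level,
    # and each pixel's LCD cell can only attenuate it down to 1/5000
    # (an assumed native contrast ratio). Values in nits.
    def render_zone(targets, backlight, native_contrast=5000.0):
        floor = backlight / native_contrast   # deepest black possible
        return [min(max(t, floor), backlight) for t in targets]

    targets = [0.05, 0.05, 800.0]             # dark pixels + a highlight

    print(render_zone(targets, backlight=800.0))  # blacks lift to 0.16
    print(render_zone(targets, backlight=100.0))  # highlight caps at 100
    ```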
     
    Silent_Buddha and BRiT like this.
  19. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,578
    Likes Received:
    1,986
    Your eyes adjust, just as they do to natural light. Bias lighting behind the screen helps.
     
  20. RenegadeRocks

    Legend

    Joined:
    Oct 16, 2005
    Messages:
    10,102
    Likes Received:
    1,094
    Agree there... I was thinking the same. This KU7000 tires me much faster than my previous monitor, and my wife keeps telling me to turn down the brightness because her eyes start hurting quickly. I wonder how people with 1k-nit sets play!?

    Sent from my SM-N920G using Tapatalk
     