1080p HDR image better than 4k non HDR ?

Discussion in 'Console Technology' started by ultragpu, Sep 10, 2016.

  2. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,947
    Likes Received:
    2,690
    This is correct. Also, the difference between 4:4:4 and 4:2:2 at 4K resolution, at typical TV screen sizes and typical TV viewing distances, is insignificant. It matters much more for monitors, but they will be able to use DisplayPort instead, which, as of 1.4, will be able to do 4K60 4:4:4 at 10-bit with HDR.
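
    A rough back-of-the-envelope sketch of why that is (illustrative Python; it assumes the standard 4400x2250 CTA timing for 4K60 and nominal HDMI 2.0 / DP 1.4 link rates, none of which are spelled out above):

    ```python
    # Rough bandwidth estimate for 3840x2160 @ 60 Hz at full 4:4:4 chroma (illustrative only).
    # Standard CTA-861 timing for 4K60: 4400 x 2250 total pixels -> ~594 MHz pixel clock.
    pixel_clock_hz = 4400 * 2250 * 60

    def data_rate_gbps(bits_per_component):
        # 3 colour components per pixel at 4:4:4
        return pixel_clock_hz * 3 * bits_per_component / 1e9

    hdmi20_payload = 18.0 * 8 / 10       # 18 Gbps raw, 8b/10b coding -> ~14.4 Gbps usable
    dp14_hbr3_payload = 32.4 * 8 / 10    # 32.4 Gbps raw over 4 lanes -> ~25.9 Gbps usable

    print(data_rate_gbps(8))    # ~14.3 Gbps -> just fits HDMI 2.0, hence 8-bit 4:4:4 only
    print(data_rate_gbps(10))   # ~17.8 Gbps -> too much for HDMI 2.0, fine for DP 1.4
    ```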
     
  3. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend Veteran

    Joined:
    Oct 14, 2008
    Messages:
    8,421
    Likes Received:
    1,824
    I agree with that article. HDR looks much better, but with a caveat: you need a good panel.

    With a bad panel, the TV will still be able to receive an HDR signal, but it won't be able to display it with great HDR quality.
     
  4. novcze

    Regular

    Joined:
    Jan 5, 2007
    Messages:
    596
    Likes Received:
    119
    Ehh, better LCD panels have a contrast ratio of about 3000:1, right... and HDR TVs with LCD panels are supposed to do 20,000:1 to get Ultra HD Premium certification. IMO that's not possible without local dimming, or should I say local brightening?
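
    A quick illustrative sketch of that arithmetic (made-up but typical numbers, not from any specific set):

    ```python
    # Illustrative only: why a ~3000:1 native panel needs local dimming to reach 20,000:1-class figures.
    peak_white_nits = 1000.0
    native_contrast = 3000.0

    black_full_backlight = peak_white_nits / native_contrast      # ~0.33 nits
    print(peak_white_nits / black_full_backlight)                 # 3000:1 with a static backlight

    # With FALD, the backlight in a dark zone can drop to a few percent
    # while a bright zone stays at full output.
    dimmed_zone_fraction = 0.05                                    # assumed 5% backlight in dark zones
    black_dimmed = black_full_backlight * dimmed_zone_fraction     # ~0.017 nits
    print(peak_white_nits / black_dimmed)                          # ~60,000:1 between zones
    ```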
     
  5. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend Veteran

    Joined:
    Oct 14, 2008
    Messages:
    8,421
    Likes Received:
    1,824
    Anybody can slap an HDR feature onto a TV with only a 1000:1 contrast ratio and a color gamut that is no better than an SDR TV's.

    For an HDR label that actually means quality, you need to look for the DOLBY VISION logo/feature.

    Or, as you said, look for the UHD Premium label instead of the HDR label.
     
  6. ultragpu

    Legend Veteran

    Joined:
    Apr 21, 2004
    Messages:
    6,242
    Likes Received:
    2,298
    Location:
    Australia
    Ha, the 930D is hardly a high-end model these days, specifically because of its edge-lit nature. Full-array local dimming sets such as Sony's 940D and Z9D are still vastly superior at producing a better HDR picture. People also have to remember that HDR's extreme peak brightness is a torture test of a TV's ability to reduce blooming and haloing, so an edge-lit TV is by nature very limited in how faithfully it can deliver a convincing HDR picture.
    I can definitely understand why the quality or wow factor of HDR has been an ongoing argument, and it is mainly down to the set people have been viewing it on. All I can say is that with the right kind of TV, an HDR picture is vastly superior to anything an SDR picture could ever dream of. And for the best HDR experience currently available, you should be looking at this:
    http://www.hdtvtest.co.uk/news/kd65zd9-201610164372.htm
     
  7. fehu

    Veteran Regular

    Joined:
    Nov 15, 2006
    Messages:
    1,754
    Likes Received:
    746
    Location:
    Somewhere over the ocean
    Just to say...
    After reading this thread I changed my mind from buying 4K+HDR to a decent Full HD set (Sony WD75) or even lower end.
    HDR Premium will probably never come down to the level of mid-budget TVs.
     
  8. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,947
    Likes Received:
    2,690
    I'm not sure that's a good idea either, though. Keep in mind that a "bad" 4K HDR TV can be a very good HD SDR TV.
     
    ToTTenTranz likes this.
  9. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,947
    Likes Received:
    2,690
    IMHO spending that much money on a TV at this particular time would be foolish, though. There seems to be way too much movement in the tech at the moment. Because of this, I expect that the best HDR experience that's currently available will be a very average HDR experience in fairly short order and isn't worth paying a premium for. For me to pay a premium for an HDR TV, that HDR TV needs to deliver an HDR experience that's caveat-free and we're not there yet.
     
    iroboto, pjbliverpool and BRiT like this.
  10. fehu

    Veteran Regular

    Joined:
    Nov 15, 2006
    Messages:
    1,754
    Likes Received:
    746
    Location:
    Somewhere over the ocean
    My other option was a Samsung EU40KU6000, which I discovered to be too bad for sub-4K content and not that good for 4K either.
    (By yesterday evening I knew every 2015/2016 Samsung model inside out, and some from LG and Hisense :p )

    For the sub-€500 tier I can't find anything good enough, or anything without edge-lit LED.

    The question should be: given this realistic budget, is a 1080p set with good contrast better, or a 4K set with low contrast and nit levels that make 8-bit HDR indistinguishable from SDR?
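
    (On the 8-bit point, here's a rough sketch of my own, using the ST 2084 / PQ transfer function, of how coarse 8-bit steps get for an HDR signal:)

    ```python
    # SMPTE ST 2084 (PQ) EOTF, used here to compare the luminance jump between
    # adjacent 8-bit and 10-bit code values around mid-grey. Illustration only.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_to_nits(code, bits):
        """Map a PQ-encoded code value to absolute luminance in nits."""
        n = code / (2 ** bits - 1)        # normalised signal, 0..1
        p = n ** (1 / m2)
        return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

    print(pq_to_nits(129, 8) - pq_to_nits(128, 8))    # roughly 4 nits per step at 8 bit
    print(pq_to_nits(513, 10) - pq_to_nits(512, 10))  # under 1 nit per step at 10 bit
    ```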
     
  11. novcze

    Regular

    Joined:
    Jan 5, 2007
    Messages:
    596
    Likes Received:
    119
    I would take contrast ratio (i.e. the lowest black level possible) over badly implemented "new cool" features. You should also look for a flicker-free (or PWM-free) TV.


    Funny story: I had to buy a J6300 last year and ended up modding its LED backlight driver, because it uses low-frequency PWM (120 Hz) that introduced ghosting on everything not running with motion interpolation (120 fps, matching the PWM frequency). Now I have a fixed white level of 120 cd/m² without any PWM. I hope OLED will come down in price before my next TV purchase.
     
  12. mrcorbo

    mrcorbo Foo Fighter
    Veteran

    Joined:
    Dec 8, 2004
    Messages:
    3,947
    Likes Received:
    2,690
    Sometimes I forget that buying advice based on US product availability and pricing is not valid everywhere.
     
  13. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,829
    Likes Received:
    1,142
    Location:
    Guess...
    Given the choice I think I'd still go for an LG OLED. Most of my serious film watching experiences are done in the dark anyway. But as mrcorbo says, now's a good time to wait, as difficult as that may be. 1 more year, 2 at max....
     
    ToTTenTranz and orangpelupa like this.
  14. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend Veteran

    Joined:
    Oct 14, 2008
    Messages:
    8,421
    Likes Received:
    1,824
    I used a Samsung EH6030 for a few years. Now I'm using an LG UH6100.

    My conclusion: avoid the majority of IPS panels due to their horrendous contrast ratio. It's also better to choose the ones with:
    * higher contrast ratio
    * better color gamut
    * better uniformity
    * better viewing angles (try it yourself on the show floor)

    I'm really disappointed with the UH6100 and its shitty blacks, shitty colors, and shitty uniformity, even though it's a 4K (more like 3K, due to the RGBW subpixel layout) HDR TV with a 10-bit panel.

    Basically I think HDR/4K/10-bit is a marketing trick to make people think a new TV must look better than their old one without those technologies.

    The real specs, which aren't mentioned anywhere in the marketing materials, do much more to determine visual quality.

    Edit:
    Basically I think the TV manufacturers use these new technologies to make a TV that can receive them but not properly output them. The result is that an older, less featured TV with a better-quality panel can beat them to a pulp.
     
    #114 orangpelupa, Nov 26, 2016
    Last edited: Nov 26, 2016
    fehu likes this.
  15. ultragpu

    Legend Veteran

    Joined:
    Apr 21, 2004
    Messages:
    6,242
    Likes Received:
    2,298
    Location:
    Australia
    Sure, you can view it in the dark, but that doesn't mean you're getting the intended minimum 1000 nits of brightness that 4K HDR Blu-rays are mastered for; OLED would clip highlights at 700 nits and over, so you lose detail. You also don't get that trademark HDR pop from the highlights, so to me it's kind of pointless to go OLED if the HDR experience is what you're after.
    The grail for the UHD HDR experience is pretty much 4000 nits and 100% Rec. 2020. There's no way OLED can reach that brightness any time soon; they're still sitting at 600-700 nits. Sony's Z9D is halfway there and its prototype has already reached that brightness. Both will take a while to reach 100% Rec. 2020, though. QLED is promising, combining the high brightness of quantum-dot LED with the self-emissive nature of OLED, but it's a new, untested tech that could potentially bring a slew of issues, not to mention the premium price. At this rate it's going to take at least 2-3 years to find an affordable, near-grail-quality HDR set. But when you do get one, the next best thing is only a year away anyway. So what are you going to do :)?
     
    BRiT likes this.
  16. HTupolev

    Regular

    Joined:
    Dec 8, 2012
    Messages:
    936
    Likes Received:
    564
    Can't that be fixed, for practical intents and purposes, by just reducing the multiplicative factor on the TV's brightness? Then you'd get fewer nits, but 700 nits in a dark room will still look very bright. Certainly brighter than 1000 nits in a bright room.
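
    Something like this, conceptually; an illustrative sketch of a simple highlight roll-off, not how any particular TV actually does it:

    ```python
    # Illustrative tone mapping: compress a scene mastered for 1000+ nits onto a
    # display that only reaches 700 nits, instead of hard-clipping the highlights.
    def tone_map(nits_in, display_peak=700.0, knee=500.0):
        """Pass lower luminances through unchanged, roll off highlights above the knee."""
        if nits_in <= knee:
            return nits_in
        excess = nits_in - knee
        headroom = display_peak - knee
        # Smoothly squeeze everything above the knee into the remaining headroom.
        return knee + headroom * (1.0 - 1.0 / (1.0 + excess / headroom))

    for n in (100, 500, 800, 1000, 4000):
        print(n, "->", round(tone_map(n), 1))   # highlights compress toward 700, never clip
    ```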
     
    orangpelupa likes this.
  17. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend Veteran

    Joined:
    Oct 14, 2008
    Messages:
    8,421
    Likes Received:
    1,824
    Yeah like the way games can scale themselves to full or limited RGB.
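
    For reference, that conversion is just a linear rescale of the 8-bit code values (illustrative sketch):

    ```python
    # Illustrative: map 8-bit full-range RGB (0-255) to limited/video range (16-235) and back.
    def full_to_limited(value):
        return round(16 + value * (235 - 16) / 255)

    def limited_to_full(value):
        return round((value - 16) * 255 / (235 - 16))

    print(full_to_limited(0), full_to_limited(255))    # 16 235
    print(limited_to_full(16), limited_to_full(235))   # 0 255
    ```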
     
  18. ultragpu

    Legend Veteran

    Joined:
    Apr 21, 2004
    Messages:
    6,242
    Likes Received:
    2,298
    Location:
    Australia
    Nope, HDR content automatically forces the TV to max out its light output. Dolby Vision is a workaround, but there's no physical media for it yet. Also, the thing is that 700 nits is the peak brightness at a 2-5% window size; at anything above a 20% window you start to lose brightness, which means that on average OLED is quite a bit dimmer and gets nowhere near its peak. All that is without even taking into account ABL (Automatic Brightness Limiter), which annoyingly kicks in whenever a brightish picture shows up, so you'd see something like a 150-nit sky, for example. And that's terrible for playing games, where graphics whores like to gawk at nicely rendered sky boxes, snow levels, stencil art, etc. It's a no go, mate.
     
  19. orangpelupa

    orangpelupa Elite Bug Hunter
    Legend Veteran

    Joined:
    Oct 14, 2008
    Messages:
    8,421
    Likes Received:
    1,824
    So the only "real HDR" available right now is an LCD with FALD?
     
  20. ultragpu

    Legend Veteran

    Joined:
    Apr 21, 2004
    Messages:
    6,242
    Likes Received:
    2,298
    Location:
    Australia
    To the best of my knowledge it is the most qualified by raw specs. The latest HDR shootout of four flagship models awarded both first and second place to FALD LED LCDs:
    http://www.hdtvtest.co.uk/news/shootout-hdr-201608054331.htm
    1. Samsung KS9500 aka KS9800
    2. Panasonic TX-65DX902B
    3. LG E6
    4. Sony XD94
    Bear in mind this test did not include Sony's latest flagship, the Z9D, and the XD94 only has 800 nits of peak brightness, so naturally it falls short of the much brighter Samsung and Panasonic and of its bigger brother, the Z9D (the brightest of them all). Even the E6's 700 nits edged it out by one vote, thanks to OLED's other advantages.
    So yeah, currently if you want the best HDR experience you'd have to go with a high-end FALD LED set.
     
    BRiT likes this.