Best 4K HDR TVs for One X, PS4 Pro [2017-2020]

Yikes. That sounds like a more difficult problem to overcome than on the plasmas.
Burn-in on plasma is just as likely as on OLEDs. Plasmas have a half-life comparable to OLED tech. The pixels degrade over time, lowering their average luminance. This can be localized to individual pixels, just like OLED. I've gamed "responsibly" on plasma for more than 8 years without a trace of burn-in - just a little discoloration (mura). My OLED is exhibiting no signs of screen mura (green and pink discoloration), unlike the AMOLED displays in my Galaxy Nexus, Note 2, and S6 Edge. My Nexus 6P has no signs of burn-in or mura either.
 
I just noticed that my PC is sending only the left audio channel to my 49XE9005 via HDMI. Native TV apps and the PS4 all output full stereo to my headphones.

No matter what I do on the PC, it sends only the left channel.

I'll chalk this up to a random HDMI fuckup.
 
I think most do except for the TCLs, unless I'm not understanding your question.
Ok, thanks! I will check your link. :D (Reason I ask is because my current TV only outputs stereo, even when the source is a Dolby Digital soundtrack.)

Besides, as long as you don't have a CEC implementation problem, wouldn't you rather use ARC anyway?
I probably would, if I knew what ARC was, but if it involves HDMI then my soundbar doesn't have such an input (thanks a lot for that, Sonos...)
 
Tangentially related question that I didn't think would deserve its own thread, so here it is:

What HDMI cable should one get? Are there any HDMI 2.1 'ready' cables available? If not, what cable would you suggest as a nice stop-gap measure? The current HDMI cables I'm using are very old and limited to 10.2 Gbps (Amazon Link). While I like how easy they are to work with because of how thin they are, that's not an absolute requirement for me.
 
Tangentially related question that I didn't think would deserve its own thread, so here it is:

What HDMI cable should one get? Are there any HDMI 2.1 'ready' cables available? If not, what cable would you suggest as a nice stop-gap measure? The current HDMI cables I'm using are very old and limited to 10.2 Gbps (Amazon Link). While I like how easy they are to work with because of how thin they are, that's not an absolute requirement for me.
I'm using the AmazonBasics cable; it's not lacking.
The AmazonBasics High-Speed HDMI Cable with Ethernet meets the latest standards, which means it considerably expands bandwidth up to 18 Gbps, offers 4K@50/60 (2160p) video resolution (four times the clarity of 1080p/60), and supports the wide-angle theatrical 21:9 video aspect ratio.
You can get two of them for not much more than the price of one. https://www.amazon.com/AmazonBasics-High-Speed-HDMI-Cable-Standard/dp/B014I8T4MO
You inspired me to search HDMI 2.1. It seems a new cable will be required for all that bandwidth. https://www.cnet.com/news/hdmi-2-1-what-you-need-to-know/
Though maybe not, at least not for those "only" looking for 4K at 120 fps. That's just double what we're getting now.
  • The physical connectors and cables look the same as today's.
  • Improved bandwidth from 18Gbps (HDMI 2.0) to 48Gbps (HDMI 2.1).
  • Can carry resolutions up to 10K, frame rates up to 120fps.
  • New cables required for higher resolutions and/or frame rates.
  • Spec is still being finalized, expected to publish April-June 2017.
  • First products could arrive late 2017, but many more will ship in 2018.
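Those bullet points check out with some napkin math. Here's a rough sketch of the raw pixel data rates involved (note this ignores blanking intervals and HDMI's link-encoding overhead, so the real link requirements are somewhat higher - but it shows why 4K120 blows past HDMI 2.0's 18 Gbps while fitting comfortably in 2.1's 48 Gbps):

```python
# Back-of-the-envelope HDMI bandwidth check.
# Raw pixel data only: ignores blanking intervals and TMDS/FRL
# encoding overhead, so actual link requirements run higher.

def raw_video_gbps(width, height, fps, bits_per_channel, channels=3):
    """Uncompressed pixel data rate in Gbps (RGB/YCbCr 4:4:4)."""
    return width * height * fps * bits_per_channel * channels / 1e9

uhd60_hdr = raw_video_gbps(3840, 2160, 60, 10)    # 4K60, 10-bit
uhd120_hdr = raw_video_gbps(3840, 2160, 120, 10)  # 4K120, 10-bit

print(f"4K60 10-bit:  {uhd60_hdr:.1f} Gbps (HDMI 2.0 link: 18 Gbps)")
print(f"4K120 10-bit: {uhd120_hdr:.1f} Gbps (HDMI 2.1 link: 48 Gbps)")
# 4K60 10-bit:  14.9 Gbps (HDMI 2.0 link: 18 Gbps)
# 4K120 10-bit: 29.9 Gbps (HDMI 2.1 link: 48 Gbps)
```

So even before overhead, 4K at 120 fps with 10-bit color needs roughly 30 Gbps of pixel data, which is why it's a 2.1 feature.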

P.S. I'm using this on a "4K" Sony, and it's hooked up to GTX 1070.
 
For long runs, premium cables are often required, though. I had to get a Monoprice certified cable to enable a 4K60 signal from my PC to my TV, which was a 15+' run.
 
Hey guess what

Remember my 1080p monitor with fake HDR that gave shitty results when I connected my vanilla PS4 with Horizon?

It actually works. Tested Resi 7 with it and the difference is very noticeable. Can't go back to SDR now. The visuals are unbelievably better with it. The picture comes to life.

edit: Resi 7 was tested on PC at 4K resolution. That's the only way the HDR mode works correctly with this monitor.
 
How are you running a 1080P monitor at 4K...? I suppose I must be missing something, but I'm not sure what! :D
 
How are you running a 1080P monitor at 4K...? I suppose I must be missing something, but I'm not sure what! :D
Basically the monitor has different color modes, two of which are Game HDR and Movie HDR. Once I choose either of those two and go to the display settings of my computer, it lets me choose above the native resolution, up to 4K. The rest of the modes do not allow above 1080p. If I don't choose the 4K resolution, HDR doesn't display correctly. In all the resolution options, 1920x1080 is shown as native.
I have no idea how this technology works, and although others noted that this 1080p monitor lacks the necessary specs to display HDR correctly, it accepts a 4K signal in HDR mode and the difference in color is quite noticeable.
 
Weird! As hell. But, nice that it works, when it works (and your PC can run 4K at a good enough clip...) :)
 
Basically the monitor has different color modes, two of which are Game HDR and Movie HDR. Once I choose either of those two and go to the display settings of my computer, it lets me choose above the native resolution, up to 4K. The rest of the modes do not allow above 1080p. If I don't choose the 4K resolution, HDR doesn't display correctly. In all the resolution options, 1920x1080 is shown as native.
I have no idea how this technology works, and although others noted that this 1080p monitor lacks the necessary specs to display HDR correctly, it accepts a 4K signal in HDR mode and the difference in color is quite noticeable.

So it will accept a 4K HDR signal and simulate 10-bit color depth while down-rezzing?

It would seem odd for a PC monitor or a TV to tie its native 4K rez to HDR (or fake HDR). There is a ton of 4K non HDR content out there.
 
So it will accept a 4K HDR signal and simulate 10-bit color depth while down-rezzing?

It would seem odd for a PC monitor or a TV to tie its native 4K rez to HDR (or fake HDR). There is a ton of 4K non HDR content out there.

it *isn't* 4k native. It's 1080p native w/ HDR, but the HDR only works if you send a 4k signal and let it downscale. It's bonkers.
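For anyone curious what "send a 4K signal and let it downscale" means mechanically: since 3840x2160 is exactly 2x 1920x1080 in each dimension, the monitor's scaler can map every 2x2 block of input pixels to one panel pixel. Here's a toy sketch of that (pure speculation about what this particular monitor's scaler does internally - it just illustrates simple 2x2 box-filter downscaling):

```python
# Toy 2x2 box-filter downscale: 4K frame -> 1080p frame.
# A hypothetical stand-in for whatever the monitor's scaler
# actually does; real scalers may use fancier filters.
import numpy as np

def downscale_2x(frame):
    """Average each 2x2 pixel block: (2H, 2W, C) -> (H, W, C)."""
    h, w, c = frame.shape
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

frame_4k = np.random.rand(2160, 3840, 3)   # fake 4K RGB frame
frame_1080p = downscale_2x(frame_4k)
print(frame_1080p.shape)  # (1080, 1920, 3)
```

The interesting (and bonkers) part is that the monitor ties its HDR decoding to receiving that 4K signal at all, even though the extra pixels are immediately averaged away.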
 
I believe the UHD Blu-ray spec does include 1080p HDR.

Not sure if ATSC 3.0 does.

But it remains to be seen which formats from these standards the studios and networks will support.
 
it *isn't* 4k native. It's 1080p native w/ HDR, but the HDR only works if you send a 4k signal and let it downscale. It's bonkers.
Yes that.

I can't even call it fake HDR at this point. It looks that good. I'd have to compare it with another HDR display.
 
hmmm... would an Xbox One S work with this? doesn't that output a 4K signal?
I am not sure if it always outputs a 4K signal. I think it does only if the content is 4K. But I am curious if it would work with such a monitor if it outputs a 4K signal at all times.
 
I am not sure if it always outputs a 4K signal. I think it does only if the content is 4K. But I am curious if it would work with such a monitor if it outputs a 4K signal at all times.

It upscales, so it would upscale to 4K and then your monitor would downscale it back to 1080p. :LOL:
 
It upscales, so it would upscale to 4K and then your monitor would downscale it back to 1080p. :LOL:

That's what I was thinking would happen... but I don't believe the regular PS4 outputs a 4K signal, correct? Too bad someone couldn't test. This might be a very cheap way to get 1080p "HDR".
 