The 1ms marketing is pure nonsense. Monitors have a small input lag advantage over TVs. Not something anyone here would ever be likely to notice.
Personally I cannot stand the light bleed on the LG monitor. Black is never, ever 'black'. Hate that. Colours, contrast, everything IQ related really, the TV wins hands down. An OLED would absolutely crush the monitor even more.
HDR on the LG is basically saturation turned up to 1,000,000 and not really HDR. While the ZD9 is, well, stunning in HDR.
Yeah and I'm sure that the VA panel on the TV is what helps the IQ, but at the same time it's what gives it the little 'trails' when high-contrast objects move on the screen. It's a trade-off I can deal with for now, as I only notice it in 2D side-scrolling games where big dark parts of the screen pan across white backgrounds.

Sounds like either a TN or IPS panel? The above is why I'd almost certainly go with a VA, which have much better contrast levels and much blacker blacks. Colour reproduction isn't as good as an IPS but that's less of a concern for gaming IMO.
Well apparently this LG monitor we have is HDR600? And yes, it gets bright, but there is simply no deep darkness in the image. So even with HDR1000, unless mini-LED or FALD is used for LCD screens, that problem remains. And it probably gets even worse as the screen pushes even more brightness than 600 nits.

HDR on monitors is a bit of a minefield. Many will claim to have an HDR mode which is basically just an image adjustment. Others use the VESA "DisplayHDR" standard, but that in itself has different levels, ranging from basically useless at the bottom end up to very good TV-quality HDR at the high end. From an HDR perspective I'd accept nothing less than DisplayHDR 600 (at a push), but I'd prefer to go for DisplayHDR 1000. That coupled with a good quality, high-contrast VA panel should be very comparable in image quality to a high-end TV (not OLED).
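For reference, the VESA DisplayHDR tiers map roughly to minimum peak-luminance requirements. A minimal sketch of that mapping (the nit figures are from memory of the spec, so treat them as illustrative rather than authoritative):

```python
# Approximate minimum peak luminance (nits) required by each VESA
# DisplayHDR tier -- illustrative figures, check the current spec.
DISPLAYHDR_MIN_PEAK = {
    "DisplayHDR 400": 400,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,
    "DisplayHDR 1400": 1400,
}

def tiers_met(peak_nits):
    """Return the DisplayHDR tiers a panel's peak brightness satisfies."""
    return [t for t, nits in DISPLAYHDR_MIN_PEAK.items() if peak_nits >= nits]

print(tiers_met(600))  # ['DisplayHDR 400', 'DisplayHDR 600']
```

Note that brightness alone isn't the whole story: the higher tiers also tighten black-level and dimming requirements, which is exactly why an edge-lit HDR600 panel can still lack that "deep darkness".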
Yeah and I'm sure that the VA panel on the TV is what helps the IQ, but at the same time it's what gives it the little 'trails' when high-contrast objects move on the screen. It's a trade-off I can deal with for now, as I only notice it in 2D side-scrolling games where big dark parts of the screen pan across white backgrounds.
Well apparently this LG monitor we have is HDR600? And yes, it gets bright, but there is simply no deep darkness in the image. So even with HDR1000, unless mini-LED or FALD is used for LCD screens, that problem remains. And it probably gets even worse as the screen pushes even more brightness than 600 nits.
since you are into ultrawide monitors...have you checked this one I mentioned in a different thread? (below is a copy-paste from that post)
the PERFECT monitor?
HDR1000, 240Hz, 49" ultrawide (32:9).
5120x1440p resolution, sheer love.
https://displaysolutions.samsung.com/monitor/detail/1644/C49G95T
Even just from the screenshots there, that level still looks amazing.
The 32 inch size limit and LFC requirements will prevent you from getting anything other than an over priced, sub par monitor unfortunately. Nvidia's GSYNC HDR monitors are your only real option. But keep in mind there are no consumer grade HDR monitors you can even buy that come close to what you can get on a TV. Even the best ones still use 8 bit panels and suffer from very noticeable haloing due to a lack of zones.

I'm no esports gamer (not even close) but when I loaded up a game on my LG OLED for the first time the lag was definitely noticeable in gaming mode. In non-gaming mode it was unbearable. Now that I've gotten used to it, it doesn't concern me too much in non-twitch games, but it's definitely noticeable.
Still, I haven't upgraded from my crappy old 1080p monitor yet and while a non 21:9 screen is pretty much a deal breaker I'd be open minded about getting a TV instead if it would meet all my other criteria at a lower price and/or superior image quality (good speakers would also be a big bonus). Any suggestions? It must have:
- Desk sized form factor so no larger than about 32" in 16:9 (or 35" in 21:9)
- Minimum 120Hz refresh rate (real, not interpolated)
- 4K in 16:9
- Decent quality HDR with a bare minimum 600 nits (preferably 1000+) and multiple local dimming zones
- Variable refresh support in HDR mode that goes into the sub-30fps region (using LFC if necessary) and that is compatible with both AMD and Nvidia GPUs - although given the latter may be impossible I'd consider one or the other.
- Response time preferably <6ms but I have a little flexibility here
- Contrast ratio of at least 3000:1 (but again flexible if the picture is otherwise excellent)
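On the LFC point in the VRR requirement above: low framerate compensation just repeats frames so the effective refresh stays inside the panel's VRR window when the game drops below it. A rough sketch of the idea (the function and numbers are mine for illustration, not any vendor's actual API):

```python
import math

def lfc_refresh_hz(fps, vrr_min, vrr_max):
    """Refresh rate the display would run at for a given game framerate.

    Inside the VRR window the panel simply follows the game. Below it,
    LFC repeats each frame n times so that n*fps lands back in the window.
    """
    if fps >= vrr_min:
        return min(fps, vrr_max)
    n = math.ceil(vrr_min / fps)
    if n * fps <= vrr_max:
        return n * fps
    return None  # window too narrow for LFC (needs roughly max >= 2*min)

# On a 48-120Hz panel, a 30fps dip gets frame-doubled to a 60Hz refresh:
print(lfc_refresh_hz(30, 48, 120))  # 60
```

This is also why a VRR range like 60-75Hz is nearly useless: the window is too narrow for frame multiplication to land inside it at low framerates.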
The PS2 didn't have the 30fps problem though; Ratchet & Clank was 60fps with great graphics for the time, and so were MGS2, ZOE2, the Jak series, God of War 1/2 (both of which had some of the best PS2 graphics, or close to it), Black, etc.
Even certain remakes of those are downgraded to 30.
The 32 inch size limit and LFC requirements will prevent you from getting anything other than an over priced, sub par monitor unfortunately.
Nvidia's GSYNC HDR monitors are your only real option.
But keep in mind there are no consumer grade HDR monitors you can even buy that come close to what you can get on a TV.
Even the best ones still use 8 bit panels and suffer from very noticeable haloing due to a lack of zones.
"Over priced" and "sub par" compared to what?
Why? There are plenty of very high quality Freesync monitors available, some of which also support Gsync along with LFC.
This is simply not true unless by "TV" you mean "top end OLED TV", and even then, it's a big stretch to say that no monitor could "come close". Take this for example:
https://www.samsung.com/uk/monitors/monitor-cg95/
A QLED DisplayHDR 1000 HDR10+ supporting set with a 1ms response time. That's going to be producing a better picture than 99.9% of "HDR" TVs on the market. And it can do it with VRR and a 240Hz max refresh.
Again not true. There are plenty of monitors at the high end with multiple dimming zones, and at least 3 with full FALD with more incoming. Haloing (at a nitpicking level) will be visible even with FALD on pretty much every TV on the market that isn't running an OLED panel, and most people can't tell the difference between 8-bit and 10-bit anyway, especially in games, let alone 8-bit+FRC, which accepts a 10-bit signal and is the minimum requirement for DisplayHDR 600 and above.
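On the 8-bit+FRC point: frame rate control approximates a 10-bit level by alternating between the two nearest 8-bit levels over successive frames, so the temporal average lands on the intermediate value. A toy illustration of the principle (not any panel vendor's actual algorithm):

```python
def frc_frames(value_10bit, n_frames=4):
    """Approximate a 10-bit level with a sequence of 8-bit frames.

    There are four 10-bit steps per 8-bit step, so the fractional part
    decides how many of the n_frames show the upper neighbouring level.
    """
    base, frac = divmod(value_10bit, 4)
    upper = min(base + 1, 255)
    return [upper if i < frac else base for i in range(n_frames)]

# 10-bit level 513 sits a quarter-step above 8-bit level 128:
frames = frc_frames(513)
print(frames)                     # [129, 128, 128, 128]
print(sum(frames) / len(frames))  # 128.25 -> temporal average of 513/4
```

Done fast enough, the eye averages the flicker away, which is why 8-bit+FRC is usually indistinguishable from native 10-bit in practice.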
Besides which, the overwhelming majority of TVs on the market are also 8-bit LCDs with global or edge dimming, so I'm not quite sure what you're comparing to? You seem to want to compare the highest-end TVs on the market to mid-range monitors while ignoring everything at the high end. If your intention is to compare all monitors to multi-thousand-pound OLED TVs or the highest-end FALD QLEDs then I'll grant, the TVs win on image quality. But describing monitors in general as "sub-par" when it's perfectly possible to get models that will outperform 99.9% of HDR TVs on the market in picture quality, while simultaneously sporting monitor-exclusive advantages like very low response times and very high refresh rates at form factors not available on the TV market, seems a bit silly.
The 32 inch size limit and LFC requirements will prevent you from getting anything other than an over priced, sub par monitor unfortunately. Nvidia's GSYNC HDR monitors are your only real option. But keep in mind there are no consumer grade HDR monitors you can even buy that come close to what you can get on a TV. Even the best ones still use 8 bit panels and suffer from very noticeable haloing due to a lack of zones.
Compared to a TV of a similar price.
And again the 1 ms G2G is marketing nonsense.
It also only supports GSYNC/FreeSync at >59 FPS.
High-end monitors do not have enough zones. TVs have quite a bit more, which helps to mitigate haloing. An OLED can be had for a similar price to these high-end monitors.
Unless of course you want a display with lower input latency and higher refresh rates than TVs offer, or in a form factor smaller than 40" that still sports high-end capabilities. In those scenarios you're paying a premium for a monitor because it's sporting features that simply don't exist in the TV market.
I'll grant you that if the above is of no concern and you're happy with a 16:9 aspect ratio, then TVs are no-brainers over monitors every time. And I'll also agree that the premium you pay for those features in a monitor, if you want to retain a high-end-TV-like experience, is extremely high.
But monitors are still the only game in town if you want/need them. They also seem to be catching up with TVs in HDR credentials, with FALD becoming more common and the latest DisplayHDR 1400 spec. Granted, you're still paying a heavy premium for that quality in combination with monitor refresh rates and response times, but hopefully with those technologies becoming more common, prices will start to become more reasonable. That's exactly the reason I haven't bought one yet despite itching for a 21:9 form factor. Maybe I should just bite the bullet, accept that I can't have the best of all worlds in the near future for a non-ridiculous price, and get a more reasonably priced monitor with "fake HDR". Remove high-end HDR from the mix and monitors become a far, far better value proposition (vs HDR monitors) while retaining all the other unique monitor advantages.
Not really, response time's pretty important if you want to avoid ghosting while gaming, where you won't have access to the usual image processing tech which TVs use to combat it. Obviously an OLED in the same price range is going to outperform this monitor in that regard, but no other TV display tech is.
Its predecessor supports LFC so I'd assume this does too, even though it's not specifically called out on that page. If not, then I agree, VRR starting at 60Hz is one step up from useless.
Out of interest, how many zones do you consider enough, and which TVs (outside of OLED) are you thinking of when you say they have quite a bit more zones?
Eh, not so much unless you're only looking at TN panel monitors.
Uncalibrated color accuracy and color uniformity are generally superior on the better monitors. Once calibrated, however, TVs and monitors can be virtually indistinguishable, although the best monitors will generally have better color uniformity than the best TVs. The exceptions come into play with studio monitors that cost significantly more than your standard consumer monitors.
Latency is important. Far more important when using a PC than a console due to the fact that there is a 1:1 mapping between mouse and hand movements unlike console controllers. Basically once you get below a certain point with a controller, it's harder to notice the lag because there isn't a 1:1 connection between moving the analog stick and the response on the screen.
However, even with a controller there is a noticeable effect when it comes to pressing a button (for example for a precision frame-perfect jump or move) and seeing the feedback of that button press on the screen. It's something most people can adjust to, but if you go from a very low latency control -> display feedback loop to a higher latency one, the differences are very noticeable, as your timing will be completely off and everything feels like you are controlling someone or something moving slightly off from what you expect.
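To put rough numbers on that control -> display feedback loop, end-to-end lag is approximately the sum of a few pipeline stages. A back-of-envelope model (the stage values in the example are made-up illustrations, not measurements of any real display):

```python
def pipeline_lag_ms(poll_hz, render_fps, display_proc_ms, pixel_response_ms):
    """Rough average input-to-photon latency in milliseconds."""
    input_wait = 0.5 * 1000 / poll_hz  # average wait for the next input poll
    frame_time = 1000 / render_fps     # one frame of game/render time
    return input_wait + frame_time + display_proc_ms + pixel_response_ms

# 1000Hz mouse, 60fps game, TV in game mode (~10ms), VA pixel response (~5ms):
print(round(pipeline_lag_ms(1000, 60, 10, 5), 1))  # 32.2
```

The display-processing term is the one that balloons outside game mode, which is why the same TV can feel fine in gaming mode and unbearable out of it.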
As to Image Quality, in high processing mode (high latency mode on TVs) the IQ can be better for content in motion as the TV spends significant processing time improving the visuals. Disable that for low latency gaming mode on the TV and almost all of that disappears leaving it at the same place as a typical average PC monitor.
But, here's the kicker, if you want all that fancy processing that TVs do to improve image quality in movies, you can have that on PC as well through multiple various media players, filters, etc. The one place TVs still have a significant advantage in processing is motion interpolation (smooth video) as they have dedicated chips for that. But that is something you'll never use in games as it comes with a hefty processing (latency) penalty. As well, having interpolated frames forced onto a game isn't necessarily going to make the gameplay experience better even if the latency for it wasn't relatively massive.
I use a TV as my main display for cost reasons. If I was as serious about gaming as I was in the past and my reactions were what they were 20 years ago, I wouldn't touch an LCD TV for a gaming monitor with a 10-foot pole.
Regards,
SB