What do you prefer for games: Framerates and Resolutions? [2020]

What would you prioritize?


Looks like one of my bathrooms never thought of that.
Still think UC4 holds up well to today's games, even for outdoor environments, and it's probably unmatched for underwater graphics.
 
The 1ms marketing is pure nonsense. Monitors have a small input lag advantage over TVs, but it's not something anyone here would ever be likely to notice.

I'm no esports gamer (not even close), but when I loaded up a game on my LG OLED for the first time, the lag was definitely noticeable in gaming mode. In non-gaming mode it was unbearable. Now that I've gotten used to it, it doesn't concern me too much in non-twitch games, but it's definitely noticeable.

Still, I haven't upgraded from my crappy old 1080p monitor yet, and while a non-21:9 screen is pretty much a deal breaker, I'd be open-minded about getting a TV instead if it met all my other criteria at a lower price and/or with superior image quality (good speakers would also be a big bonus). Any suggestions? It must have:

  • Desk-sized form factor, so no larger than about 32" in 16:9 (or 35" in 21:9)
  • Minimum 120Hz refresh rate (real, not interpolated)
  • 4K in 16:9
  • Decent quality HDR with a bare minimum of 600 nits (preferably 1000+) and multiple local dimming zones
  • Variable refresh support in HDR mode that goes into the sub-30fps region (using LFC if necessary; see the sketch below this list) and that is compatible with both AMD and Nvidia GPUs, although given the latter may be impossible, I'd consider one or the other
  • Response time preferably <6ms, but I have a little flexibility here
  • Contrast ratio of at least 3000:1 (but again flexible if the picture is otherwise excellent)
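For anyone unfamiliar with LFC (Low Framerate Compensation): when the frame rate drops below the panel's minimum VRR refresh, the driver shows each frame an integer number of times so the effective refresh lands back inside the VRR window. A minimal Python sketch of the idea; the 48-120Hz window here is just an illustrative assumption, not any particular display's spec:

def lfc_refresh(fps: float, vrr_min: float = 48.0, vrr_max: float = 120.0) -> tuple[int, float]:
    """Return (frame multiplier, effective panel refresh in Hz) for a given fps.

    Illustrative only: real LFC lives in the GPU driver/display firmware.
    """
    assert vrr_max >= 2 * vrr_min  # guarantees a multiplier always lands in-window
    if fps >= vrr_min:
        return 1, fps  # already inside the VRR window, no duplication needed
    multiplier = 2
    while fps * multiplier < vrr_min:  # repeat each frame until we re-enter the window
        multiplier += 1
    return multiplier, fps * multiplier

# A 25fps dip on the assumed 48-120Hz panel: each frame shown twice, panel runs at 50Hz
print(lfc_refresh(25.0))  # (2, 50.0)
print(lfc_refresh(14.0))  # (4, 56.0)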
 
We have a more-than-decent Sony ZD9 in our living room and a more-than-decent LG HDR monitor, both hooked up to PS4 Pros, and both have their own advantages. The ZD9 wins by far in the image quality department; there is simply no comparison, thanks to the FALD screen.

Personally I cannot stand the light bleed on the LG monitor. Black is never, ever 'black'. Hate that. Colours, contrast, everything IQ-related really, the TV wins hands down. An OLED would absolutely crush the monitor even more. HDR on the LG is basically saturation turned up to 1,000,000 and not really HDR, while the ZD9 is, well, stunning in HDR.

But I'll of course concede that the ZD9 has quite a bit more blurring in motion (especially black objects moving around the screen, which leave a visible trail if I really pay attention) while the LG is pretty smooth.

The LG has very low latency, and in theory the ZD9 isn't the fastest TV out there, but honestly I do not see the difference at all, so maybe I'm just one of those people with slow brains who don't perceive differences in milliseconds.
 
Personally I cannot stand the light bleed on the LG monitor. Black is never, ever 'black'. Hate that. Colours, contrast, everything IQ-related really, the TV wins hands down. An OLED would absolutely crush the monitor even more.

Sounds like either a TN or IPS panel? The above is why I'd almost certainly go with a VA, which have much better contrast levels and much blacker blacks. Colour reproduction isn't as good as an IPS, but that's less of a concern for gaming IMO.

HDR on the LG is basically saturation turned up to 1,000,000 and not really HDR, while the ZD9 is, well, stunning in HDR.

HDR on monitors is a bit of a minefield. Many will claim to have an HDR mode which is basically just an image adjustment. Others use the VESA DisplayHDR standard, but that in itself has different levels, ranging from basically useless at the bottom end up to very good TV-quality HDR at the high end. From an HDR perspective I'd accept nothing less than DisplayHDR 600 (at a push), but I'd prefer to go for DisplayHDR 1000. That, coupled with a good-quality, high-contrast VA panel, should be very comparable in image quality to a high-end TV (not OLED).
 
Sounds like either a TN or IPS panel? The above is why I'd almost certainly go with a VA, which have much better contrast levels and much blacker blacks. Colour reproduction isn't as good as an IPS, but that's less of a concern for gaming IMO.
Yeah, and I'm sure that the VA panel on the TV is what helps the IQ, but at the same time it's what gives it the little 'trails' when high-contrast objects move on the screen. It's a trade-off I can deal with for now, as I only notice it in 2D side-scrolling games where big dark parts of the image pan across the screen on white backgrounds.


HDR on monitors is a bit of a minefield. Many will claim to have an HDR mode which is basically just an image adjustment. Others use the VESA DisplayHDR standard, but that in itself has different levels, ranging from basically useless at the bottom end up to very good TV-quality HDR at the high end. From an HDR perspective I'd accept nothing less than DisplayHDR 600 (at a push), but I'd prefer to go for DisplayHDR 1000. That, coupled with a good-quality, high-contrast VA panel, should be very comparable in image quality to a high-end TV (not OLED).
Well, apparently this LG monitor we have is HDR600? And yes, it gets bright, but there is simply no deep darkness in the image. So even with HDR1000, unless mini-LED or FALD is used for LCD screens, that problem remains, and it probably gets even worse as the screen pushes even more brightness than 600 nits.
 
Thanks DF, nice coverage. The number of comments on the YouTube video from people wanting 60fps is massive; some get over 1,000 likes. Doesn't 1440p upscaled to 4K seem a great alternative?
 
I couldn't be happier that the majority of Sony AAA studios from the reveal event are sticking to 30fps and pushing for maximum graphics first and foremost. They've obviously done the math, weighed the odds and come to the conclusion that 60fps is not worth it to their creative vision or the general audience. With twice the render budget, aside from the increased graphics, you get to open up new ideas, designs, interactions and more which simply couldn't be done on a 60fps budget without severely downgrading. That said, I can easily see a Performance mode being made available to cater to the hardcore crowd if demand is high enough.
Can't wait to see what a next-gen wide-linear 30fps title from Naughty Dog and SSM would look like when an open-world 30fps game like Horizon Forbidden West already looks this insane.
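To put rough numbers on that render-budget trade-off (and on the 1440p-to-4K question a couple of posts up), here's a back-of-the-envelope Python sketch comparing raw pixel throughput. Real rendering cost doesn't scale perfectly linearly with resolution or frame rate, so this is illustrative only:

# Raw pixels per second for common resolution/framerate targets.
RESOLUTIONS = {
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K": 3840 * 2160,
}

def pixels_per_second(res: str, fps: int) -> int:
    return RESOLUTIONS[res] * fps

base = pixels_per_second("4K", 30)  # the 30fps "maximum graphics" target
for res, fps in [("4K", 60), ("1440p", 60), ("1440p", 30)]:
    ratio = pixels_per_second(res, fps) / base
    print(f"{res}@{fps}: {ratio:.2f}x the pixel throughput of 4K@30")
# 4K@60: 2.00x, 1440p@60: 0.89x, 1440p@30: 0.44x
# i.e. 1440p60 costs slightly less raw pixel throughput than native 4K30,
# which is why 1440p upscaled to 4K keeps coming up as a 60fps compromise.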
 
Yeah, and I'm sure that the VA panel on the TV is what helps the IQ, but at the same time it's what gives it the little 'trails' when high-contrast objects move on the screen. It's a trade-off I can deal with for now, as I only notice it in 2D side-scrolling games where big dark parts of the image pan across the screen on white backgrounds.

Yes, VA does have slower response times, but that's where the advantage of monitors comes in: response times are generally quite a bit lower on VA monitors, at around 5ms on average, than on VA-based TVs, which are likely to be 2-3x that.

Well, apparently this LG monitor we have is HDR600? And yes, it gets bright, but there is simply no deep darkness in the image. So even with HDR1000, unless mini-LED or FALD is used for LCD screens, that problem remains, and it probably gets even worse as the screen pushes even more brightness than 600 nits.

HDR600, depending on the implementation, will often just use edge-lit local dimming and has a minimum contrast ratio requirement of 6000:1, so certainly with an IPS or TN you could be limited on the darkness. A VA should help again in that respect; however, the big win is with HDR1000, which specifies a minimum contrast ratio of 20,000:1 and is likely going to require FALD. HDR1000 also comes with other requirements, like 10-bit colour support and 90%+ DCI-P3 coverage, which align it fairly closely with the "Ultra HD Premium" standard in the TV world. I'd say that puts it a pretty decent step above most TVs on the market in terms of image quality, and certainly competing with high-end TVs outside of OLEDs.
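Those contrast figures translate directly into black levels, since contrast ratio is just peak luminance divided by black luminance. A quick sanity check using the tier numbers quoted above (taken from the post, not re-verified against the VESA spec):

def black_level(peak_nits: float, contrast_ratio: float) -> float:
    """Black luminance implied by a peak brightness and a contrast ratio."""
    return peak_nits / contrast_ratio

print(black_level(600, 6_000))    # 0.1  nits at the HDR600 figures above
print(black_level(1000, 20_000))  # 0.05 nits at the HDR1000 figures above
# Halving the black floor while nearly doubling peak brightness is exactly
# what edge-lit panels struggle with, hence the expectation of FALD for HDR1000.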
 
Doesn't Spider-Man have rather forgiving timings? Those with high-end TVs can probably enable motion interpolation to get instant 60fps, and it's still good enough to play (maybe needs a bit of getting used to).

I used to do that in 30fps games with bad (or no) motion blur, like Ni no Kuni.

Edit: whoa, DF took so long to render that. I wonder why he didn't use SVP or Splash; they can do real-time motion interpolation with good results.

Edit: btw, this is an example of real-time motion interpolation of recorded TLOU2 gameplay video


For a scene with more action (spoiler: TLOU2 last boss battle): youtu.be/94xv9CfaAg4
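For anyone curious what interpolation is doing under the hood: real interpolators like SVP (or a TV's motion chip) estimate motion between frames and render motion-compensated in-betweens. The crude Python sketch below just blends adjacent frames, which shows the idea and also why naive blending ghosts; it is not how SVP actually works:

import numpy as np

def double_framerate(frames: list[np.ndarray]) -> list[np.ndarray]:
    """Insert a blended in-between frame after each original frame (naive)."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        # midpoint blend stands in for a proper motion-compensated frame
        mid = ((a.astype(np.uint16) + b.astype(np.uint16)) // 2).astype(np.uint8)
        out.append(mid)
    out.append(frames[-1])
    return out

# Three dummy 720p frames at "30fps" become five frames at roughly "60fps"
clip = [np.full((720, 1280, 3), v, dtype=np.uint8) for v in (0, 128, 255)]
print(len(double_framerate(clip)))  # 5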
 
Since you are into ultrawide monitors... have you checked out this one I mentioned in a different thread? (Below is a copy-paste from that post.)

The PERFECT monitor?

HDR1000, 240Hz, 49" ultrawide (32:9).

5120x1440 resolution, sheer love.

https://displaysolutions.samsung.com/monitor/detail/1644/C49G95T

Yes, that monitor is actually insane. And the price doesn't look too bad for what you get either, if you can find it on sale anywhere (which I can't, but I did at least find a price here).

My only concerns would be whether the form factor and curvature are actually too wide, and what the level of support from games for that would be. It also probably wouldn't fit on my desk! Still, I'd love to give one a try to find out.

And the spec has literally everything you could ever want. If they released a 34-35" version of this in 21:9 at a lower price, I'd be all over it in a flash.
 
Looking at the screenshots there, that level still looks amazing.

That ending scene was just mind-bendingly awesome graphics. I thought it was video for a split second before the movement of the dog made me realise it was real-time. I still remember it so well, as I was like, WTF is going on, and then, oh, it's still the game. That's the first and only time this has happened to me playing games on any platform.
 
I'm no esports gamer (not even close), but when I loaded up a game on my LG OLED for the first time, the lag was definitely noticeable in gaming mode. In non-gaming mode it was unbearable. Now that I've gotten used to it, it doesn't concern me too much in non-twitch games, but it's definitely noticeable.

Still, I haven't upgraded from my crappy old 1080p monitor yet, and while a non-21:9 screen is pretty much a deal breaker, I'd be open-minded about getting a TV instead if it met all my other criteria at a lower price and/or with superior image quality (good speakers would also be a big bonus). Any suggestions? It must have:

  • Desk-sized form factor, so no larger than about 32" in 16:9 (or 35" in 21:9)
  • Minimum 120Hz refresh rate (real, not interpolated)
  • 4K in 16:9
  • Decent quality HDR with a bare minimum of 600 nits (preferably 1000+) and multiple local dimming zones
  • Variable refresh support in HDR mode that goes into the sub-30fps region (using LFC if necessary) and that is compatible with both AMD and Nvidia GPUs, although given the latter may be impossible, I'd consider one or the other
  • Response time preferably <6ms, but I have a little flexibility here
  • Contrast ratio of at least 3000:1 (but again flexible if the picture is otherwise excellent)
The 32-inch size limit and LFC requirements will prevent you from getting anything other than an overpriced, sub-par monitor, unfortunately. Nvidia's G-Sync HDR monitors are your only real option. But keep in mind there are no consumer-grade HDR monitors you can even buy that come close to what you can get on a TV. Even the best ones still use 8-bit panels and suffer from very noticeable haloing due to a lack of zones.
 
The PS2 didn't have the 30fps problem though; Ratchet & Clank was 60 with great graphics for the time, as were MGS2, ZOE2, the Jak series, God of War 1/2 (both of which had the best PS2 graphics, or close to it), Black, etc.

Even certain remakes of those are downgraded to 30.

Don't Shadow of the Colossus and Zone of the Enders run at 30fps and lower?

IIRC these also run at 30fps:

Radiata Stories
Xenosaga series
Star Ocean blah blah blah (long title)
Valkyrie Profile blah blah (side-scrolling but 3D)
Max Payne

Hmm, not sure whether Final Fantasy X and X-2 run at 30 or 60fps.

Oh, almost forgot:
Growlanser
Atelier Iris
Grandia 2 or 3
 
The 32-inch size limit and LFC requirements will prevent you from getting anything other than an overpriced, sub-par monitor, unfortunately.

"Over priced" and "sub par" compared to what?

Nvidia's G-Sync HDR monitors are your only real option.

Why? There are plenty of very high quality FreeSync monitors available, some of which also support G-Sync along with LFC.

But keep in mind there are no consumer-grade HDR monitors you can even buy that come close to what you can get on a TV.

This is simply not true, unless by "TV" you mean "top-end OLED TV", and even then it's a big stretch to say that no monitor could "come close". Take this for example:

https://www.samsung.com/uk/monitors/monitor-cg95/

A QLED DisplayHDR 1000 set with HDR10+ support and a 1ms response time. That's going to produce a better picture than 99.9% of "HDR" TVs on the market. And it can do it with VRR and a 240Hz max refresh.

Even the best ones still use 8-bit panels and suffer from very noticeable haloing due to a lack of zones.

Again, not true. There are plenty of monitors at the high end with multiple dimming zones, and at least three with full FALD, with more incoming. Haloing (at a nitpicking level) will be visible even with FALD on pretty much every TV on the market that isn't running an OLED panel, and most people can't tell the difference between 8-bit and 10-bit anyway, especially in games, let alone with 8-bit+FRC, which accepts a 10-bit signal and is the minimum requirement for DisplayHDR 600 and above.
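To make the FRC point concrete: 8-bit+FRC approximates the two extra bits temporally, alternating between the two nearest 8-bit values over a short frame cycle so the time-average matches the 10-bit level. A toy Python sketch of the idea (real panels dither spatially as well, and per subpixel):

def frc_frames(level_10bit: int) -> list[int]:
    """8-bit values shown over a 4-frame cycle to approximate a 10-bit level."""
    base, frac = divmod(level_10bit, 4)  # 10-bit level = 4 * 8-bit base + remainder
    high = min(base + 1, 255)            # clamp at the top of the 8-bit range
    return [high] * frac + [base] * (4 - frac)

seq = frc_frames(514)        # 10-bit 514 = 4 * 128 + 2
print(seq)                   # [129, 129, 128, 128]
print(sum(seq) / len(seq))   # 128.5, and 128.5 * 4 recovers the 10-bit 514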

Besides which, the overwhelming majority of TVs on the market are also 8-bit LCDs with global or edge dimming, so I'm not quite sure what you're comparing to? You seem to want to compare the highest-end TVs on the market to mid-range monitors while ignoring everything at the high end. If your intention is to compare all monitors to multi-thousand-pound OLED TVs or the highest-end FALD QLEDs, then I'll grant you, the TVs win on image quality. But describing monitors in general as "sub-par", when it's perfectly possible to get models that will outperform 99.9% of HDR TVs on the market in picture quality while simultaneously sporting monitor-exclusive advantages like very low response times and very high refresh rates at form factors not available on the TV market, seems a bit silly.
 
"Over priced" and "sub par" compared to what?

Compared to a TV of a similar price.

Why? There are plenty of very high quality FreeSync monitors available, some of which also support G-Sync along with LFC.



This is simply not true, unless by "TV" you mean "top-end OLED TV", and even then it's a big stretch to say that no monitor could "come close". Take this for example:

https://www.samsung.com/uk/monitors/monitor-cg95/

A QLED DisplayHDR 1000 set with HDR10+ support and a 1ms response time. That's going to produce a better picture than 99.9% of "HDR" TVs on the market. And it can do it with VRR and a 240Hz max refresh.

That monitor is $1,700 for an edge-lit LCD incapable of quality HDR. And again, the 1ms G2G is marketing nonsense. It's not going to produce a better picture than pretty much any $1,700 TV. It also only supports G-Sync/FreeSync at >59 FPS.


Again, not true. There are plenty of monitors at the high end with multiple dimming zones, and at least three with full FALD, with more incoming. Haloing (at a nitpicking level) will be visible even with FALD on pretty much every TV on the market that isn't running an OLED panel, and most people can't tell the difference between 8-bit and 10-bit anyway, especially in games, let alone with 8-bit+FRC, which accepts a 10-bit signal and is the minimum requirement for DisplayHDR 600 and above.

Besides which, the overwhelming majority of TVs on the market are also 8-bit LCDs with global or edge dimming, so I'm not quite sure what you're comparing to? You seem to want to compare the highest-end TVs on the market to mid-range monitors while ignoring everything at the high end. If your intention is to compare all monitors to multi-thousand-pound OLED TVs or the highest-end FALD QLEDs, then I'll grant you, the TVs win on image quality. But describing monitors in general as "sub-par", when it's perfectly possible to get models that will outperform 99.9% of HDR TVs on the market in picture quality while simultaneously sporting monitor-exclusive advantages like very low response times and very high refresh rates at form factors not available on the TV market, seems a bit silly.

An 8-bit panel is certainly a lot more noticeable than a 10-15ms difference in input lag. High-end monitors do not have enough zones. TVs have quite a few more zones, which helps to mitigate haloing. An OLED can be had for a similar price to these high-end monitors.
 
The 32-inch size limit and LFC requirements will prevent you from getting anything other than an overpriced, sub-par monitor, unfortunately. Nvidia's G-Sync HDR monitors are your only real option. But keep in mind there are no consumer-grade HDR monitors you can even buy that come close to what you can get on a TV. Even the best ones still use 8-bit panels and suffer from very noticeable haloing due to a lack of zones.

FALD actually makes haloing significantly more visible versus a solid backlight, which is what you should use with a PC regardless of whether you are using a TV or a monitor anyway. With a PC, or any source which is predominantly black with high-contrast areas (like a starfield), local dimming of any sort (edge or FALD) should immediately be disabled, IMO, as it always results in unpleasant haloing.

Only solid backlights or OLED currently remove haloing artifacts.

Regards,
SB
 
Compared to a TV of a similar price.

Unless of course you want a display with lower input latency and higher refresh rates than TVs offer, or in a form factor smaller than 40" that still sports high-end capabilities. In those scenarios you're paying a premium for a monitor because it's sporting features that simply don't exist in the TV market.

I'll grant you that if the above is of no concern and you're happy with a 16:9 aspect ratio, then TVs are no-brainers over monitors every time. And I'll also agree that the premium you pay for those features in a monitor, if you want to retain a high-end-TV-like experience, is extremely high.

But monitors are still the only game in town if you want/need them. They also seem to be catching up with TVs in HDR credentials, with FALD becoming more common and the latest DisplayHDR 1400 spec. Granted, you're still paying a heavy premium for that quality in combination with monitor refresh rates and response times, but hopefully with those technologies becoming more common, prices will start to become more reasonable. That's exactly the reason I haven't bought one yet despite itching for a 21:9 form factor. Maybe I should just bite the bullet, accept that I can't have the best of all worlds in the near future for a non-ridiculous price, and get a more reasonably priced monitor with "fake HDR". Remove high-end HDR from the mix and monitors become a far, far better value proposition (vs HDR monitors) while retaining all the other unique monitor advantages.

And again, the 1ms G2G is marketing nonsense.

Not really; response time is pretty important if you want to avoid ghosting while gaming, where you won't have access to the usual image-processing tech which TVs use to combat it. Obviously an OLED in the same price range is going to outperform this monitor in that regard, but no other TV display tech is.

It also only supports G-Sync/FreeSync at >59 FPS.

Its predecessor supports LFC, so I'd assume this does too, even though it's not specifically called out on that page. If not, then I agree that VRR starting at 60Hz is one step up from useless.

High-end monitors do not have enough zones. TVs have quite a few more zones, which helps to mitigate haloing. An OLED can be had for a similar price to these high-end monitors.

Out of interest, how many zones do you consider enough, and what TVs (outside of OLED) are you thinking of when you say they have quite a few more zones?
 
Unless of course you want a display with lower input latency and higher refresh rates than TVs offer, or in a form factor smaller than 40" that still sports high-end capabilities. In those scenarios you're paying a premium for a monitor because it's sporting features that simply don't exist in the TV market.

I'll grant you that if the above is of no concern and you're happy with a 16:9 aspect ratio, then TVs are no-brainers over monitors every time. And I'll also agree that the premium you pay for those features in a monitor, if you want to retain a high-end-TV-like experience, is extremely high.

But monitors are still the only game in town if you want/need them. They also seem to be catching up with TVs in HDR credentials, with FALD becoming more common and the latest DisplayHDR 1400 spec. Granted, you're still paying a heavy premium for that quality in combination with monitor refresh rates and response times, but hopefully with those technologies becoming more common, prices will start to become more reasonable. That's exactly the reason I haven't bought one yet despite itching for a 21:9 form factor. Maybe I should just bite the bullet, accept that I can't have the best of all worlds in the near future for a non-ridiculous price, and get a more reasonably priced monitor with "fake HDR". Remove high-end HDR from the mix and monitors become a far, far better value proposition (vs HDR monitors) while retaining all the other unique monitor advantages.

Realistically there is little benefit to refresh rates above 120Hz. The only situations will be esports games at the lowest settings, and if that's what you're playing, you'd just buy a cheapo TN panel since picture quality is of no relevance to you anyway. For everyone else there's just not enough performance to make it usable.
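To show the diminishing returns in concrete terms, here's simple frame-time arithmetic in Python (nothing more than that):

# Frame time for a given refresh rate, and the absolute gain from each doubling.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

for low_hz, high_hz in [(60, 120), (120, 240), (240, 480)]:
    saved = frame_time_ms(low_hz) - frame_time_ms(high_hz)
    print(f"{low_hz}Hz -> {high_hz}Hz: {saved:.2f}ms less per frame")
# 60->120: 8.33ms, 120->240: 4.17ms, 240->480: 2.08ms; each doubling buys half
# as much absolute smoothness/latency gain as the previous one.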

Not really; response time is pretty important if you want to avoid ghosting while gaming, where you won't have access to the usual image-processing tech which TVs use to combat it. Obviously an OLED in the same price range is going to outperform this monitor in that regard, but no other TV display tech is.

I'm not debating the value of lower response times; I'm saying the methodology used to produce this 1ms metric is misleading and not representative of real-world performance.

Its predecessor supports LFC, so I'd assume this does too, even though it's not specifically called out on that page. If not, then I agree that VRR starting at 60Hz is one step up from useless.

Out of interest, how many zones do you consider enough, and what TVs (outside of OLED) are you thinking of when you say they have quite a few more zones?

There isn't a set number; it's a case of more being better. PC monitors top out at 384 zones, while TVs were at 792 last time I checked.
 
Eh, not so much unless you're only looking at TN panel monitors.

Uncalibrated color accuracy and color uniformity are generally superior on the better monitors. Once calibrated, however, TVs and monitors can be virtually indistinguishable, although the best monitors will generally have better color uniformity than the best TVs. The exceptions come into play with studio monitors that cost significantly more than your standard consumer monitors.

Latency is important, and far more important when using a PC than a console, because there is a 1:1 mapping between mouse and hand movements, unlike with console controllers. Basically, once you get below a certain point with a controller, it's harder to notice the lag because there isn't a 1:1 connection between moving the analog stick and the response on the screen.

However, even with a controller there is a noticeable effect when it comes to pressing a button (for example, for a precision frame-perfect jump or move) and seeing the feedback of that button press on the screen. It's something most people can adjust to, but if you go from a very low-latency control-to-display feedback loop to a higher-latency one, the differences are very noticeable, as your timing will be completely off and everything feels like you are controlling someone or something moving slightly off from what you expect.

As to image quality: in high-processing mode (high-latency mode on TVs) the IQ can be better for content in motion, as the TV spends significant processing time improving the visuals. Disable that for low-latency gaming mode and almost all of that improvement disappears, leaving the TV in the same place as a typical average PC monitor.

But here's the kicker: if you want all that fancy processing that TVs do to improve image quality in movies, you can have that on PC as well through various media players, filters, etc. The one place TVs still have a significant advantage in processing is motion interpolation (smooth video), as they have dedicated chips for that. But that is something you'll never use in games, as it comes with a hefty processing (latency) penalty. Besides, having interpolated frames forced onto a game isn't necessarily going to make the gameplay experience better even if the latency for it weren't relatively massive.

I use a TV as my main display for cost reasons. If I were as serious about gaming as I was in the past, and my reactions were what they were 20 years ago, I wouldn't touch an LCD TV as a gaming monitor with a 10-foot pole.

Regards,
SB
I'm with you on that.

Sure, good monitors are insanely priced, but you pay that for a good reason.

Sometimes I've made sacrifices to play at 165fps, which is what my monitor gives, and I've never regretted it.

In some games, even having a GTX 1080, I went to the extreme of setting the game to 784x576 pixels (or something like that) to achieve max framerate while having the GPU totally silent.

The day I played Doom 3 at 240fps on a 240Hz monitor I had for a while, my jaw dropped to the ground.

90 fps should be the bare minimum to avoid dizziness.
 