Digital Foundry Article Technical Discussion [2024]

In Metro Exodus Enhanced Edition I hardly saw any difference between VRS 2x on and off, but I'd have to test it in more detail.
 
1:40:12 Supporter Q3: Is the time of 1080p monitors over?
1080p can go once Nvidia/AMD release midrange GPUs that aren't $500+. As long as the mainstream 'affordable' GPUs being sold for around $300 nowadays are really lower-end GPUs, 1080p is going to have to stick around.

As for VRS, I do wonder if it's being used in more games than we realize, especially as a software solution. I remember when 'async compute' was a feature item listed for games, or even exposed as a toggle in many earlier examples, before it largely just 'disappeared'. Except it never disappeared; it simply became standard practice to use it. I imagine VRS is quite hard to 'detect' if there's no ability to toggle it, and using it just for small performance gains with minimal quality difference could be a useful optimization tool.
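For reference, on the API side hardware VRS is just per-draw (or per-screen-region) state, which is partly why it's so invisible when a game doesn't expose a toggle. A minimal D3D12 sketch of the simplest per-draw (Tier 1) case, assuming a valid device and command list already exist elsewhere; the function and variable names here are placeholders, not anything from a shipped game:

```cpp
// Minimal sketch of per-draw (Tier 1) variable rate shading in D3D12.
// Illustrative only: assumes an ID3D12Device and a Shader Model 6+ capable
// ID3D12GraphicsCommandList5 are created elsewhere.
#include <windows.h>
#include <d3d12.h>

// Query whether the adapter supports variable rate shading at all.
bool SupportsVRS(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &options6, sizeof(options6))))
        return false;
    return options6.VariableShadingRateTier != D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED;
}

// Apply a coarse 2x2 shading rate for subsequent draws (the "VRS 2x" case
// discussed above), or restore full-rate 1x1 shading.
void SetCoarseShading(ID3D12GraphicsCommandList5* cmdList, bool enable)
{
    const D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,  // combine with per-primitive rate
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH   // combine with screen-space image
    };
    cmdList->RSSetShadingRate(enable ? D3D12_SHADING_RATE_2X2
                                     : D3D12_SHADING_RATE_1X1,
                              combiners);
}
```

Tier 2 hardware can additionally take a screen-space shading-rate image, which is closer to how games apply VRS selectively, but the per-draw call above is the simplest "on/off" form.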
 
What would 1080p being over actually mean?

I haven't listened to this yet, so I don't know the specific context here, but in general when the 1080p discussion gets brought up, I feel it's another example of the disconnect: enthusiasts often don't seem to realize that the PC DIY retail segment is only a small subset of overall PC gaming hardware, and the smallest subset at that.

I believe roughly half of discrete GPUs (possibly slightly more) go into laptops, which means they might not even be connected to a standalone monitor.

As an aside, circling back to the previous point, this doesn't mean the other half of discrete GPUs are retail DIY either, as the majority of desktop discrete GPU sales go through prebuilts and system integrators.
 
DF is referring to 1080p monitors. Their disappearance doesn't necessarily mean the resolution itself won't have any utility going forward. Popular twitch-based titles will always have a player base that favors frame rate over resolution.
 
DF themselves talked about a new dual-mode OLED monitor, native 4K 240Hz and native 1080p 480Hz, which imho could be the best solution, since most people like 1080p when it's native and there are things you can't achieve at 4K but can at 1080p. Dunno how they get dual native resolutions, but I guess they are just limiting the window size within the display?

The native 4K 50" Samsung TV I have has a feature I find super useful (thanks to the B3D forumer who suggested it): you can set a native 32:9 ultrawide (3840x1080) resolution, which even Windows recommends as the resolution you should use. It's the most useful thing I've ever found on a display.

I always use two monitors, and with that feature it's problem solved. Plus, being a TV, it has built-in sound, so no speakers and no second display anymore.

Another great thing about that feature is that you can play games at regular 4K 16:9 or at 32:9 (3840x1080). Some games work better at 16:9, others at 32:9. Some of the games compatible with 32:9 ultrawide can look really great, not to mention the improved horizontal FOV; the image gets so wide in some games that you see a lot more.
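To put a rough number on the FOV gain: in games that scale Hor+ (vertical FOV held fixed, horizontal derived from the aspect ratio), 32:9 works out to a much wider horizontal view than 16:9. A quick stand-alone calculation; the 60° vertical FOV is just an assumed example value:

```cpp
// Rough Hor+ FOV comparison for 16:9 vs 32:9 at the same vertical FOV.
// Assumes the game derives horizontal FOV from the aspect ratio (Hor+ scaling);
// the 60-degree vertical FOV is only an illustrative value.
#include <cmath>
#include <cstdio>

double horizontalFovDeg(double verticalFovDeg, double aspect)
{
    const double pi = 3.14159265358979323846;
    double v = verticalFovDeg * pi / 180.0;
    return 2.0 * std::atan(aspect * std::tan(v / 2.0)) * 180.0 / pi;
}

int main()
{
    const double vfov = 60.0;  // assumed vertical FOV
    std::printf("16:9 -> %.1f deg horizontal\n", horizontalFovDeg(vfov, 16.0 / 9.0));
    std::printf("32:9 -> %.1f deg horizontal\n", horizontalFovDeg(vfov, 32.0 / 9.0));
    // Prints roughly 91.5 degrees for 16:9 and about 128 degrees for 32:9.
}
```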

Also, productivity-wise, splitting the image in half is a breeze and doesn't require any tinkering like giving more space to one side than the other and so on.

The res is native 'cos the TV just adds two huge black bars at the top and the bottom, so you also save some energy. Having native 3840x1080, it's very rare that I use native 4K, but sometimes it's useful for certain things.

The DF video where they mention the dual native 1080p 480Hz / native 4K 240Hz monitor.

 
You can double up pixels on 2160p to get 1080p. It'd be exactly the same as that sized screen at 1080p, with a 2x2 block of 2160p pixels occupying the same area as one larger 1080p pixel. Basically turn upscaling to nearest neighbour and job done!
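To illustrate what that nearest-neighbour doubling means in practice, here's a toy sketch (nothing monitor-specific, just the mapping): every 1080p source pixel is copied into a 2x2 block of the 2160p target, so four physical pixels show one colour.

```cpp
// Toy nearest-neighbour 2x upscale: every 1080p source pixel is copied into
// a 2x2 block of the 2160p target, so four panel pixels show the same colour.
#include <cstdint>
#include <vector>

std::vector<uint32_t> upscale2x(const std::vector<uint32_t>& src, int srcW, int srcH)
{
    std::vector<uint32_t> dst(static_cast<size_t>(srcW) * 2 * srcH * 2);
    for (int y = 0; y < srcH * 2; ++y)
        for (int x = 0; x < srcW * 2; ++x)
            dst[static_cast<size_t>(y) * srcW * 2 + x] =
                src[static_cast<size_t>(y / 2) * srcW + x / 2];
    return dst;
}

// e.g. upscale2x(frame1080p, 1920, 1080) yields a 3840x2160 image.
```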
 
So do you mean that four pixels show the same colour, and thus act as one pixel?

I also wonder how they are going to achieve 4K 240Hz with either DisplayPort 1.4 or HDMI 2.1.
 
I don't think the monitor is being released until later this year, so I don't think anyone knows the specifics yet. But presumably it would operate four pixels as one in 1080p mode.

HDMI 2.1 with DSC can in theory support 4K 240Hz with HDR.
 
They will probably use DisplayPort 2.1a then, I guess. Although if the specs say the monitor has DisplayPort 1.4, achieving 4K 240Hz without black frame insertion might not be possible.
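As a back-of-the-envelope sanity check on the bandwidth question (the effective link rates below are the commonly quoted figures; the blanking overhead and the DSC ratio are assumptions for illustration):

```cpp
// Back-of-the-envelope bandwidth check for 4K 240Hz, 10-bit RGB.
// Commonly quoted effective link rates: HDMI 2.1 FRL ~42.7 Gbps,
// DP 1.4 HBR3 ~25.9 Gbps, DP 2.1 UHBR20 ~77.6 Gbps.
// The ~10% blanking overhead and ~3:1 DSC ratio are assumptions.
#include <cstdio>

int main()
{
    const double activePixels = 3840.0 * 2160.0 * 240.0;  // pixels per second
    const double blanking     = 1.10;                      // assumed ~10% overhead
    const double bitsPerPixel = 30.0;                      // 10 bits per channel, RGB

    double uncompressedGbps = activePixels * blanking * bitsPerPixel / 1e9;
    double dscGbps          = uncompressedGbps / 3.0;      // assumed ~3:1 DSC

    std::printf("Uncompressed: ~%.0f Gbps (over HDMI 2.1's ~42.7 and DP 1.4's ~25.9, "
                "under DP 2.1 UHBR20's ~77.6)\n", uncompressedGbps);
    std::printf("With ~3:1 DSC: ~%.0f Gbps (fits comfortably on HDMI 2.1)\n", dscGbps);
}
```

So the posts above line up: without DSC neither HDMI 2.1 nor DP 1.4 has the bandwidth for 4K 240Hz at 10-bit, with DSC it comfortably fits HDMI 2.1, and DP 2.1 UHBR20 could in principle carry it even uncompressed.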
 

Sucks that we're going to be saddled with another generation of Unreal Engine games that constantly stutter to shit on PC. The game (Suicide Squad) apparently "precompiles" on launch too... but it doesn't seem to do anything.

I'm just so tired of studios/publishers releasing games like this. It's so god damn frustrating. It's honestly hard to look forward to any game that is known to be using that engine these days.


If there were a position within Digital Foundry solely to create videos focusing on stuttering and hitching in PC games and to keep holding publishers/studios accountable until they fixed it... I'd apply for it. :mad:
 
I wonder how it looks vs a native 1080p screen then. Still, imho, the smartest solution for having two monitors in one, and I mean that literally, is doing what Samsung did with their TVs. You get two displays in one: not only do you get the screen "surface" of two displays in ultrawide mode, but you also get native 4K and native 32:9 (3840x1080), plus a 21:9 resolution (2560x1080, though that one isn't native), so yeah, the best of both worlds.

You have to enable Game Mode and set the source as PC, and there you have it: 4K or 32:9 resolution. In fact, Windows sets 3840x1080 as the recommended resolution.

For me it's a win-win, 'cos I always used two displays and now I don't need them anymore. So less energy needed, a single screen, and both native 4K 16:9 and native 32:9 (3840x1080).

It also saves me a lot of processing power when switching to 32:9, as you are halving the resolution from 3840x2160 to 3840x1080 without losing detail (in fact you gain a lot of lateral detail), 'cos both are natively rendered. What's not to love? :)
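Just to put numbers on the "half the processing power" point (GPU load doesn't scale perfectly linearly with pixel count, but the per-pixel shading work roughly halves):

```cpp
// Pixel-count comparison between 4K 16:9 and 32:9 (3840x1080).
#include <cstdio>

int main()
{
    const long long px4k   = 3840LL * 2160;  // 8,294,400 pixels
    const long long pxWide = 3840LL * 1080;  // 4,147,200 pixels
    std::printf("4K 16:9: %lld px, 32:9: %lld px (%.1fx fewer shaded pixels)\n",
                px4k, pxWide, static_cast<double>(px4k) / pxWide);
}
```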

Another advantage is that you can switch between both resolutions seamlessly, either in Windows itself via Settings when the game is in borderless mode, or in-game if the game is set to fullscreen (my least favourite option, but it does the job).

Pics from my mid-to-low-range phone of how it looks on the 50" TV.

Native 4K with XeSS Balanced -via mod-, RT High.

[Image: B4mbRCf.jpeg]


32:9 3840x1080, same settings as the 4K image.

[Image: Sw7L5EZ.jpeg]


When I save enough, in the very distant future (not in a hurry), I want to get an OLED monitor with similar functionality, since for me this has been a great discovery thanks to a B3D forumer (bless him, I didn't know I had this option on the TV). This TV pensioned off :) my much-loved 165Hz monitor from 2019 and my 32" native 1080p TV from 2013. I love my 165Hz display, but the advantages of this kind of display are so great by comparison... and it comes with sound, so no external speakers.
 
The best way to deal with IPS glow is to calibrate it to ~100 nits.

Failing that, you'll need to move to an IPS monitor with 500+ dimming zones and FALD.
Yep. Back in the day, a gaming laptop I had (the one I completed Doom 2016 on, with its GTX 1050 Ti) had this issue, or rather the blacks looked grey. IPS screens haven't appealed to me since.

Aside from my AMOLED phone screen, I've never had an OLED screen; it has to be amazing. My current TV has a VA panel with FALD and okay HDR (unlike my monitor, whose HDR I keep off: HDR 500 or not, it just doesn't look good, though SDR looks fine. My TV's SDR, on the contrary, looks overbright and hurts the eyes, but when you enable HDR it looks much better), and the blacks are very decent. I know I'm missing out compared to an OLED, but I'm not in a hurry to get one.

@troyan kinda saddening. While I've always had my doubts about UE for some reason, maybe because of the days when UE was incompatible with Xenos' AA implementation, plus the generic-looking marines and grey-ish colours in most games, I was kinda hyped about UE5, and now I'm saddened that games like a future The Witcher have to run on it :/
 