Digital Foundry Article Technical Discussion [2024]

I wasn't talking about brightness. I was replying to davis.anthony's position...


What is this new display tech that doesn't use sample-and-hold?

The sample and hold problem is related to brightness like troyan pointed out. Once you have pixels that are bright enough you can reduce the “hold” that causes persistence blur. The tech to reduce this hold already exists - BFI. The missing piece is bright enough pixels that don’t need to stay on for an entire refresh interval in order to produce sufficient average brightness.

A 120hz display that’s “on” for 1ms and “off” for 7.33ms will likely look and feel better than a 480hz sample and hold display that’s “on” for the entire 2ms refresh interval with frame gen enabled. But it needs to be bright enough during that 1ms.
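Rough arithmetic behind that (a minimal sketch, assuming average brightness scales with the fraction of the refresh interval the pixel is actually lit; the 400-nit target is just an illustrative number, not a spec):

# Persistence vs. brightness trade-off, using the numbers from the post above.
# Assumption: motion persistence ~ time the image is held per refresh, and
# average brightness ~ peak brightness * duty cycle (fraction of time lit).

def persistence(refresh_hz, on_time_ms, target_avg_nits):
    frame_ms = 1000.0 / refresh_hz           # full refresh interval
    duty = on_time_ms / frame_ms             # fraction of the interval lit
    peak_needed = target_avg_nits / duty     # peak nits for the same average
    return frame_ms, duty, peak_needed

# 120 Hz, lit for 1 ms, dark for the remaining ~7.33 ms:
print(persistence(120, 1.0, 400))            # ~8.33 ms interval, ~12% duty, ~3300 nits peak
# 480 Hz sample-and-hold, lit for the whole ~2.08 ms interval:
print(persistence(480, 1000.0 / 480, 400))   # ~2.08 ms interval, 100% duty, 400 nits peak

So the 1ms pulse gives the better motion clarity of the two, but only if the panel can be roughly eight times brighter during that pulse.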
 
We've got Nvidia Pulsar coming, which will allow strobing to operate with variable refresh rate. Other attempts at that haven't been great. Strobing is difficult to do without crosstalk even at fixed refresh rates, and with strobing you want vsync, so you end up with the vsync penalty at a fixed refresh rate. So it has image quality problems, input latency problems and all of the issues of an LCD panel. I have no idea if there are any strobing technologies that work with local dimming and HDR; I doubt Nvidia Pulsar will touch that.

BFI halves your brightness. On OLED panels you give up half of your refresh windows to displaying a black image, and with them half of your light output, so HDR is out. Panels are getting brighter, but only incrementally, because they have to work around pixel lifespan, burn-in etc. The other thing with BFI is that you can't really play around with your duty cycle like you can with strobing; with BFI you're stuck at 50%: the image is displayed half the time and the display is off half the time. With strobing you can do things like have the backlight on 20% of the time and off 80% of the time to get even greater reductions in motion blur, though that has diminishing returns on LCD because of ghosting and loss of brightness.
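Putting rough numbers on the duty-cycle point (same simplified model as the sketch above; the 240Hz and 400-nit figures are just illustrative):

# MPRT and brightness as a function of duty cycle (fraction of the refresh
# interval the image is actually shown). Illustrative numbers only.

def strobe(refresh_hz, duty, peak_nits):
    frame_ms = 1000.0 / refresh_hz
    mprt_ms = frame_ms * duty        # persistence of each image
    avg_nits = peak_nits * duty      # brightness cost of the dark period
    return mprt_ms, avg_nits

peak = 400
print(strobe(240, 1.0, peak))   # sample-and-hold: ~4.17 ms, 400 nits
print(strobe(240, 0.5, peak))   # BFI at a fixed 50% duty: ~2.08 ms, 200 nits
print(strobe(240, 0.2, peak))   # strobed backlight at 20%: ~0.83 ms, 80 nits

Shorter pulses buy motion clarity at a linear cost in brightness, which is exactly the diminishing-returns trade-off described above.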

MicroLED is just stagnating. They can't get it working except on extremely large and extremely expensive displays. In five years, maybe ten, they'll get something like that working and have much more brightness to handle BFI. The biggest driver of panel tech is film and TV, so that's where the investment is; it's not coming from niche PC gaming.

What we're left with is more of a brute-force approach on PC: you render really high frame rates on existing panel technology. The big wins in power efficiency will come from frame generation until some better solution comes along. A 480Hz OLED panel will have 2.08ms MPRT, which is very close to the MPRT of a CRT display (something like 1.5ms). The fact that it's sample and hold really doesn't matter at that point. The motion blur on a 480Hz OLED and a CRT are going to be very, very close, but the OLED will have much better brightness, HDR and motion-handling.

There are 480Hz OLEDs that do BFI at 240Hz. They should have 1.04ms MPRT, but they sacrifice HDR and lose some motion-handling. It's a very good option if you want to play in SDR only and reduce the burden on your PC. I honestly think the best option is to target 240Hz or higher and frame-gen as close to 480Hz as you can get, to preserve HDR.

That's the reality of the landscape. If there were better display tech out there, we'd be hearing about it. I know the panel makers have roadmaps quite far out; you can see leaks on TFTCentral etc. I don't think there's anything revolutionary coming soon, but I really hope something surprises me in the next five years. All I know is that high frame rates are worth it if you can afford them. John mentioned lowering Quake 2(?) RTX down to 720p to get 480Hz and said it's worth it. I feel the same way about it.
 
The sample and hold problem is related to brightness like troyan pointed out. Once you have pixels that are bright enough you can reduce the “hold” that causes persistence blur. The tech to reduce this hold already exists - BFI. The missing piece is bright enough pixels that don’t need to stay on for an entire refresh interval in order to produce average brightness.

A 120hz display that’s “on” for 1ms and “off” for 7.33ms will likely look and feel better than a 480hz sample and hold display that’s “on” for the entire 2ms refresh interval with frame gen enabled. But it needs to be bright enough during that 1ms.


I think we're quite a ways from getting OLEDs that are bright enough to handle 3 black frames per displayed frame unless you're solely looking at SDR. In that case you'd be buying a 480Hz panel and displaying a 120Hz signal with 1/4 of the panel's brightness capability. BFI is a pretty good option for SDR on an OLED if you're willing to halve your refresh rate while preserving motion clarity. Hopefully they'll get BFI working with VRR sooner rather than later.
 

DF Direct Weekly #188: Death Stranding Xbox, God of War Ragnarök PS5 Pro, Switch 2 Back Compat!

0:00:00 Introduction
0:01:08 News 1: God of War Ragnarök PS5 Pro patch tested!
0:13:36 News 2: Nintendo confirms Switch 2 backwards compatibility
0:25:27 News 3: Death Stranding lands on Xbox
0:38:11 News 4: No Man’s Sky patched for PS5 Pro
0:47:49 News 5: Sony INZONE M10S impressions
1:05:53 News 6: Sonic Generations can run at 60fps on Switch
1:11:42 Supporter Q1: Will you use the 9800X3D for high-end gaming benchmarks?
1:16:21 Supporter Q2: Will PS6 use 3D V-Cache?
1:17:32 Supporter Q3: Will Alex and John switch to 9800X3D?
1:22:15 Supporter Q4: Why is Game Boost falling short of the promised 45% increase to PS5 Pro raster performance?
1:29:21 Supporter Q5: Would AI frame extrapolation make native frame-rate unimportant?
1:32:30 Supporter Q6: What do you want out of a Steam Deck 2?
1:38:32 Supporter Q7: Why didn’t you spend more time analyzing the PS5 Pro box?

Losing 10 to 20fps, or roughly 2ms, when using PSSR: does that indicate it's running on the compute units?
 
Losing 10 to 20fps, or roughly 2ms, when using PSSR: does that indicate it's running on the compute units?
No, no scaler is free, regardless of whether it runs on shader units or matrix crunchers. Gains depend on several factors, including how much you gain by lowering the rendering resolution (if you even lower it).
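As a back-of-the-envelope for why "not free" can still be a net win (hypothetical millisecond figures for illustration, not PSSR measurements):

# Net gain from an upscaler that costs a fixed amount per frame: it depends
# entirely on how much you save by rendering fewer pixels first.
# Rough assumption: resolution-bound work scales with pixel count.

def net_gain_ms(native_ms, internal_scale, upscale_cost_ms):
    lowres_ms = native_ms * internal_scale ** 2   # render at reduced resolution
    return native_ms - (lowres_ms + upscale_cost_ms)

print(net_gain_ms(16.7, 0.75, 2.0))  # 75% axis scale: roughly +5.3 ms saved per frame
print(net_gain_ms(16.7, 1.0, 2.0))   # no resolution drop: -2.0 ms, pure overhead

If the internal resolution isn't lowered at all, the scaler is pure overhead, matrix hardware or not.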
 
Spider-Man 2 upgrades for PS5 Pro: ray-traced shadows, ambient occlusion and improved reflections, plus screen-space global illumination. Bodes well for the upcoming PC version.

Two things to note: reflections are still not on par with the PC versions of previous Spider-Man games, suggesting the PS5 Pro is still not as powerful as PC ray tracing hardware; also, screen-space global illumination suggests an abundance of compute resources more than ray tracing resources. Or maybe I'm reading too much into this.

Secondly, PS5 Pro has too many graphics options now! Almost like a stripped down PC version!

 
Ragnarök performance on the Pro is pretty interesting in that unlocked VRR mode. Obviously a more direct comparison would be needed using the same scene and exactly matched settings, but based on the 80-100fps seen in the DF TAA test it could be performing somewhere around the 3080-4070 Ti window. Obviously this could be way off depending on the scene, but it would be great to see a proper head-to-head here.

Granted we should expect higher performance on a Sony native game ported to PC vs your average cross platform 3rd party title.

 
No, no scaler is free, regardless of whether it runs on shader units or matrix crunchers.
Well, a dedicated hardware scaler is, and that is vaguely implied by one of the sources. However, we can't 100% discount that from this info, as we don't know what else is affecting render performance.
 
Ragnarök performance on the Pro is pretty interesting in that unlocked VRR mode. Obviously a more direct comparison would be needed using the same scene and exactly matched settings, but based on the 80-100fps seen in the DF TAA test it could be performing somewhere around the 3080-4070 Ti window. Obviously this could be way off depending on the scene, but it would be great to see a proper head-to-head here.

Granted we should expect higher performance on a Sony native game ported to PC vs your average cross platform 3rd party title.

The game on PS5 already performs close to a 3070 Ti. With the Pro's better GPU, we should indeed be getting into 4070 Ti territory.
 
Does DF have the ability to pixel count VR games? Because there is something going on with GT7 VR on the Pro. It definitely looks sharper than on the OG PS5, even without a patch.
 
As a longtime video enthusiast, I can assure you there's no magic display tech on the horizon. MicroLED is nowhere close to commercial reality even for enthusiasts, so iterations of OLED are our best bet.

The mass consumer just wants a bigger set; quality is well down the list, and they definitely don't want to pay a premium. So good luck introducing that new technology.
 
Well, a dedicated hardware scaler is, and that is vaguely implied by one of the sources. However, we can't 100% discount that from this info, as we don't know what else is affecting render performance.
A dedicated scaler is not free either, because at a certain point in the rendering everything must pause for upscaling to occur before proceeding further.
Dedicated AI hardware can be used in parallel with the standard rendering pipeline for things like async queues for RT denoising, frame generation, physics etc. But for anti-aliasing and upscaling, things largely have to wait until that part of the rendering is completed before moving on to post-processing.
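A toy timeline makes the point (purely hypothetical stage timings, only to illustrate the dependency): work like denoising or frame generation can overlap other GPU work on a separate queue, but an upscale/AA pass sits on the critical path between rendering and post-processing, so even dedicated hardware adds its time to the frame.

# Toy frame-timeline sketch. All numbers are hypothetical; the point is only
# that a serial upscale stage adds to the critical path, while truly async
# work can hide behind it.

render_ms     = 10.0  # geometry + lighting at the internal resolution
upscale_ms    = 1.5   # AA/upscale: post-processing has to wait for this
post_ms       = 2.0   # post-processing at output resolution
async_work_ms = 3.0   # e.g. RT denoise / physics on a separate queue

critical_path_ms = render_ms + upscale_ms + post_ms   # serial stages
frame_ms = max(critical_path_ms, async_work_ms)       # async hides if shorter

print(f"critical path {critical_path_ms:.1f} ms, frame {frame_ms:.1f} ms")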
 
As a longtime video enthusiast, I can assure you there's no magic display tech on the horizon. MicroLED is nowhere close to commercial reality even for enthusiasts, so iterations of OLED are our best bet.

The mass consumer just wants a bigger set; quality is well down the list, and they definitely don't want to pay a premium. So good luck introducing that new technology.
MiniLED looks quite good; the differences between my QM8 2024 and my older A90J are pretty minor, and the brightness advantage of miniLED more than makes up for them.
 
Well, a dedicated hardware scaler is, and that is vaguely implied by one of the sources. However, we can't 100% discount that from this info, as we don't know what else is affecting render performance.
They're not realtime either are they? So they'd still add a tad of latency if nothing else?
MiniLED looks quite good; the differences between my QM8 2024 and my older A90J are pretty minor, and the brightness advantage of miniLED more than makes up for them.
MiniLED is just another type of backlighting for LCD. MicroLED, like OLED, has no backlight.
 
A dedicated scalar is not free either because at a certain point in the rendering everything must pause for upscaling to occur before proceeding further.
I'm thinking at the end, after all rendering and before final output, like a TV upscaler. That'd include upscaling the UI. Alternatively, you'd have it sit after the render of the framebuffer, with a separate UI buffer, which existed in hardware on PS360.
 
I'm thinking at the end, after all rendering and before final output, like a TV upscaler. That'd include upscaling the UI. Alternatively, you'd have it sit after the render of the framebuffer, with a separate UI buffer, which existed in hardware on PS360.
It would still add input lag if nothing else. That's why TV 'Game modes' are supposed to turn the scaler off, too (or at least as much of it as they can).
 