Digital Foundry Article Technical Discussion [2024]


Alex takes a look at Forza Motorsport on PC with an RT mod enhancing the graphics.


Edit: Now that I'm watching the video, it's just baffling to me why some of these improvements aren't officially supported and enabled by the game. The grass distance, for one... why is that not the default Ultra setting? Why can't we have options to push fidelity out further? Yes, I know people would just complain about Ultra being too demanding... but that's a mindset we have to begin to change in PC gaming, IMO. We should normalize "High" settings that target current top-end GPU hardware, and "Ultra" settings that target future GPUs.
Sure it looks a lot better with ray tracing, but Gran Turismo 7's car and material rendering is still much better even without ray tracing.
 
Yea but we're talking about Forza here.

Forza Horizon feels like the series to get excited about if you're looking for a GT competitor. Fable looks phenomenal, so I'm excited to see what's transferred over to Horizon 6 (even if it'll presumably come out before Fable).
 
Sure it looks a lot better with ray tracing
I wouldn't even say that. It looks better in some small ways, but the overall presentation still looks very similar. For me, the performance hit wouldn't be worth it here if I didn't have a bunch of spare headroom, even assuming all the noise/issues were completely ironed out. Heck, I'd probably sooner have the increased draw distances than the RTGI.

And as you say, the real gap in visual quality seems to come from Forza's inferior material shaders for the cars, which haven't really changed much with this mod outside of some small differences in reflectivity. I think FM easily looks better than GT in terms of overall environments, though.
 

DF Direct Weekly #172: PS5 Pro Still Coming in 2024? No Man's Sky Mega Update, Intel CPU Instability


0:00:00 Introduction
0:01:24 News 01: Will PS5 Pro release in 2024?
0:11:59 News 02: No Man's Sky updated with new visual features
0:20:12 News 03: Intel CPU stability issues may run deep
0:34:54 News 04: GPU Open tech presentations with Alex
0:48:58 News 05: John's next project: PlayStation vs Saturn
0:57:48 News 06: Can #StutterStruggle be overcome in Unreal Engine?
1:09:56 News 07: Gears of War Ultimate Edition suffers poor performance on PC
1:17:58 News 08: Playing Death Game Hotel with SWERY
1:25:05 Supporter Q1: What would you like to see in a hypothetical DirectX 13?
1:30:20 Supporter Q2: Could a new Nvidia Shield device enhance game streaming with local processing?
1:36:58 Supporter Q3: Are large decreases in frame-rates still impactful when targeting high FPS?
1:45:54 Supporter Q4: What should we expect from the mooted handheld Xbox?
1:53:15 Supporter Q5: Is releasing a game as an early access title a good idea?
1:58:01 Supporter Q6: Will Switch 2's hardware decompression engine make its way to PC?
2:01:47 Supporter Q7: Have any pivotal decisions altered your gaming destinies?
 
Supporter Q1: What would you like to see in a hypothetical DirectX 13?

Strict hardware requirements for GPUs to achieve certification, like we had in the DX7, 8, and 9 era.

DX12 has been a mess in this regard, with some older GPUs being DX12 but not fully DX12 compliant, and then there was the bullshit with DX12U. Why not just call it DX13?

And a minimum VRAM amount should be set for GPUs.
 
Strict hardware requirements for GPUs to achieve certification, like we had in the DX7, 8, and 9 era.

DX12 has been a mess in this regard, with some older GPUs being DX12 but not fully DX12 compliant, and then there was the bullshit with DX12U. Why not just call it DX13?

And a minimum VRAM amount should be set for GPUs.
New numbered DirectX versions traditionally came with new Windows versions, and back in March 2020 Microsoft was still claiming (and the DirectX team likely still believing) that there wouldn't be a Windows 11. Also, what should the minimum VRAM be? 8GB is still perfectly acceptable for 1080P, which is what the majority of gamers are still playing at. And DirectX is used for more than just the latest and greatest AAA games.
 
New numbered DirectX versions traditionally came with new Windows versions, and back in March 2020 Microsoft was still claiming (and the DirectX team likely still believing) that there wouldn't be a Windows 11.

Since DX11 launched, we have had Windows 8, 8.1, 10, and 11.

Microsoft were already well behind before they even got to Windows 10.

Also, what should the minimum VRAM be? 8GB is still perfectly acceptable for 1080P, which is what the majority of gamers are still playing at. And DirectX is used for more than just the latest and greatest AAA games.

8GB is acceptable now (just about), but DX13 isn't coming out now.
 
New numbered DirectX versions traditionally came with new Windows versions, and back in March 2020 Microsoft was still claiming (and the DirectX team likely still believing) that there wouldn't be a Windows 11. Also, what should the minimum VRAM be? 8GB is still perfectly acceptable for 1080P, which is what the majority of gamers are still playing at. And DirectX is used for more than just the latest and greatest AAA games.
8GB is acceptable for esports titles, but those are not the games a new version of DirectX will be focused on.
 
Since the last DX11 we have had Windows 8, 8.1, 10 and 11.

Microsoft were well behind before they even got to Windows 10.



8GB is acceptable now (just about), but DX13 isn't coming out now.
8GB is acceptable for esports titles, but those are not the games a new version of DirectX will be focused on.
Microsoft set 4GB of system memory as the minimum requirement for Windows 11. That's not even enough to browse the web with multiple tabs open. In all likelihood, if DirectX 13 comes out anytime in the next decade and has a minimum VRAM requirement, it would only require 6GB, or even less. And that's not even counting APUs, SoCs, and iGPUs.

The pretense that DirectX 11 and 12 would coexist died with DX12U. Any hypothetical DX13 would completely replace DX12U, so it would have to scale up and down across the entire range of Windows devices and applications.
 
Strict hardware requirements for GPU's to achieve the required certification, like we had in DX7, 8 and 9 era.
LOL, not even Microsoft themselves have extensively documented specifications for those iterations of DirectX. Developers were still relying on hardware vendors to cover the many 'holes' left by Microsoft, either due to unclear rules/wording or the lack of conformance testing ...

Microsoft only started taking serious steps to formalize and do major clean-ups of their API specifications with D3D10 ...
DX12 has been a mess in regard, with some older GPU's being DX12 but not fully DX12 compliant, and then there was the bullshit with DX12U, why not call it DX13?
The feature matrix of extensions has been a thing since around D3D9 so I don't know what you find so offensive about it with D3D12 ...

'Ultimate' being advertised as an extension rather than a clean break was a purely pragmatic design choice. None of the new features really necessitated a clean-sheet foundation. The mesh shading pipeline is just a compute-like shading stage that can feed the rasterizer its own data, sampler feedback is mostly a software-implemented feature, variable rate shading is just a special pixel shader state/mode, and even RT reused prior infrastructure/concepts such as PSOs and bindless. So we didn't really need an entirely new API to express these features when they didn't fundamentally alter the way we do GPU programming ...
 
LOL, not even Microsoft themselves have extensively documented specifications for those iterations of DirectX. Developers were still relying on hardware vendors to cover the many 'holes' left by Microsoft, either due to unclear rules/wording or the lack of conformance testing ...

Microsoft only started taking serious steps to formalize and do major clean-ups of their API specifications with D3D10 ...

It was pretty simple: you couldn't say your DX7 GPU was a DX8 one, could you?

You couldn't claim your DX10 GPU was a DX11 one, could you?

The feature matrix of extensions has been a thing since around D3D9 so I don't know what you find so offensive about it with D3D12

Allowing 2012-era DX11 GPUs to be 'classed' as DX12 ones is wrong and causes confusion in the consumer space. It leads people to think they don't need to upgrade to a more modern GPU, and it can slow down the adoption rate of newer GPUs.

So yea, there's a big potential for a knock-on effect.

'Ultimate' being advertised as an extension rather than a clean break was a purely pragmatic design choice.

Which further adds to consumer confusion, as the hardware capabilities of a DX12 GPU and a DX12U one are very, very different.

None of the new features really necessitated a clean sheet foundation.

Hardware RT
VRS
Mesh Shaders
Sampler Feedback

That is a lot of hardware feature-set additions, and it could easily have justified a new DX version.

The mesh shading pipeline is just a compute-like shading stage that can feed the rasterizer its own data, sampler feedback is mostly a software-implemented feature, variable rate shading is just a special pixel shader state/mode, and even RT reused prior infrastructure/concepts such as PSOs and bindless, so we didn't really need an entirely new API to express these features when they didn't fundamentally alter the way we do GPU programming ...

None of which are supported on baseline DX12 GPUs.

Arguably the largest addition to DX11 was tessellation, and yet Microsoft didn't just release that as DX10.2.
 
MS had a reason for stopping the numbering when they reached DX12. LOL, maybe some of the old, knowledgeable forumers here can remember and answer that?
 
MS has a reason why they stopped numbering when they reached DX12. LOL, maybe some old, knowledgeable forumers can answer that, remember?
IIRC, in the old days DX was largely bound to Windows OS versions, with each release being an iterative improvement over the previous one. But DX1 through 11 are largely the same render path; at least there's significantly more similarity between 9 through 11. In the end MS stopped after DX11 to keep that as their high-level API, with DX12 as their low-level API. If there were a DX13, I would agree with Lurkmass: it would be a different rendering path than we have today. I assume at this point they will just keep adding to DX12 and DX11, unless something comes along that would make a completely new render path make more sense in the future.

There's not really any rhyme or reason behind the numbering system here. It's really just an extension of marketing, but it seems to have settled into high-level (DX11) and low-level (DX12).
 
IIRC in the old days the DX was bound to Windows OS versions largely, with each release being iterative improvements over the previous. But DX1 through 11 are largely the same render path, at least there's significantly more similarity between 9 through 11. In the end MS stopped after DX11, to keep that as a high level API. And DX12 as their low level API. If there were a DX13, I would agree with Lurkmass, it would be a different rendering path than we have today. I assume at this point in time, they will just add more to DX12 and DX11 unless there is something out there that would suggest a completely new render path would make more sense in the future.

There's not really any rhyme or reason behind the numbering system here - it's really just an extension of marketing, but it seems to have settled to high level (DX11) and low level (DX12)
None of the DX12U features have been added to DX11. Like the Khronos Group with OpenGL, it seems that Microsoft has decided to leave its high-level API behind.
 