davis.anthony
Veteran
Raytracing + Mesh shading could’ve easily been marketed as DX13.
I agree completely.
Direct3D APIs have almost always had some semblance of hardware backwards compatibility. The only major D3D API iteration in recent memory that tried to break this trend was D3D10, with very radical changes to state management (no fixed function & separate shader/state objects) along with groundbreaking new features, both for the worse (geometry shaders/stream out) and for the better (compute shaders/dual source blending) later on (D3D10.1).
It was pretty simple: you couldn't say your Dx7 GPU was a Dx8 one, could you?
You couldn't claim your Dx10 GPU was a Dx11 one, could you?
Ultimately, it's up to the developers to define their hardware requirements to drive up new hardware adoption. The gfx API isn't the end game here; it's the software.
Allowing 2012 Dx11 GPUs to be 'classed' as Dx12 ones is wrong and causes confusion in the consumer space: it leads people to think they don't need to upgrade to a more modern GPU, and it can/could slow down the adoption rate of newer GPUs.
So yeah, there's a big potential for a knock-on effect.
It's not about the 'number' of new features, since there were already tons of small feature updates ever since the initial release of D3D12 and before 'Ultimate'. An API-defining feature is about pushing the limits of existing programming models, which none of the above does very much.
Which further adds to the confusion for consumers, as the hardware capabilities of a Dx12 GPU and a Dx12U one are very, very different.
Hardware RT
VRS
Mesh Shaders
Sampler Feedback
That is a lot of hardware feature-set additions, and it certainly could have easily justified being a new Dx.
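For what it's worth, the gap is easy to see in code. Here's a rough sketch (assuming you already have an ID3D12Device and a recent Windows SDK; not lifted from any shipping engine) that asks the driver about those four features through CheckFeatureSupport:

```cpp
// Sketch: query the four DX12 Ultimate hardware features on an existing device.
// Error handling trimmed for brevity.
#include <d3d12.h>
#include <cstdio>

void ReportDx12UltimateSupport(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};  // raytracing tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};  // variable-rate shading tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};  // mesh shader + sampler feedback tiers
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7));

    // DX12 Ultimate (feature level 12_2) needs DXR 1.1, VRS Tier 2,
    // mesh shaders, and sampler feedback 0.9+.
    std::printf("Hardware RT:      %s\n", opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1 ? "yes" : "no");
    std::printf("VRS Tier 2:       %s\n", opts6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2 ? "yes" : "no");
    std::printf("Mesh shaders:     %s\n", opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1 ? "yes" : "no");
    std::printf("Sampler feedback: %s\n", opts7.SamplerFeedbackTier >= D3D12_SAMPLER_FEEDBACK_TIER_0_9 ? "yes" : "no");
}
```

A 2012-era 'Dx12' card comes back with "no" on all four, which is exactly the consumer-confusion problem.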
Microsoft couldn't release a hypothetical 'D3D10.2' because they abandoned the reference-counting behaviour in D3D11, but I assure you that D3D10 and D3D11 are undeniably more similar API designs than you would initially believe, and that's the assessment shared by other graphics programmers as well if you ask them.
None of which are supported on Dx12 GPUs.
Arguably the largest addition to Dx11 was Tessellation, and yet Microsoft didn't just release it as Dx10.2.
I think perhaps one day it should. But as it is implemented today, perhaps not? As Lurkmass writes, they haven't changed the way they submit work to the GPU.
You don't think ray/path tracing falls into that?
With how the CPUs are making the calls to the GPUs today, no, I don't think it would fall into that. Work submission, memory management, and synchronization are completely different on DX12.
None of the DX12U features have been added to DX11. Like the Khronos Group with OpenGL, it seems that Microsoft has decided to leave its high-level API behind.
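To make the "completely different" bit concrete, here's a bare-bones sketch of a DX12 frame submission. The calls are real d3d12.h entry points, but device/resource setup and error handling are stripped out, so treat it as an illustration rather than engine code. On DX11 the runtime and driver did all of this behind an immediate-context Draw call; on DX12 the application records, submits, and fences explicitly:

```cpp
// Sketch of DX12's explicit submission model (setup and error handling omitted).
#include <windows.h>
#include <d3d12.h>

void SubmitFrame(ID3D12CommandQueue* queue, ID3D12CommandAllocator* allocator,
                 ID3D12GraphicsCommandList* list, ID3D12Fence* fence,
                 HANDLE fenceEvent, UINT64& fenceValue)
{
    // 1. Record: the application builds the command buffer itself.
    allocator->Reset();
    list->Reset(allocator, nullptr);
    // ... ResourceBarrier / SetPipelineState / DrawInstanced calls go here ...
    list->Close();

    // 2. Submit: hand the recorded list to the queue explicitly.
    ID3D12CommandList* lists[] = { list };
    queue->ExecuteCommandLists(1, lists);

    // 3. Synchronize: the application owns CPU/GPU sync via fences;
    //    there is no implicit driver-managed synchronization like in DX11.
    ++fenceValue;
    queue->Signal(fence, fenceValue);
    if (fence->GetCompletedValue() < fenceValue)
    {
        fence->SetEventOnCompletion(fenceValue, fenceEvent);
        WaitForSingleObject(fenceEvent, INFINITE);
    }
}
```

Memory management is the same story: heaps, placed resources, and residency are the app's problem in DX12, where DX11 hid them entirely.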
I don't know, man, I pretty much completely ignore 'game requirements' cuz I don't think a lot of thought or testing really gets put into them in the first place. And it's extremely rare that a game releases with a minimum requirement for an extremely modern GPU. Which also says a lot about how slowly most of these technologies tend to get adopted.
Isn't this kind of a moot discussion in the consumer context at this point? Almost every consumer game requirements list I see - and hell, even the UE5 feature requirements for developers - is written in terms of "this generation or newer of AMD, this generation or newer of NVIDIA, etc.", not in terms of DirectX feature level. When there are (significantly) fewer IHVs than relevant features, as long as IHVs continue to make future generations and driver updates support a superset of features, it's just far easier and clearer to specify IHV hardware generations and driver versions anyways. This of course also relies on IHVs not to fuck up their marketing names to the point where they rename older stuff to newer stuff with fewer features; this has been slightly better in recent years though.
Frankly I don't think consumers need to know or care about things like "DirectX 12 Ultimate" or "Feature Level 12_2" and what it means. They just look at the requirements and see if their GPU is older or newer than the given generation listed.
Sure, but that's kind of a tangential discussion to the whole issue of DirectX versions. DirectX versions are at best going to give you some indication of whether a game will run *at all*; that's the best and most they have ever done. Minimum game requirements should do that and potentially give you some idea of whether the experience will be reasonable at all. Certainly I would recommend most folks go further and look into how *well* a game runs in terms of performance and visuals on their hardware if they are concerned that their hardware is on the line, but that's really a separate question from the DirectX version question of "will it run at all".
You only need to worry about anything if you've got an older GPU, and even then, you should be looking at actual user impressions to judge whether you'll be good or not.
Right, and like most games, it stated its system requirements in terms of GPU generations, which is far more useful than if they had tried to describe DirectX versions and feature levels to people, which aren't even advertised on GPUs these days anyways.
Alan Wake 2 is probably one of the only examples I can think of where any of this would have caught a significant number of people out.
But if the other guys have blast processing and you don't, you're screwed!
Yeah, and I think IHVs and Microsoft with Xbox have bungled it up even worse in recent years. By all means, market your hardware to consumers with supported graphical features (RT, DLSS, FSR, FG, etc.), but cut it out with hyping up API tech and features (mesh shaders, SFS, VRS, DStorage, etc.) which, quite frankly, just sets up those who are less informed (like myself in some of those cases) to have unrealistic expectations of what will be possible, or even when/if it will ever be utilized. We've learned by now that just because something is supported doesn't mean it will actually be utilized in a meaningful capacity.
Consumers just need to know the GPU model/generation a game supports.
Well, that's your fault... you could have at least tuned the flux capacitor and overdrive module for maximum standard processing!
But if the other guys have blast processing and you don't, you're screwed!
I'm not usually one of those 'this looks last gen' type of hyperbolic cynics, but... this looks last gen.
FSR3 on consoles.
It looks like UE4, runs like UE5.
It's an easy cherry-pick/snipe, but worth mentioning that if you watched the whole DF video, the takeaway is pretty clearly that many of those settings combinations - particularly that one - should not exist in the game, especially on console. I'd argue the frame gen modes should not exist at all given the results from the DF video. The way these tweets are worded is pretty misleading, intentionally or otherwise.
PAL was always higher resolution, with better colour resolution to boot.
Part of the problem for consoles is that once upon a time everyone's TV had the same resolution, NTSC or PAL; now that's no longer the case.
20% higher framerate is more important than 20% higher resolution. Plus, 60 is divisible by 1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, and 60; 50 is only divisible by 1, 2, 5, 10, 25, and 50. This gives a bit more leverage when updating below the refresh rate, especially when you miss an update.
But NTSC was higher frame rate, and succeeded despite being shit.
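Just to spell out the divisibility point: a frame rate paces evenly only when it divides the refresh rate, and 60 has twice as many divisors as 50 to fall back on. A throwaway check (nothing but the arithmetic above):

```cpp
// Throwaway check: which whole-number frame rates pace evenly on a given refresh rate?
#include <cstdio>
#include <initializer_list>

int main()
{
    for (int refresh : {60, 50})
    {
        std::printf("%d Hz:", refresh);
        for (int fps = 1; fps <= refresh; ++fps)
            if (refresh % fps == 0)  // even pacing only when fps divides the refresh rate
                std::printf(" %d", fps);
        std::printf("\n");
    }
    return 0;
}
```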