So in reality, it is nVidia's requirements for implementing DLSS that limit its adoption.
Exactly. Who would sponsor a game and then allow the competitor to market with that same game, when the competitor contributed essentially nothing to its development? It simply doesn't work in practice.
So one of two things is going to happen with Starfield: either nVidia dropped the two requirements to allow DLSS, or AMD is going to allow nVidia to use Starfield for promotions.
It doesn't require any of this; there are dozens of DLSS-enabled games that don't even mention NVIDIA's name, the latest of which are Immortals of Aveum and The Last of Us Part 1. Jedi Survivor implemented DLSS 3, no less, without even mentioning NVIDIA.
I mentioned earlier in here that AMD likely words its sponsorship agreement in a plausibly deniable way that works against the implementation of competitor technologies without explicitly doing so, or even explicitly mentioning them. For instance, I suspect there are no specific lines in there with respect to DLSS.
Nvidia's standard license for its RTX SDK (including DLSS) requires two marketing/branding-related things:
1) The game must include attribution to Nvidia and its logo in its credits, splash screens/title cards, and on the box (if applicable). Keep in mind this doesn't mean a dedicated splash screen just for Nvidia.
2) Grant Nvidia the right to use the game as promotional material for RTX, subject to what is "commercially reasonable."
AMD's agreement (at least now) likely prohibits the above, which they can plausibly argue fits an exclusive marketing partnership.
Yes, it's based on lawyering and technicalities, but it is what it is. As with these things, one's personal viewpoint is likely going to be swayed by their feelings toward the parties involved.
I'm sure the requirements can be negotiated, for both Nvidia and AMD, as there are certainly titles that contain these technologies without any splash screens or any mention whatsoever of them. A recent example is Diablo 4, which has DLSS but makes no mention of Nvidia anywhere other than the "NVIDIA DLSS" option in the settings.
As if this couldn't get any funnier, Assassin's Creed Mirage will only support XeSS at launch, as the game is partnered with Intel.
Assassin’s Creed Mirage PC Requirements Revealed - NVIDIA DLSS, AMD FSR, Intel XeSS All Supported
[UPDATED] The official blog post has been changed to say that all main upscalers (NVIDIA DLSS, AMD FSR, and Intel XeSS) are actually supported in Assassin's Creed Mirage's PC version. [ORIGINAL STORY] Assassin's Creed Mirage arrives in just a few weeks, and the PC features and requirements for... (wccftech.com)
Are you genuinely judging AMD versus Nvidia performance based on a requirements sheet? Come on now. Surely we don't need to reiterate for the millionth time that these aren't ever accurate?

Also, note how without the AMD marketing deal the engine now seems to perform better on Nvidia GPUs vs Valhalla, according to those specs.
It's not about a mistake. These are never thoroughly tested specs. They are almost always ballpark guesses. And there are often even blatant "that doesn't make sense" sorts of things you can spot in them, especially when it comes to CPUs and whatnot.

Hence the "according to those specs" line at the end. If it bears out in testing then it's definitely noteworthy, whereas if it performs like Valhalla, the specs aren't just off, they're totally backwards (suggesting an out-of-the-ordinary Nvidia advantage when in fact the game would sport an out-of-the-ordinary AMD advantage). A very silly mistake to make if so.
Either way, it's not Valhalla. Even if it did perform better on Nvidia relatively this time around, that's no reason to go jumping to conspiracies. It could simply be the way it happens to run. All games are gonna be different.
That's not what the Ubisoft article says though.

Assassin's Creed Mirage will only support XeSS at launch, as the game is partnered with Intel.
You'll also be able to leverage features like Intel's AI-assisted XeSS Super Sampling, which upscales resolution while enabling hardware to pump out more frames per second; optimization for Intel Arc GPUs and 13th-generation CPUs; synchronize ambient lighting with the game's action using MSI's Mystic Light products; and even experience upper-body haptic feedback with the OWO Haptic Gaming System vest (which is also supported on consoles).
Maybe you don't grasp the meaning of "ballpark"? I mean they aren't really putting any huge effort or thought into this. They just throw out some roughly similar sort of specs. Reading into it like you're doing is utterly insane given how the historical accuracy of such spec requirements is basically 0%.

A ballpark guess would match a 3080 against a 6800XT, not a 6900XT.
I think you don't understand that game developers aren't the type to obsess over GPU reviews and benchmarks like many of us hardware enthusiasts do. They will usually have a looser understanding of where things stand relative to each other than we do.
Stop putting so much stock in such things. It's crazy how PC gamers never seem to have learned this.
You didn't actually read my post then.

Let's revisit this when the benchmarks are out. If the game still favours AMD like Valhalla, in contradiction to these requirements, then I'll happily concede your point.