Starfield to use FSR2 only, exclude DLSS2/3 and XeSS: concerns and implications *spawn*

Nvidia's standard license for its RTX SDK (including DLSS) requires two marketing/branding-related things -
So in reality, it is Nvidia's requirements for implementing DLSS that limit its implementation.

It doesn't require any of this; there are dozens of DLSS-enabled games that don't even mention NVIDIA's name, the latest of which are Immortals of Aveum and The Last of Us Part 1. Jedi Survivor implemented DLSS 3, no less, without even mentioning NVIDIA.
 
Exactly. Who would sponsor a game and then, at the same time, allow a competitor that basically contributed nothing to development to market with that same game? It simply does not work in practice.

I guess Cyberpunk 2077 is being used in AMD's marketing for FSR3 while also being used in Nvidia's marketing for existing RTX features and the upcoming DLSS 3.5, and it presumably has some sort of sponsorship agreement with Nvidia. That's just one example; these deals aren't all identical.

So one of two things is going to happen with Starfield: either Nvidia dropped the two requirements to allow DLSS, or AMD is going to allow Nvidia to use Starfield for promotions.

It might not be either. These deals are likely not in perpetuity, as I would think the cost of those rights would be rather high. It's worth noting AMD's bundle promo with Starfield runs only to the end of September. Also, Starfield is now available via GeForce Now for both Steam and Game Pass.

What would be interesting is if anyone who has Jedi Survivor could check whether Nvidia/RTX is mentioned anywhere in the game's splash screens, title screens, or credits now.
 
It doesn't require any of this; there are dozens of DLSS-enabled games that don't even mention NVIDIA's name, the latest of which are Immortals of Aveum and The Last of Us Part 1. Jedi Survivor implemented DLSS 3, no less, without even mentioning NVIDIA.

As I said, it's in the standard licensing terms. However, it's also stated that prospective parties can contact Nvidia if these terms do not work for them, and I would suspect they're negotiable.
 
I mentioned earlier in here that AMD likely words its sponsorship agreement in a plausibly deniable way that works against the implementation of competitor technologies without explicitly doing so, or even explicitly mentioning them. For instance, I suspect there are no specific lines in there with respect to DLSS.

Nvidia's standard license for its RTX SDK (including DLSS) requires two marketing/branding-related things -

1) The game must include attribution to Nvidia and its logo in its credits, splash screens/title cards, and on the box (if applicable). Keep in mind this doesn't specify that you need a dedicated splash screen for Nvidia.

2) Grant Nvidia the right to use the game as promotional material for RTX, subject to what is "commercially reasonable."

AMD's agreement (at least now) likely prohibits the above, which they can plausibly argue fits an exclusive marketing partnership.

Yes, it's based on lawyering and technicalities, but it is what it is. As with these things, one's personal viewpoint is likely going to be swayed by their feelings toward the parties involved.


Yeah, looks like it's in the marketing section.

That's the link to the end-user license agreement.

 
I'm sure the requirements can be negotiated, for both Nvidia and AMD, as there are certainly titles that contain these technologies but don't have any splash screens or any mention of them whatsoever. A recent example is Diablo 4, which has DLSS but makes no mention of Nvidia anywhere, other than the "NVIDIA DLSS" option in the settings.
 
I'm sure the requirements can be negotiated, for both Nvidia and AMD, as there are certainly titles that contain these technologies but don't have any splash screens or any mention of them whatsoever. A recent example is Diablo 4, which has DLSS but makes no mention of Nvidia anywhere, other than the "NVIDIA DLSS" option in the settings.

Honestly there's probably no value in enforcing the splash screen. If DLSS is an option in the game, and it's labelled as DLSS in the options, then Nvidia is probably happy.
 
As if this couldn't get any funnier, Assassin's Creed Mirage will only support XeSS at launch, as the game is partnered with Intel.

 
As if this couldn't get any funnier, Assassin's Creed Mirage will only support XeSS at launch, as the game is partnered with Intel.


But wait... this has absolutely nothing to do with a marketing deal; they probably just wanted to focus on one upscaling technique and thought supporting all three would be too much work... That's how this works, isn't it? /s

Also, note how, without the AMD marketing deal, the engine now seems to perform better on Nvidia GPUs vs. Valhalla, according to those specs.
 
As if this couldn't get any funnier, Assassin's Creed Mirage will only support XeSS at launch, as the game is partnered with Intel.

 
Also, note how, without the AMD marketing deal, the engine now seems to perform better on Nvidia GPUs vs. Valhalla, according to those specs.
Are you genuinely judging AMD versus Nvidia performance based on a requirements sheet? Come on now. Surely we don't need to reiterate for the millionth time that these aren't ever accurate?
 
Are you genuinely judging AMD versus Nvidia performance based on a requirements sheet? Come on now. Surely we don't need to reiterate for the millionth time that these aren't ever accurate?

Hence the "according to those specs" line at the end. If it bares out in testing then it's definitely noteworthy, whereas if it performs like Valhalla, the specs aren't just off, they're totally backwards (suggesting an out of ordinary Nvidia advantage when in fact the game would sport an out of ordinary AMD advantage). A very silly mistake to make if so.
 
Hence the "according to those specs" line at the end. If it bares out in testing then it's definitely noteworthy, whereas if it performs like Valhalla, the specs aren't just off, they're totally backwards (suggesting an out of ordinary Nvidia advantage when in fact the game would sport an out of ordinary AMD advantage). A very silly mistake to make if so.
It's not about a mistake. These are never thoroughly tested specs; they are almost always ballpark guesses. And there are often even blatant 'that doesn't make sense' sorts of things you can spot in them, especially when it comes to CPUs and whatnot.

Either way, it's not Valhalla. Even if it did perform relatively better on Nvidia this time around, that's no reason to go jumping to conspiracies. It could simply be the way it happens to run. All games are gonna be different.
 
It's not about a mistake. These are never thoroughly tested specs; they are almost always ballpark guesses. And there are often even blatant 'that doesn't make sense' sorts of things you can spot in them, especially when it comes to CPUs and whatnot.

Either way, it's not Valhalla. Even if it did perform relatively better on Nvidia this time around, that's no reason to go jumping to conspiracies. It could simply be the way it happens to run. All games are gonna be different.

A ballpark guess would match a 3080 against a 6800XT, not a 6900XT. And in a game that performs the same as the previous game, why would they need to randomly guess?

Valhalla spec'd the 5700XT against a 2080 Super. Here they are putting it against a 2070.
 
Assassin's Creed Mirage will only support XeSS at launch, as the game is partnered with Intel.
That's not what the Ubisoft article says though.

From the article:

You'll also be able to leverage features like Intel's AI-assisted XeSS Super Sampling, which upscales resolution while enabling hardware to pump out more frames per second; optimization for Intel Arc GPUs and 13th-generation CPUs; synchronize ambient lighting with the game's action using MSI's Mystic Light products; and even experience upper-body haptic feedback with the OWO Haptic Gaming System vest (which is also supported on consoles).

The wording "features like..." leaves room for the possibility of features other than those listed in the sentence, no?

Maybe they are just mentioning features from sponsors but are actually implementing other features too.
 
A ballpark guess would match a 3080 against a 6800XT, not a 6900XT.
Maybe you don't grasp the meaning of 'ballpark'? I mean, they aren't really putting any huge effort or thought into this. They just throw out some roughly similar sort of specs. Reading into it like you're doing is utterly insane given how the historical accuracy of such spec requirements is basically 0%.
I think you don't understand that game developers aren't the type to obsess over GPU reviews and benchmarks like many of us hardware enthusiasts do. They will usually have a looser understanding of where things stand relative to each other than we do.

Stop putting so much stock into such things. It's crazy how PC gamers never seem to have learned this.
 
Maybe you don't grasp the meaning of 'ballpark'? I mean, they aren't really putting any huge effort or thought into this. They just throw out some roughly similar sort of specs. Reading into it like you're doing is utterly insane given how the historical accuracy of such spec requirements is basically 0%.
I think you don't understand that game developers aren't the type to obsess over GPU reviews and benchmarks like many of us hardware enthusiasts do. They will usually have a looser understanding of where things stand relative to each other than we do.

Stop putting so much stock into such things. It's crazy how PC gamers never seem to have learned this.

Let's revisit this when the benchmarks are out. If the game still favours AMD like Valhalla, in contradiction to these requirements, then I'll happily concede your point.
 
You didn't actually read my post then.

Valhalla's specs indicated AMD would have a disproportionate performance advantage over Nvidia... and it did.

Mirage's specs indicate that Nvidia will have a disproportionate performance advantage over AMD, which I argue would be interesting given that the AMD sponsorship of Valhalla doesn't carry over to Mirage.

You have said we should ignore the specs because they don't mean anything. That wasn't the case with Valhalla, but perhaps it will be with Mirage, in which case my point is moot. So, as I suggested, let's wait for the benchmarks to see which is the case.
 