Starfield to use FSR2 only, exclude DLSS2/3 and XeSS: concerns and implications *spawn*

Well, this is what AMD had in mind all along. Nvidia invents and brings to market some proprietary innovation and builds up interest and adoption, and then AMD comes in with an open source implementation which works on everything, looking like the good guys while simultaneously blocking people from implementing the superior product, expecting consumers to be OK with it because it works on everything.

Yes, it's good that FSR2 works on everything, but most people by far have Nvidia GPUs and want the best.

If AMD wants people to not feel that way, they need to bring a superior product to market.
 
Well, this is what AMD had in mind all along. Nvidia invents and brings to market some proprietary innovation and builds up interest and adoption, and then AMD comes in with an open source implementation which works on everything, looking like the good guys while simultaneously blocking people from implementing the superior product, expecting consumers to be OK with it because it works on everything.
I rather think what AMD has in mind is to show that upscaling can be done without AI.
And if it works well enough, that proves we do not need AI acceleration yet, which also disproves the competitor's claim that AI acceleration is already worth it now.
Because GPUs are expensive, and AI acceleration takes costly die area that so far is used exclusively for upscaling, gamers might decide their money is better spent on AMD to get more for the buck.
Making FSR open source helps with adoption. And blocking DLSS obviously also helps, because if FSR adoption is higher, it proves AI isn't needed.

That's at least how I perceive it. It's basically a war of traditional programming vs. machine learning. And if upscaling is the only application, the war makes sense.
However, some time has passed, and it has shown:
* AMD has failed, or is no longer willing, to offer HW at lower cost, even going without ML acceleration and chiplets. People buy neither AMD nor NV GPUs, because both are too expensive. (They don't buy Intel either, although it's cheap and has ML acceleration.)
* The capability of ChatGPT & co. has proven that a fight against AI is completely pointless. I'll be happy enough if humanity as a whole does not become redundant within the next few decades.

So personally I think it's time to give up the war. They had better just stop blocking DLSS and focus on adding ML acceleration to their GPUs as well, besides growing their software division.
I guess it's safe to assume ML will be used in games, and not just for upscaling, whether I like it or not.

What I don't like about it is that 'open source' for ML (as discussed further above) makes little sense. It would not help game devs develop their own ML applications; the obstacles remain training data and processing infrastructure.
It feels like traditional software development will just vanish, and there is no way around the AI development meant to replace it being centralized, until the development itself moves to AI.
 
Integrating an open source upscaler that supports all platforms is a priority for developers, as consoles make up the majority of the game market. Then maybe some developers decide that it's good enough and don't want to support a second upscaler for PC only?

I imagine that if DLSS offered a fallback solution for consoles we would see an uptick in its support, and NVIDIA would still retain an edge here due to their tensor cores.
 
I imagine that if DLSS offered a fallback solution for consoles we would see an uptick in its support, and NVIDIA would still retain an edge here due to their tensor cores.
Well, even modders can replace FSR with DLSS, so we can assume the effort to add DLSS (+ XeSS) beside FSR is negligible for developers, who can do so without needing hacks.
The only excuse for not adding DLSS support, then, is if your game runs so well that it just does not need any upscaling.
If you need upscaling, you'll be happy that you do not need to work on it yourself, but can just integrate those three IHV libraries in a fraction of the time. They all take the same inputs, afaict.

I'm usually the first to complain about proprietary black boxes, but not here. Upscaling is a process with clearly defined inputs, and we get higher-resolution output.
No need to know what's inside the black box, no need to tweak it if it works, no need to optimize it further. Ideally there is not even a need to compile or look at complex source code.
So I'm totally fine with an upscaling black box.
A standardized API in the gfx APIs, plus dlss.dll moving into the gfx drivers, would be nice to have, so ideally we would not even need to get in touch with IHVs to make it work. Nice, but not required.
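To make that concrete, here is a minimal sketch of what such a vendor-agnostic integration point could look like, assuming all three libraries consume roughly the same inputs (low-res colour, depth, motion vectors, jitter). Every type and function name below is hypothetical and not taken from any real SDK:

```cpp
// Hypothetical vendor-agnostic upscaler wrapper. None of these names come
// from the actual FSR2/DLSS/XeSS SDKs; each Dispatch* stub is where the
// corresponding SDK's dispatch/evaluate call would be wrapped.
#include <cstdint>

struct UpscalerInputs {
    void*    color;            // low-resolution colour target
    void*    depth;            // depth buffer
    void*    motionVectors;    // per-pixel motion vectors
    float    jitterX, jitterY; // sub-pixel camera jitter for this frame
    uint32_t renderWidth,  renderHeight; // internal render resolution
    uint32_t outputWidth,  outputHeight; // display resolution
};

enum class UpscalerBackend { FSR2, DLSS, XeSS };

void DispatchFSR2(const UpscalerInputs&, void* /*output*/) { /* wrap the FSR2 dispatch call here */ }
void DispatchDLSS(const UpscalerInputs&, void* /*output*/) { /* wrap the DLSS evaluate call here */ }
void DispatchXeSS(const UpscalerInputs&, void* /*output*/) { /* wrap the XeSS execute call here */ }

// One call site in the renderer; which backend runs is just a user setting.
void Upscale(UpscalerBackend backend, const UpscalerInputs& in, void* output)
{
    switch (backend) {
        case UpscalerBackend::FSR2: DispatchFSR2(in, output); break;
        case UpscalerBackend::DLSS: DispatchDLSS(in, output); break;
        case UpscalerBackend::XeSS: DispatchXeSS(in, output); break;
    }
}
```

Once the engine gathers those inputs for one backend, adding the other two is mostly plumbing.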

Then maybe some developers decide that it's good enough and don't want to support a second upscaler for PC only?
I really don't think so. NV has >80% market share, right? No developer would ignore that just out of laziness.
It's rather the contrary. We want to utilize the HW people have bought. It has RT cores? We try to use them. It has Tensor cores? Same thing.

I'm just an indie dev and have never integrated any upscaler, so maybe I'm wrong. But if so, AMD would have clearly denied blocking DLSS/XeSS after the backlash from the Starfield announcement.
 
I'm usually the first to complain about proprietary black boxes, but not here. Upscaling is a process with clearly defined inputs, and we get higher-resolution output.
It does change how stochastic effects are accumulated, like whether your noise patterns in SSAO or stochastic transparency will converge after a few frames into a nice smooth image or will leave artifacts. Not a huge difference, but it does require extra work and testing, which usually comes at the expense of some other task.
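To illustrate the coupling: a stochastic effect typically varies its noise per frame and then depends on the upscaler's history accumulation to smooth it out. A minimal sketch, where the helper name, constants and cycle length are illustrative assumptions rather than anything from a real engine:

```cpp
// Illustrative only: a typical way a stochastic effect (SSAO, stochastic
// transparency) varies its noise per frame and then relies on the temporal
// upscaler's history accumulation to converge.
#include <cmath>
#include <cstdint>

// Golden-ratio sequence gives well-distributed per-frame rotations of the
// noise pattern.
float FrameNoiseRotation(uint32_t frameIndex, uint32_t accumulationFrames)
{
    const float golden = 0.61803398875f;
    // Cycle the sequence over roughly the number of frames the upscaler
    // blends together. If the upscaler keeps less history or rejects it
    // more aggressively, the same sequence may never converge and shows
    // up as shimmer instead of a smooth result.
    const uint32_t i = frameIndex % accumulationFrames;
    return std::fmod(i * golden, 1.0f) * 6.28318530718f; // angle in radians
}
```

How long that cycle should be depends on how much history the specific upscaler keeps and how aggressively it rejects it, which is exactly the kind of per-vendor tuning and testing meant here.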

But if so, AMD would have clearly denied blocking DLSS/XeSS after the backlash from the Starfield announcement.
My issue with this theory is: what could AMD offer in exchange for blocking DLSS? With 20% market share they can't really offer a good marketing campaign. And with a 129M loss in 2023 Q1 they can't really afford to pay developers for it, at least not for titles as huge as Starfield.
 
Integrating an open source upscaler that supports all platforms is a priority for developers, as consoles make up the majority of the game market. Then maybe some developers decide that it's good enough and don't want to support a second upscaler for PC only?

I imagine that if DLSS offered a fallback solution for consoles we would see an uptick in its support, and NVIDIA would still retain an edge here due to their tensor cores.
I don't think it has anything to do with developers not wanting to do it... it's simply that they are told they can't.
 
Are there any reviews / investigations into XeSS on Intel vs AMD/Nvidia?
I know the underlying code path is different, but I always thought it should produce the same result?

I guess since it's open source, anyone could in theory make a CPU-based "reference" implementation; that would at least provide a set of reference images after scaling?
If you mean comparisons, there are, like this one from yesterday (couldn't watch the entire video 'cos I gotta go). XeSS has some advantage on native Intel hardware: some extra speed and, depending on the game, certain details might be better.


He also touches on the AMD exclusivity drama towards the end of the video.
 
It does change how stochastic effects are accumulated, like whether your noise patterns in SSAO or stochastic transparency will converge after a few frames into a nice smooth image or will leave artifacts. Not a huge difference, but it does require extra work and testing, which usually comes at the expense of some other task.
Yeah, but isn't that just the devil in the details? That's expected, and some problems may eventually force you to make adjustments or changes just because of one GPU vendor.
But then, if I lack the resources (or am too lazy) to sink trial-and-error time into this, I can still release the game with cross-vendor support plus the occasional artifact.
Some users could then switch from DLSS to FSR if they feel the latter does better for them.
They will complain that DLSS support in game X is bad and the devs did not give it enough love, but that's still much better than releasing the game without DLSS support.
From the technical perspective I may sound naive, but I care much more about the increasing disappointment with gaming in general, and how to minimize that problem.

(It's a pretty minor problem imo, but if I were an NV user with Starfield on my wishlist, I surely would think this totally sucks.
At least NV users now know how it feels to look at greyed-out PhysX checkboxes in your options menu. :D )
My issue with this theory is: what could AMD offer in exchange for blocking DLSS? With 20% market share they can't really offer a good marketing campaign. And with a 129M loss in 2023 Q1 they can't really afford to pay developers for it, at least not for titles as huge as Starfield.
That's a good point, and I share this question. What do devs get from partnering with NV, AMD or Intel? Assistance with optimization? Marketing? Money? Or nothing, and instead the IHVs pay them to put their logo on the splash screen?
Idk. Maybe somebody here can speak to this.

But no matter what, the problem is that AMD has dodged the question of blocking DLSS. And now everybody and their mom concludes they do block it, even if they actually do not.
Even AMD's marketing will be aware of this issue as a potential source of distrust, and they would have refined their statement if they could, I'm afraid.
 
AMD is trying to make features that only run on their competitor's hardware appear redundant by making all hardware able to run their open source solutions. It's far worse for them to compete head-on and limit the reach of their solutions to AMD hardware adoption. This way it's easier to establish wider adoption without having to convince anyone in a one-to-one battle, and thus to blunt NVIDIA's selling points. This is why it is strategically a plus for them to have everything, including NVIDIA, support FSR in Starfield while blocking DLSS, rather than have only AMD GPUs run FSR vs NVIDIA GPUs with no DLSS or FSR. It's a dirty game.
 
AMD is trying to make features that only run on their competitor's hardware appear redundant by making all hardware able to run their open source solutions. It's far worse for them to compete head-on and limit the reach of their solutions to AMD hardware adoption. This way it's easier to establish wider adoption without having to convince anyone in a one-to-one battle, and thus to blunt NVIDIA's selling points. This is why it is strategically a plus for them to have everything, including NVIDIA, support FSR in Starfield while blocking DLSS, rather than have only AMD GPUs run FSR vs NVIDIA GPUs with no DLSS or FSR. It's a dirty game.

That tactic might work if there weren't lots of other data points showing head-to-head comparisons in other games. Also, FSR doesn't have the greatest reputation when compared to "native", and it's impossible to avoid that comparison. As it stands, people already think/know that FSR is a tier below the ML solutions, and I don't think these moves will change anyone's mind.

Depending on how Starfield does with CPU usage/bottlenecks it may actually backfire a bit if Ampere owners cause a stink about missing DLSS3. That will be a situation where AMD blocked a technology for which they’re not offering an alternative.
 
Depending on how Starfield does with CPU usage/bottlenecks it may actually backfire a bit if Ampere owners cause a stink about missing DLSS3. That will be a situation where AMD blocked a technology for which they’re not offering an alternative.
Additionally, NV users are the only pool from which they could grow their GPU market share, but making them angry will not help break their brand loyalty.
Nor will it help AMD users stay loyal if AMD so obviously plays dirty, which is something AMD users would rather expect from the 'evil' competitor.

It's a bad marketing strategy imo. I would just say: 'Tensor cores are not worth it yet. Our FSR has slightly worse quality, but it's good enough, and you don't waste money on silicon that is active for only one millisecond per frame.'
That's an argument I would buy. It's honest, makes sense, applies to RT as well to some degree, and it's the reason I personally still prefer AMD over NV.

Their marketing is just terrible. (At least I can be sure the money I give them is not spent on marketing, which isn't too bad : )
 
I also agree with this comment.

There is significant value in the result that DLSS gives, and I can totally understand NV not wanting to share the internals.
However, in an ideal world (yes, I know I'm in dreamland now) we would have a version of DLSS that implements a CPU-based "software" version of the algorithm,
and then GPU vendors would be able to (perhaps for a price) implement support for the algorithm in a hardware-accelerated manner on their GPUs.
Or, perhaps even better, engine developers could implement support within the engines themselves.

But if DLSS takes 2 ms on a 3080 and 20 ms on a 6800 XT, due to the more performant and appropriate HW in the 3080, then it's all a bit of a moot point.
The thing is, right now we just don't know.
Based on very rough calculations, it looks like DLSS is doing somewhere between 3-10x as much math as FSR2.

If that's the case, it might even be in NV's favor to release a CPU version for analysis; it would only make them look good, and their hardware look even better.
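For what it's worth, an estimate like that typically comes from back-of-envelope arithmetic along these lines; every throughput and timing figure below is a ballpark assumption for illustration, not a measurement of either upscaler:

```cpp
// Back-of-envelope only: every number here is an assumed ballpark figure,
// not a measurement. It just illustrates how one might arrive at a
// "DLSS does several times more math than FSR2" style estimate.
#include <cstdio>

int main()
{
    // Assumed tensor-core FP16 throughput of a 3080-class GPU (TFLOPS)
    // and an assumed ~1.5 ms DLSS dispatch at 4K output.
    const double tensorTflops = 100.0;
    const double dlssMs       = 1.5;

    // Assumed shader FP32 throughput and an assumed ~1 ms FSR2 pass.
    const double shaderTflops = 30.0;
    const double fsr2Ms       = 1.0;

    // TFLOPS * 1e3 = GFLOPS; multiplied by seconds gives GFLOP per frame.
    const double dlssGflop = tensorTflops * 1e3 * (dlssMs / 1000.0);
    const double fsr2Gflop = shaderTflops * 1e3 * (fsr2Ms / 1000.0);

    std::printf("DLSS ~%.0f GFLOP, FSR2 ~%.0f GFLOP, ratio ~%.1fx\n",
                dlssGflop, fsr2Gflop, dlssGflop / fsr2Gflop);
    return 0;
}
```

Neither pass actually runs at peak throughput, which is why such an estimate ends up as a wide range rather than a single number.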

I'm guessing in all likelihood (since DLSS isn't open source, not that I feel I'd have the expertise to analyze it at that level regardless) the "secret" to DLSS is part software (both the implementation and the training) and part hardware (not necessarily just the tensor cores). But it's not just a question of whether or not the hardware is leveraged, but of how important it is to the end results. In general there tends to be a point past which it takes disproportionately more complexity to achieve better results.

A not insignificant number of people already feel that FSR2 (some even felt FSR1) is close enough to DLSS. You'd wonder, then, how much closer a "DLSS lite" with fewer calculations could come, as ultimately we're talking about a subjective determination, not really quantifiable (especially in a marketing sense), of how much better the end result is.

Nvidia's hardware also isn't really magic. Keep in mind DLSS runs on the RTX 2060, so hypothetically you'd only need enough ML performance to match that. Even if competitors are behind, they'd still only be targeting what will be six-year-old mid-stack hardware by the time the next generation rolls around.

All in all, the hardware alone might not be enough of a differentiator.

Are there any reviews / investigations into XeSS on Intel vs AMD/Nvidia?
I know the underlying code path is different, but I always thought it should produce the same result?

I guess since it's open source, anyone could in theory make a CPU-based "reference" implementation; that would at least provide a set of reference images after scaling?

It's been compared, but offhand I can't remember who did it. I believe it's been discussed on here in one of the threads. The DP4A path doesn't look the same as the XMX path.

I believe XeSS is not completely open source. At least I don't believe they've yet published the actual source code for the upscaling portion.
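On the earlier point about reference images: if someone did produce a CPU reference output, comparing it against the DP4A and XMX results mostly comes down to a per-pixel error metric. A minimal PSNR sketch, assuming the images have already been decoded into width*height*3 float arrays in [0, 1]:

```cpp
// Minimal PSNR between two already-decoded RGB float images, as one might
// use to compare a DP4A-path output against an XMX-path (or hypothetical
// CPU reference) output. Image loading/decoding is assumed to happen
// elsewhere; both buffers hold width*height*3 floats in [0, 1].
#include <cmath>
#include <cstddef>
#include <vector>

double Psnr(const std::vector<float>& a, const std::vector<float>& b)
{
    double mse = 0.0;
    const std::size_t n = a.size();       // assumes a.size() == b.size()
    for (std::size_t i = 0; i < n; ++i) {
        const double d = a[i] - b[i];
        mse += d * d;
    }
    mse /= static_cast<double>(n);
    if (mse == 0.0) return INFINITY;      // identical images
    return 10.0 * std::log10(1.0 / mse);  // peak value is 1.0 for [0,1] data
}
```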
 
Additionally, NV users are the only pool from which they could grow their GPU market share, but making them angry will not help break their brand loyalty.
Nor will it help AMD users stay loyal if AMD so obviously plays dirty, which is something AMD users would rather expect from the 'evil' competitor.

It's a bad marketing strategy imo. I would just say: 'Tensor cores are not worth it yet. Our FSR has slightly worse quality, but it's good enough, and you don't waste money on silicon that is active for only one millisecond per frame.'
That's an argument I would buy. It's honest, makes sense, applies to RT as well to some degree, and it's the reason I personally still prefer AMD over NV.

Their marketing is just terrible. (At least I can be sure the money I give them is not spent on marketing, which isn't too bad : )

Sometimes it's not about marketing to new customers but about marketing to existing customers. Sometimes it's not about trying to gain market share but about maintaining it. Sometimes it's not about marketing to everyone but about targeting specific groups.

One thing I'll say to give some perspective: maybe don't just look at one group to see what the reaction is. There are plenty of opinions being posted, for example along the lines that developers prefer AMD because AMD is in consoles, or that FSR2 is open source, etc., and if you look outside this community the ratio of responses along those lines might surprise you.

I have some feelings about AMD's marketing that I won't go into too much, but I disagree with the sentiment that it's terrible. An aspect of successful marketing is that the person being marketed to doesn't think they are being marketed to, and believes their positive opinion simply formed naturally for good reason.
 
That's a good point, and I share this question. What do devs get from partnering with NV, AMD or Intel? Assistance with optimization? Marketing? Money? Or nothing, and instead the IHVs pay them to put their logo on the splash screen?
Idk. Maybe somebody here can speak to this.

Assistance with optimization, for one, but yes - marketing.

[attached image]

AMD makes more than just Radeon. Ryzen CPUs are incredibly popular with gamers. Just because Radeon is a minority player in the GPU space doesn't mean publishers don't see the worth in getting a marketing deal with AMD as a whole.

But no matter what, the problem is that AMD has dodged the question of blocking DLSS. And now everybody and their mom concludes they do block it, even if they actually do not.
Even AMD's marketing will be aware of this issue as a potential source of distrust, and they would have refined their statement if they could, I'm afraid.

Yeah, I really don't see this as needing any further contemplation at this point. It's taken on a life of its own; there is no reason for AMD not to have nipped this bad PR cycle in the bud a week ago, let alone now. They offer 'no comment' because the alternatives are to say "yes", which is bad, or to say "no" and lie, which would be an even bigger firestorm down the pike.
 
Additionally, NV users are the only pool from which they could grow their GPU market share, but making them angry will not help break their brand loyalty.
Nor will it help AMD users stay loyal if AMD so obviously plays dirty, which is something AMD users would rather expect from the 'evil' competitor.

It's a bad marketing strategy imo. I would just say: 'Tensor cores are not worth it yet. Our FSR has slightly worse quality, but it's good enough, and you don't waste money on silicon that is active for only one millisecond per frame.'
That's an argument I would buy. It's honest, makes sense, applies to RT as well to some degree, and it's the reason I personally still prefer AMD over NV.

Their marketing is just terrible. (At least I can be sure the money I give them is not spent on marketing, which isn't too bad : )
Talking of money, they won't be getting any money from me anymore; I've blacklisted AMD until the end of times. nVidia has some sponsored games, like Cyberpunk or Shadow of the Tomb Raider among others, that use other upscaling technologies.

Now in RE4, an AMD-sponsored game, I had to install a mod to use XeSS, but as you can see from this post by @trinibwoy, both DLSS and XeSS look a lot better in that game; they are a very noticeable upgrade.

https://forum.beyond3d.com/threads/resident-evil-4-remake-xbsx-s-ps4-ps5-pc.62873/post-2295754

 
Talking of money, they won't be getting any money from me anymore; I've blacklisted AMD until the end of times.

Really? You're going to blacklist AMD until the end of times? AMD, but not Bethesda or Capcom or Sony who were every single bit as involved in signing these deals?

And you've got to this point in life and somehow not eternally blacklisted Intel and Nvidia?

AMD have really messed this situation up, and blocking DLSS and XeSS will gain them less in sales than it will cost them in goodwill. However, talking about eternal blacklists for AMD while continuing to support and fund publishers who actively look to get exactly these kinds of deals, and while also supporting other chip makers that have done far more egregious things in the past (and present) is baffling.

Sony, MS and Nintendo pay to stop entire games from appearing on other consoles. Not supporting a competitor's proprietary upscaling tech in a game AMD will have spent big, big bucks on sponsoring is a dick move, but it's a small one on the scale of corporate douchebaggery.
 
FSR2 at launch.
DLSS, XeSS in first patch.

Can anyone confirm that this isn't what they're doing?
No. You can't. Because AMD refused comment.

They might engineer the game so that DLSS and XeSS are permanently incapable of functioning with it.
Currently, no one knows anything.
 
Really? You're going to blacklist AMD until the end of times? AMD, but not Bethesda or Capcom or Sony who were every single bit as involved in signing these deals?

And you've got to this point in life and somehow not eternally blacklisted Intel and Nvidia?

AMD have really messed this situation up, and blocking DLSS and XeSS will gain them less in sales than it will cost them in goodwill. However, talking about eternal blacklists for AMD while continuing to support and fund publishers who actively look to get exactly these kinds of deals, and while also supporting other chip makers that have done far more egregious things in the past (and present) is baffling.

Sony, MS and Nintendo pay to stop entire games from appearing on other consoles. Not supporting a competitor's proprietary upscaling tech in a game AMD will have spent big, big bucks on sponsoring is a dick move, but it's a small one on the scale of corporate douchebaggery.
Sure, but sadly AMD doesn't have anywhere near the quality nVidia has in architecture development. AMD fanboys might be happy with the circlejerk going around this deal, and they might expect AMD to bulldoze the competition, but well, we shall see how it turns out.

nVidia is going to win one more time. History speaks for itself and this will, unfortunately, be no different.
 
GamersNexus presenter Steve Burke theorizes that AMD may now try to save face by quietly allowing Nvidia and Intel to add their input to optimize Starfield, and then reveal it later so that fans would feel bad for questioning them.

But as a general business practice, we can’t rely on PR nightmares to get companies to behave. This announcement does suggest that AMD can’t really do much optimization for Starfield at all, if they’re suppressing their competitors’ access to it. Really, the right thing to do is to just give other companies the chance to get their optimizations in for Starfield as well.
July 3, 2023
 