Starfield to use FSR2 only, exclude DLSS2/3 and XeSS: concerns and implications *spawn*

It's borderline ridiculous that people feel it's okay to bully game development companies into using arbitrary closed source tech
What's actually ridiculous is the PC version of some game releasing with the same basic console features and settings, with a complete disregard for the more powerful hardware on PC. This has been consistent with AMD titles: they either ship with very minimalistic RT implementations, or the exact same one as the consoles, with no opportunity for upward scaling.

And now we have this, AMD blocking other upscalers completely. Hell, a certain UE4 game called Boundary promised RT and DLSS in a partnership with NVIDIA; they then switched to a partnership with AMD and immediately dropped RT and DLSS for a lousy FSR implementation, and the game looks like crap now. No one even mentions its name, let alone plays it. This is what's hurting PC gaming: the concept that PC ports should be nothing more than console ports for the sake of "openness". AMD is reinforcing that concept with their recent actions.

actual bad actors

Bad actor? Don't you mean the original innovator? If not for DLSS there would be no FSR; if not for DLSS3 there would be no FSR3. Is the original innovator a bad thing now?

Epic implemented DLSS2 in Fortnite on UE4, then reimplemented it in Fortnite on UE5.2 alongside TSR. It must matter to players for Epic to do that twice.

Heck, Epic has made entire games/demos exclusive to one platform. Let's not kid ourselves: in an industry that fully embraces exclusivity, where entire games are needlessly locked to a single platform for decades, blaming IHVs for using features best suited to their platform is a double standard. Enough with it already.

Intel also made XeSS "open" and got it running on all hardware. The result? Lower performance and worse image quality on anything but Intel hardware. DLSS going the same route would yield that exact same result.

This is 100% GameWorks all over again

It's been a decade since PhysX stopped, and no game has managed to ship the same effects PhysX did. The "open" mantra became an excuse to serve half-assed implementations to PC gamers or to stop innovating entirely. We are going backwards in so many respects, and the failure of industry veterans to realize that is very perplexing, to say the least. Preaching from rooftops is not the solution; actually developing good things and delivering them to PC gamers is. Let actions do the talking.
 
Guys, yelling at Andrew isn’t going to change the outcome here. He said his piece from a developer perspective, and offered something you normally wouldn’t get. No reason to berate him over it just because it isn’t what you wanted to hear.

I think we would all love the latest tech on all games, but it is what it is.

Getting reminiscent of when Xbox had exclusivity over Tomb Raider.
 
Who is yelling? I see constructive opinions on the matter. Unless I missed something where a member is being dismissive and intolerable.
The discussion is getting really *heated*, and very clearly so by proponents of the "bad actors" that Andrew mentioned. Countering a developer's argument from an end-user perspective is hardly constructive, since it becomes both tiresome and deafening. Everyone's willing to talk about the user experience, but what about the developer experience?
 
The discussion is getting really *heated*, and very clearly so by proponents of the "bad actors" that Andrew mentioned. Countering a developer's argument from an end-user perspective is hardly constructive, since it becomes both tiresome and deafening. Everyone's willing to talk about the user experience, but what about the developer experience?

I mean, most posts at B3D are passionate (or heated). Members have strong views, some dead wrong at times. If Andrew is feeling attacked, then I apologize for adding to it.
 
Hardly heated. Since when does developer experience matter when *exclusives* are not done for the benefit of developers or the gaming industry as a whole?
 
Hardly heated. Since when does developer experience matter when *exclusives* are not done for the benefit of developers or the gaming industry as a whole?
Good to know that you're being openly callous and dismissive, so posting about this topic any further stinks...
 
Guys, yelling at Andrew isn’t going to change the outcome here. He said his piece from a developer perspective, and offered something you normally wouldn’t get. No reason to berate him over it just because it isn’t what you wanted to hear.

I think we would all love the latest tech on all games, but it is what it is.

Getting reminiscent of when Xbox had exclusivity over Tomb Raider.
Andrew offered his own perspective on the topic, which I'm sure is in lock-step with some developers but not all. Expressions of personal opinion outside the UE 5 thread are treated and responded to like those of any other poster with conflicting opinions. As such, it's unfortunate if someone maintains that their reasoning should not be questioned and is the only truth.

Hopefully we can get back on topic from @DavidGraham's Streamline post, just before the open source argument started.

Edit: Fix reported link problem.
 
What's actually ridiculous is the PC version of some game releasing with the same basic console features and settings, with a complete disregard for the more powerful hardware on PC. This has been consistent with AMD titles: they either ship with very minimalistic RT implementations, or the exact same one as the consoles, with no opportunity for upward scaling.
I wouldn't even mind this in many cases, assuming the developers were really just prioritizing making a good working port with limited resources and all. I've generally considered 'extra graphical options' on PC as a bonus rather than some intrinsic or required aspect of PC gaming. A lot of games being released on PS5/XSX already look very nice in my opinion, and extra horsepower on PC can still be put to use via higher resolutions or higher performance.

The problem here isn't just that the game lacks DLSS; it's that they are ostensibly being paid not to include it. This isn't a case of a developer hard up against the ropes, making a necessary sacrifice on features in order to make the best use of their limited time and resources. It's a large AAA developer who has even been given quite a bit of extra time by their publisher to make the game as good as possible. It's also a historically PC-centric developer, so it's not like they're just getting into doing PC ports after so many years of making console games.
 
NVidia spent time and resources developing DLSS, so it's understandable in my view why they would then want to reap a financial reward from it (in terms of competitive advantage) rather than making it open source for other IHVs to use.
So if this is the main argument - that NVIDIA is now a middleware provider - then they should be treated like any other middleware provider in this respect. If someone says they aren't using Havok because they want to develop their own physics system (for whatever reason!), no one would claim it was "ridiculous in 2023 for a game to not support Havok". Similarly, for graphics tech, does anyone claim it's ridiculous to ship a AAA game without using clearly superior technology like Nanite?

That's not even getting into the fact that NVIDIA as a middleware provider has clear conflict of interest issues.

It would not run well on incompatible hardware because shader cores are too slow at matrix multiplication in real time; just look at how XeSS runs without HW acceleration.
Indeed, and if this is the argument - that the closed source thing is just a temporary stopover while we standardize the APIs necessary to express DLSS/XeSS and so on on hardware with tensor cores - then it's very important to keep pushing the IHVs and API vendors to work on this actively. As someone already pointed out, we're half a decade in at this point; have we seen any progress indicating that NVIDIA or anyone else is putting effort into expressing DLSS through one of the ML APIs, or into improving those APIs so that it could be expressed? Why have I not seen the press and enthusiasts asking them how progress on this is going? Perhaps I have legitimately missed it, but my impression has been that NVIDIA and Intel have both effectively just been given a pass on this.

tl;dr: if NVIDIA is going to act like a closed source middleware provider here, it is completely fine for developers to choose not to use their middleware, for *any* reason. If we truly think that DLSS or similar technology is foundational and should be in every game, then it needs to be open and portable in a way that produces the same pixels on different hardware, when similar hardware support for ML operations is available.
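
For concreteness, here is a minimal sketch of what "going through a standard ML API" could look like on the application side, assuming a hypothetical ONNX upscaler model run through ONNX Runtime's DirectML execution provider. The model file, tensor names, and shapes below are illustrative placeholders, not any shipping upscaler:

```python
# Hedged sketch: run a hypothetical upscaling network through a
# vendor-neutral ML stack (ONNX Runtime + DirectML) rather than an
# IHV-specific SDK. Requires the onnxruntime-directml package.
import numpy as np
import onnxruntime as ort

# The DirectML execution provider dispatches the graph to whatever GPU is
# present, so the same model file should produce the same pixels across IHVs.
session = ort.InferenceSession(
    "upscaler.onnx",  # hypothetical model file
    providers=["DmlExecutionProvider", "CPUExecutionProvider"],
)

# Assumed inputs: a 1080p color frame plus per-pixel motion vectors,
# both NCHW float32 tensors.
color = np.zeros((1, 3, 1080, 1920), dtype=np.float32)
motion = np.zeros((1, 2, 1080, 1920), dtype=np.float32)

outputs = session.run(None, {"color": color, "motion_vectors": motion})
upscaled = outputs[0]  # e.g. (1, 3, 2160, 3840) for a 2x upscaling network
print(upscaled.shape)
```

A real engine integration would of course keep frame data on the GPU rather than round-tripping through NumPy, but the point stands: the inference itself can sit behind a standard, portable interface.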
 
But you can do it.
I mean, what is holding Microsoft back from bringing WinML to market? What is holding Epic back from using CUDA/OpenCL/whatever to make their own "DLSS"? Asking nVidia is strange; it wasn't nVidia who created Direct3D 26 years ago.
 
But you can do it.
I mean, what is holding Microsoft back from bringing WinML to market? What is holding Epic back from using CUDA/OpenCL/whatever to make their own "DLSS"?
If I could request that we keep this high-level and not specific to anything that would be considered too close to my particular work, then I'd feel more comfortable continuing to comment.

That said, Unreal does have TSR. It is currently not ML-based, but that's more of an implementation detail; maybe it will be at some point. There is still work to do on those ML APIs, of course, which I would love to see more people asking the relevant parties about. If ML is truly as important a part of the future as many believe, then it obviously needs to be standardized, and applications should go through that standard interface.

It's certainly fine to compare and contrast TSR to other solutions, but I don't think it's reasonable to say that it's not okay for a game to decide to just use TSR. What if the game modifies TSR to tweak it for their specific content? What if they just want to make sure they QA the pixels from a specific path carefully and know that they are going to stay the same in the future? What if they just prefer the TSR tradeoffs to the DLSS/XeSS/FSR2 tradeoffs?

But it's just not okay to "expect" some piece of middleware to replace built-in methods in engines/games. It's certainly fine to ask, but it's also fine for a developer to say no, we're doing our own thing, thanks.

To be clear, I think DLSS2 has gotten pretty great and is a really interesting piece of technology. I have a lot of respect for the friends I have at NVIDIA that work on it and other stuff. I use it frequently in my own gaming as well. My complaints here have nothing to do with the technology itself.

Asking nVidia is strange; it wasn't nVidia who created Direct3D 26 years ago.
As I'm sure you are aware, it has been a heavily collaborative process between OSVs, IHVs, and ISVs over those years. The point is that if you truly feel a technique is so important that it should be included in every game, it belongs in the API, or in an open library implemented on top of it, and it should produce the same pixels across IHVs. That is, after all, the entire point of Direct3D.

Anyways, I think I've said my bit at this point. I get the frustration from a user perspective (I'm a gamer too, it turns out!), but hopefully this can give some perspective for you to step back and understand the at-least-equal frustration, from a graphics industry developer's perspective, at the behavior of the various companies here. Please don't continue to just eat up their narratives and encourage that behavior.
 
I wouldn't even mind this in many cases, assuming the developers were really just prioritizing making a good working port with limited resources and all. I've generally considered 'extra graphical options' on PC as a bonus rather than some intrinsic or required aspect of PC gaming. A lot of games being released on PS5/XSX already look very nice in my opinion, and extra horsepower on PC can still be put to use via higher resolutions or higher performance.

The problem here isn't just that the game lacks DLSS; it's that they are ostensibly being paid not to include it. This isn't a case of a developer hard up against the ropes, making a necessary sacrifice on features in order to make the best use of their limited time and resources. It's a large AAA developer who has even been given quite a bit of extra time by their publisher to make the game as good as possible. It's also a historically PC-centric developer, so it's not like they're just getting into doing PC ports after so many years of making console games.
True that. Also, Andrew mentioned that FSR 2.0 is good for developers because it's open source, so it's easy to implement. However, this contrasts with Todd Howard's words, when he says literally: "We have AMD engineers in our code base working on FSR (FidelityFX Super Resolution) 2.0 image processing and upscaling and it looks incredible. You're going to get the benefits of that obviously on your PC but also on Xbox. We're super excited and can't wait to show everybody more." This is PR, but if they are getting help from AMD engineers, they are adding complexity to the project, which defeats the purpose (this is obviously because they got paid not to use DLSS), since most developers don't have that luxury.
 
True that. Also, Andrew mentioned that FSR 2.0 is good for developers because it's open source, so it's easy to implement. However, this contrasts with Todd Howard's words, when he says literally: "We have AMD engineers in our code base working on FSR (FidelityFX Super Resolution) 2.0 image processing and upscaling and it looks incredible. You're going to get the benefits of that obviously on your PC but also on Xbox. We're super excited and can't wait to show everybody more." This is PR, but if they are getting help from AMD engineers, they are adding complexity to the project, which defeats the purpose (this is obviously because they got paid not to use DLSS), since most developers don't have that luxury.

It's not necessarily adding more complexity: AMD have a lot of experience with FSR, and that experience will be valuable. They say that AMD also have engineers in house helping with the multithreading of the game. This is potentially a very valuable thing for Bethesda to be getting from their tie-up with AMD. Better CPU performance would help everyone who plays the game, Nvidia and Intel users included.
 
Despite everyone alleging foul play by AMD, did anyone consider the thought that the other parties in these backroom deals simply agreed with AMD's opinion? The truth might be simpler than most would like to believe...

If FSR was 'good enough' for developers to agree with AMD, and they simply made the choice of their own volition to support only one open reconstruction method, then no stipulation to block DLSS as part of the dev program was needed.

If there were no agreement to block DLSS, then AMD's response to wccftech's question would have been as direct as Nvidia's: "We don't do that". Instead, it was a rather obvious attempt at deflection, hence why we're discussing this now.

Perhaps the PR person responding was just extremely bad at communicating? Sure, but in this case, the 'simple truth' would be that there's a reason they dance around an answer instead of delivering a very clear response.

Countering a developer's argument from an end-user perspective is hardly constructive, since it becomes both tiresome and deafening. Everyone's willing to talk about the user experience, but what about the developer experience?

This is not merely a technical discussion, however. This discussion wouldn't have gotten the attention it has if it were merely about developer preference. If someone comes in here claiming 'lazy devs' for not implementing DLSS alongside FSR and Andrew offers technical reasons why it's not such a simple copy-and-paste, that's perfectly reasonable. People are up in arms because AMD themselves are the ones who have reinvigorated the speculation that the lack of DLSS was due to a financial agreement with the publisher rather than a prudent technical choice by the developer. Andrew is not providing any clarity on that; rather, he is saying people are 'eating up their narratives', implying that their opinions on this are at least in part due to being manipulated.

You can advocate for an open-source framework for reconstruction while also being concerned about backroom deals that block a competitor's currently closed solution; in fact, I'd say it's weird that you would use the former to deflect from the latter.
 
I may be wrong, but the Digital Foundry topic doesn't seem like the place to have this discussion; this belongs in other sections and topics.

It's not technically a DF article, but key DF people have recently weighed in on the topic. It would be helpful to clarify whether DF social media / opinions are fair game in this thread before continuing.

Countering a developer's argument from an end-user perspective is hardly constructive, since it becomes both tiresome and deafening. Everyone's willing to talk about the user experience, but what about the developer experience?

I'm not sure how AMD partnerships and developer experience are related. Either way, developers are free to choose whatever software they want to include in their engine or app. The noise you're hearing is due to the perception that AMD is actively discouraging devs from using competing products. If the devs in question came out and said they think XeSS/DLSS suck, that would be a different story.

There’s an undertone here that nobody uses DLSS/XeSS because they want to and it’s all “bad actors” making them do it. That’s a pretty fantastic allegation given lots of paying customers do enjoy using those technologies.
 
People are up in arms because AMD themselves are the ones who have reinvigorated the speculation that the lack of DLSS was due to a financial agreement with the publisher rather than a prudent technical choice by the developer. Andrew is not providing any clarity on that; rather, he is saying people are 'eating up their narratives', implying that their opinions on this are at least in part due to being manipulated.
Just to make myself clear for posterity here, I'm not commenting at all on whether AMD has some deal or not that is "blocking" DLSS. I have no inside knowledge and wouldn't comment if I did. I've been on record in the past saying I dislike all forms of IHV deals with developers and think the two should remain at arms length as much as possible. Obviously I'm not in charge and these sorts of deals are very common, with AMD historically at least not being the largest offender. I'm all for people complaining about these things as long as they do it generally and consistently across brands.

You can advocate for an open-source framework for reconstruction while also being concerned about backroom deals that block a competitor's currently closed solution; in fact, I'd say it's weird that you would use the former to deflect from the latter.
To be clear, this is basically my position as noted above.

I don't consider my response a deflection; I'm responding specifically to the quotes DF and others have made to the effect of "it's ridiculous for a AAA game to not support DLSS/XeSS/etc in 2023", which I fundamentally do not agree with. As this is a pretty direct quote from Twitter, I think it is appropriate for this thread, as long as we all remain respectful. I realize this discussion is in response to the Starfield situation, but I think on B3D there's enough space for us to discuss these two related-but-not-the-same topics. We could create separate threads, but I think both topics are quite relevant to the past week of DF discussions.

That said, I imagine the majority just want to talk about the spicy AMD/Starfield-specific politics, which I have no interest in, so I'll exit unless anyone wants to talk about the broader "IHV middleware" issues :)
 
Why would it being around for some amount of time change anything about the fundamental argument that closed source stuff stifles innovation?
Does it? I've been a proponent of open source for 20 years now and never thought to say closed source stifles innovation. I like to think of open source as a slowly rising tide. Closed source is fine, but it can't stay still; it needs to innovate to stay afloat or get swallowed. DLSS is superior right now, but literally nothing prevents someone else from doing something better, and it's only a matter of time... unless it continues to innovate.
 
DLSS is superior right now, but literally nothing prevents someone else from doing something better, and it's only a matter of time... unless it continues to innovate.
That's not true, though, as DLSS uses proprietary driver interfaces that are not available to regular applications. This is precisely the problem with an IHV controlling this sort of middleware, and why people should be pushing for API standards that allow expressing it and other similar algorithms in a portable way.
 