Starfield to use FSR2 only, exclude DLSS2/3 and XeSS: concerns and implications *spawn*

Somehow I feel I can offer another practical reason (or excuse) for not supporting DLSS2.
No matter how easy the integration is, it's still another task on the table, and you still have to get people to test it before shipping (especially on PC, so probably across a bunch of cards). And even if we all say it's easy, things might just break sometimes -- maybe not even because of DLSS but because of their creaky engine code -- the point is, it's one more thing to do.
And since they already have a universal upscaling method, and the game has been delayed several times, the devs are probably death-marching toward the deadline right now. All they care about is getting one more thing off the bug list.

I'm not saying they shouldn't add XeSS and DLSS, just that there are reasons beyond the tech itself. I hope they add the other upscalers back after the devs get a well-deserved break.
 

Dude, this isn't some indie effort by a few guys in a garage strapped for cash and time. It's freakin' Bethesda, owned by Microsoft. Let's stick with reality here.

AMD wanted their name plastered on an AAA title, happily paid for it, and in turn gets to block objectively superior tech in the process. That's the only reality, and yet clearly so many here are living either in a bubble or being intentionally obtuse.

On a game this hyped, there would be a plethora of content comparing upscaling tech, RT performance, etc., in which AMD loses out. That's money wasted, so why even allow some of those comparisons in the first place?
 
I don't know what makes you think indie devs are the ones without cash and time. If anything, a big-budget team whose game has already been delayed several times is even more starved for both. The people who *can* delay the title are the business people, not the engineers, and the people who run the company aren't going to delay the game just to make time for something that isn't crucial to finishing the product.
Also, if I were one of the engineers, I probably wouldn't want to delay the game either. People have likely been overworked for ages and just want the game to be done ASAP. FH5 added TAA, DLSS and FSR quite a while after launch; I'd be happy to see the same thing happen with Starfield.
At least that's my limited observation of the game industry, as someone who has interned at a triple-A studio. Again, I'm really just guessing here. Maybe AMD made an evil deal, but I'm just trying to think of other reasons. Not everything is tech-related at the end of the day, and it's sad to see.
 
The proper thing to do would be for reviewers of the game to include the DLSS mod being developed in their image quality and performance comparisons. If the mod is found to greatly improve performance and IQ over what FSR can provide, that should be published as well, perhaps with a recommendation to use the mod if you own an Nvidia card that supports DLSS. At least this way a consumer watching or reading the review will know they could have better performance, whatever the devs' reason for not including support.
 
I'd love to see the mod maker force DLSS 3 on 30x0 and 20x0 cards to see if it performs well on those too
 
I think, at least here on B3D, we should refer to the specific technologies rather than Nvidia's stupid marketing names. In this case you're referring to frame generation.
 

So you're telling me Nvidia has DLSS 3.0 that doesn't support 20x0/30x0 cards, but DLSS 3.5's ray tracing reconstruction does support 20x0/30x0 cards? I mean, that's even easier to follow than Wii / Wii U and Xbox One / Xbox Series.
 
The proper thing to do would be for reviewers of the game to include the DLSS mod being developed in their image quality and performance comparisons. If the mod is found to greatly improve performance and IQ over what FSR can provide, that should be published as well, perhaps with a recommendation to use the mod if you own an Nvidia card that supports DLSS. At least this way a consumer watching or reading the review will know they could have better performance, whatever the devs' reason for not including support.

The problem is the mods aren't equivalent to proper integration, though. A modder doesn't have access to the render targets the way the developer does. This is why PureDark's DLSS mods fuck up in the Resident Evil series with depth of field, motion blur, bloom and lens distortion effects - that's a lot of intended artistic presentation you have to disable to avoid glaring artifacts, because he can't control when and where the jitter/upscaling is applied (which is also why overlays like the player inventory can end up blurred/jittery). You also have to force a negative LOD bias with Nvidia Inspector, since the game has no idea about reconstruction when choosing mip levels - which means Intel/AMD users are pretty much out of luck and stuck with poor texture quality, as there's no equivalent way to force a LOD bias on those GPUs that I'm aware of.

So while the mods can still be an improvement over a shitty FSR2 implementation (à la RE4 - which speaks to how far ahead DLSS is when even a broken implementation produces superior results), I routinely see people say "Well, modders will just mod it in." Yes, they will, and it will not be nearly as good as if the developer had included it in the first place.
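To make the integration point concrete, here's a minimal, illustrative sketch (not code from the game or the mod; the resolutions are assumed examples) of the two things above that a proper integration controls and an injector has to approximate: the negative mip/LOD bias that compensates texture sampling for the lower internal resolution, and the per-frame sub-pixel camera jitter that temporal reconstruction relies on:

```cpp
#include <cmath>
#include <cstdio>

// Halton low-discrepancy sequence: a common choice for the sub-pixel
// camera jitter that temporal upscalers (DLSS/FSR2/XeSS) all expect.
static float halton(int index, int base) {
    float f = 1.0f, r = 0.0f;
    while (index > 0) {
        f /= static_cast<float>(base);
        r += f * static_cast<float>(index % base);
        index /= base;
    }
    return r;
}

int main() {
    // Assumed example resolutions: 1280 internal -> 1920 output.
    const float renderWidth  = 1280.0f;
    const float displayWidth = 1920.0f;

    // Baseline negative mip bias so textures are sampled as if the game
    // were rendering at display resolution. An integrated upscaler sets
    // this on its samplers; the mod has to force it globally through a
    // driver tool like Nvidia Inspector instead.
    const float mipBias = std::log2(renderWidth / displayWidth);
    std::printf("texture LOD bias: %.3f\n", mipBias);  // ~ -0.585

    // Per-frame jitter offsets, centred on zero, added to the projection
    // matrix before scene rendering but excluded from UI and some post
    // effects -- exactly the placement control an external mod lacks.
    for (int frame = 1; frame <= 8; ++frame) {
        const float jx = halton(frame, 2) - 0.5f;
        const float jy = halton(frame, 3) - 0.5f;
        std::printf("frame %d jitter: (%+.3f, %+.3f)\n", frame, jx, jy);
    }
    return 0;
}
```

The log2(render width / display width) bias is the commonly cited baseline; vendor guidelines differ on an extra offset on top of it, which is exactly the kind of per-game tuning an external injector can't do.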

Again, I'm really just guessing here. Maybe AMD made an evil deal, but I'm just trying to think of other reasons. Not everything is tech-related at the end of the day, and it's sad to see.

Again, this is just silly. If AMD didn't force the issue, they would do what Nvidia did and come out and say so. There is no benefit to staying silent if you're not doing what you're being accused of. It hurts AMD and the Starfield developers; you can see any Starfield announcement on Twitter and YouTube being commented into oblivion over this. Being this intentionally naive at this point is just... weird.

Wasn't there a controversy with Nvidia and PhysX along the lines of Nvidia buying the company and then hobbling the CPU code, so the effects would run extremely poorly compared to running on Nvidia graphics cards?

I'm sure AMD wouldn't want to settle for "trust me bro" from Nvidia.

So then don't use Streamline. AMD themselves were advertising how easy it is to add FSR2 to games with existing DLSS support; the bulk of the work is implementing any temporal reconstruction method in the first place. Streamline just potentially makes that a little easier.
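For context on what "the bulk of the work" means here: once a renderer produces the handful of inputs any temporal upscaler needs, the vendor-specific call is mostly plumbing, which is the idea Streamline packages up. A rough sketch of that shape - this is a hypothetical interface for illustration, not Streamline's actual API or any vendor SDK:

```cpp
#include <cstdio>
#include <memory>

// The common inputs every temporal upscaler consumes. Once an engine
// produces these natively, supporting DLSS/FSR2/XeSS is mostly plumbing -
// which is also why an external mod, with no access to them, struggles.
struct UpscaleInputs {
    const void* color;          // jittered, pre-UI/pre-post-process scene color
    const void* depth;          // scene depth buffer
    const void* motionVectors;  // per-pixel motion vectors
    float jitterX, jitterY;     // this frame's sub-pixel camera jitter
    bool  resetHistory;         // true on camera cuts to avoid ghosting
};

// Hypothetical vendor-agnostic interface (not Streamline's real API).
class IUpscaler {
public:
    virtual ~IUpscaler() = default;
    virtual void evaluate(const UpscaleInputs& in, void* output) = 0;
};

// Stubs standing in for the real vendor SDK calls.
class DlssUpscaler : public IUpscaler {
    void evaluate(const UpscaleInputs&, void*) override { std::puts("DLSS"); }
};
class Fsr2Upscaler : public IUpscaler {
    void evaluate(const UpscaleInputs&, void*) override { std::puts("FSR2"); }
};

// The engine picks a backend once; the render loop stays identical.
std::unique_ptr<IUpscaler> makeUpscaler(bool hasTensorCores) {
    if (hasTensorCores) return std::make_unique<DlssUpscaler>();
    return std::make_unique<Fsr2Upscaler>();
}
```

The design point is that the engine-side work (jittered color, depth, motion vectors, history resets) is shared across DLSS2, FSR2 and XeSS; only the backend call differs.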
 
So you're telling me Nvidia has DLSS 3.0 that doesn't support 20x0/30x0 cards, but DLSS 3.5's ray tracing reconstruction does support 20x0/30x0 cards? I mean, that's even easier to follow than Wii / Wii U and Xbox One / Xbox Series.
DLSS3 is supported on the same range of cards as DLSS2, isn't it?
I thought it was just Frame Generation that wasn't.
I assume DLSS3.5 supports the same cards as DLSS2, including the new Ray Reconstruction.

I'm not sure what's confusing; it's just a single feature that isn't backwards compatible.

I guess it's confusing because people say DLSS2, which is a legacy version, when they mean just the image reconstruction part of DLSS3.

Sounds like you're complaining that they're making newer features available on older cards?
 
Therein lies the problem. Users (here included), articles, videos, etc. use the terms interchangeably, referring to the super resolution, the frame generation, or both. Which is Nvidia's fault, because they lump it all together.
 
You might be right, as I haven't followed how Nvidia presents it. I just remember that at release they specifically said DLSS2 is rolled into DLSS3.
If they don't specifically call it DLSS3 FG when that's what they mean, it's bad messaging.
 
I mean, the problem with Nvidia is the marketing of DLSS.

Reading this, what 30-series cards can run is just DLSS 1/2 rebranded as 3; the real new features of 3 are 40-series only.

Nvidia's marketing needs to be better. Technically, according to the image at the bottom of the linked page, even the 900 series is capable of DLSS 3, because of Reflex.
[attached image: Nvidia's DLSS 3 feature support table]
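Spelling that table out helps. A toy sketch of the support floors as Nvidia's own page presents them (tiers paraphrased from that table; the example card is an assumption):

```cpp
#include <cstdio>

// DLSS 3.x is a bundle of features with different hardware floors.
enum class GpuGen { Maxwell900 = 0, Turing20 = 1, Ampere30 = 2, Ada40 = 3 };

struct Feature { const char* name; GpuGen minimum; };

// Floors as commonly documented: Super Resolution and Ray Reconstruction
// on any RTX card, Frame Generation on the 40 series only, and Reflex
// all the way back to the 900 series.
constexpr Feature kDlss3Bundle[] = {
    {"Super Resolution",   GpuGen::Turing20},
    {"Ray Reconstruction", GpuGen::Turing20},
    {"Frame Generation",   GpuGen::Ada40},
    {"Reflex",             GpuGen::Maxwell900},
};

int main() {
    const GpuGen myCard = GpuGen::Ampere30;  // e.g. an RTX 3080
    for (const Feature& f : kDlss3Bundle) {
        std::printf("%-17s %s\n", f.name,
                    myCard >= f.minimum ? "supported" : "not supported");
    }
    // A 30-series card "supports DLSS 3" for everything except Frame
    // Generation - which is exactly why the branding confuses people.
    return 0;
}
```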
 
Guess for me I don't see it as a problem; they're all components that make up DLSS3, and they let each component run on the cards that support it.
But given how DLSS started, it would've been better to pick a different overall brand name by now, so I do appreciate where you're coming from.

So I'd say normal discussions should refer to DLSS FG or DLSS image reconstruction, unless it's in the marketing and PR threads.
 
The problem is the mods aren't equivalent to proper integration, though. A modder doesn't have access to the render targets the way the developer does. This is why PureDark's DLSS mods fuck up in the Resident Evil series with depth of field, motion blur, bloom and lens distortion effects - that's a lot of intended artistic presentation you have to disable to avoid glaring artifacts, because he can't control when and where the jitter/upscaling is applied (which is also why overlays like the player inventory can end up blurred/jittery). You also have to force a negative LOD bias with Nvidia Inspector, since the game has no idea about reconstruction when choosing mip levels - which means Intel/AMD users are pretty much out of luck and stuck with poor texture quality, as there's no equivalent way to force a LOD bias on those GPUs that I'm aware of.

So while the mods can still be an improvement over a shitty FSR2 implementation (à la RE4 - which speaks to how far ahead DLSS is when even a broken implementation produces superior results), I routinely see people say "Well, modders will just mod it in." Yes, they will, and it will not be nearly as good as if the developer had included it in the first place.

Again, this is just silly. If AMD didn't force the issue, they would do what Nvidia did and come out and say so. There is no benefit to staying silent if you're not doing what you're being accused of. It hurts AMD and the Starfield developers; you can see any Starfield announcement on Twitter and YouTube being commented into oblivion over this. Being this intentionally naive at this point is just... weird.

So then don't use Streamline. AMD themselves were advertising how easy it is to add FSR2 to games with existing DLSS support; the bulk of the work is implementing any temporal reconstruction method in the first place. Streamline just potentially makes that a little easier.
Thank you for your well-written response. I hadn't considered that a mod could break a game visually, so there may not be a good reason to perform such tests if the overall IQ will be diminished.
 
I'd love to see the mod maker force DLSS 3 on 30x0 and 20x0 cards to see if it performs well on those too
I too was under the impression that DLSS 3 is incompatible with Nvidia GPUs before the 40x0 series. They have some confusing marketing.
 
Yeah, I mean the part the 20/30 series is compatible with is just DLSS 2.0 renamed and bundled into 3.0, but the major new feature, optical-flow frame generation, isn't supported.

So sure, this is great marketing for Nvidia because it tricks people into believing they can use all of DLSS 3.0 when in reality they can't, and that makes it poor marketing and naming for consumers.
 
Avatar doesn't support DLSS (and XeSS) - the PC feature trailer doesn't mention them.

/edit: correction:
AMD FSR Technology, Resolution and Ratio Adaptability

Avatar: Frontiers of Pandora will utilize AMD's FSR 2 technology so players can fine-tune visual quality and performance, and it will be able to handle most resolutions and aspect ratios, including ultra-wide screens. The game will also support DLSS at launch.

 