Starfield to use FSR2 only, exclude DLSS2/3 and XeSS: concerns and implications *spawn*

So, AMD don't prevent them from implementing DLSS, they just can't release it. :p
 
An interesting technicality here could be that the agreement doesn't actually prohibit the technological implementation of DLSS but prohibits/restricts the mention of competitors and/or competing technologies.

This means the developer can implement the technological portion of DLSS, but they can't use the term DLSS (a competitor term), attribute it to Nvidia (a competitor), or use any other competitor terms (e.g. RTX). Unless Nvidia allows the implementation of DLSS under a different, non-trademarked term and without any attribution, this would in effect mean it can't be put into the end-user product.

In this sense AMD is technically correct. The developer is free to request and implement competitor technologies. AMD is not the one blocking anything per se; it's actually Nvidia and Intel that are blocking their own technologies.
 
I don't really understand what in Starfield would require some intense game logic or simulation. There are a bunch of standard brainless npcs walking around, but that's about it. I wouldn't say it has nearly as much going on as an assassin's creed game, or GTA/Red Dead. Plus the game has loading screens that segment the world. The only thing I can see that might require a lot of bandwidth is just streaming in objects and terrain as you move around, if it has a streaming system for each area. In terms of game logic/simulation, I just don't see much there.

Hardware is not always leveraged directly for the end user, but indirectly, by benefiting the developer. My educated guess (given what we know about the tools) is that BGS consciously places an emphasis on the content-authoring side, and always has had this philosophy: they will trade off end-user performance/graphics if it benefits content authoring, which in turn benefits the content side of the game.

They might all be open-world games on the surface, but the lower-level feel of the AC games and GTA/RDR is very different from that of BGS games.

Just to touch upon the example of the NPCs. They might not be asked to do much at that moment, but the underlying code is likely still being run. Is it the most performance-efficient philosophy? Likely not. But judging by how easily modders can alter NPC behavior compared to those other games, my guess is that's the trade-off for the content-authoring benefits.
 
In this sense AMD is technically correct. The developer is free to request and implement competitor technologies. AMD is not the one blocking anything per se; it's actually Nvidia and Intel that are blocking their own technologies.
That's a really pathetic perspective, and of course if that's AMD's argument then that's even worse. You can't claim you're not blocking when you know full well that the competitor's integration requires attribution.
 
That's a really pathetic perspective, and of course if that's AMD's argument then that's even worse. You can't claim you're not blocking when you know full well that the competitor's integration requires attribution.

We're analyzing legalese here, and in terms of plausible deniability I think it fits. I wouldn't be surprised if there's nothing in the agreement specifically addressing DLSS (or XeSS) at all. It's also common for sponsorship agreements to restrict the mention of competitors.
 
We're analyzing legalese here, and in terms of plausible deniability I think it fits. I wouldn't be surprised if there's nothing in the agreement specifically addressing DLSS (or XeSS) at all. It's also common for sponsorship agreements to restrict the mention of competitors.
I understand that, but AMD came out and publicly responded to this issue with "we're not blocking them", which is not usually done when you're hiding behind legal technicalities, placing the fault on the other party with full knowledge of how you structured your marketing agreement to block them.
 
Bethesda has effectively alienated 87% of its PC player base with this decision.


Meanwhile, people are downloading DLSS mods in droves: the 1st and 6th most-downloaded Starfield mods on Nexus are mods that replace FSR with DLSS.


Some have even cracked the DLSS3 mod, which the modder had posted behind a paywall!

 
Not surprised people are downloading the DLSS mod in droves. FSR2 has some crazy shadow-ghosting issues.


Ghosting issues overall.


Also particles.

 
Few more DLSS 3 FG demos:
 
Last edited by a moderator:
Starfield, the game that invites us to dream? Dream of what could be if it weren’t stuck in basic graphics. No ray tracing? Rasterizer graphics? I suddenly feel like I’ve been catapulted into a time capsule straight back to the late 2010s. And where the heck is DLSS? Not even Intel’s XeSS made it to the party. I mean, if neither Nvidia nor Intel are on the guest list, there should at least be some party hats and streamers. Bethesda and AMD obviously threw an exclusive party where only FSR 2, AMD’s answer to DLSS, was on the guest list. ‘Look, we have friends too!’ they proudly proclaim, presenting us with their bundle of Starfield and GPUs, while hoping we’ll just ignore the absence of the cool kids, DLSS and Ray Tracing.
...
Well, you could say it’s only fair because, in real life, the sponsor doesn’t always win, right? It’s like Pepsi financing a concert where all the guests are happily drinking Coca-Cola. The absurdity is so delicious that you can’t help but marvel—unless you privately bought a Ryzen 9 7950X3D instead of an Intel Core i9-13900K. And I can imagine AMD folks sitting in their offices wondering where it all went wrong. ‘We sponsored the game, we have AMD hardware in the bundle! Why, oh why, is Intel stealing the show?’ Naughty Intel! The answer, ladies and gentlemen, is complicated and technical, and probably involves a whole bunch of things only engineers understand. But at the end of the day, it’s just hilarious. For marketing and Intel buyers.

Maybe it’s also a lesson in humility for AMD. You can’t just slap your name on a game and expect automatic fame and recognition. Or perhaps it’s a lesson for all of us not to believe everything that’s on the glossy marketing brochure. That extends to FSR2. This AMD feature (sorry, dear fanbase) is so self-impressed that it reactivates itself at every opportunity. ‘You want to switch from medium to high details? Surprise! Here I am again! Gotcha,’ it practically screams from the depths of the settings. And then you have to deselect it. Every. Single. Time.

But wait, there’s more. The game apparently just missed the launch of FSR 3. How ironic! This could have been the big moment for AMD and Starfield, the red carpet on which they hand-in-hand stroll into a visually stunning future. But no, the carpet was rolled up and stashed back into the cyber-closet until 2077. The opportunity to really impress graphically was as elegantly missed as a meteorite narrowly passing Earth.

All in all, Starfield feels like a missed opportunity, a marriage proposal without a ring. Sure, the game has other qualities, but when it comes to graphics, it's like fireworks being set off in the rain. It fizzles and pops, but the big 'Ahs' and 'Ohs' are missing. So, all we can say is: Maybe next time, dear Starfield developers. Maybe next time. Once again…
 
HUB just posted a video testing over 40 CPUs, with a short snippet regarding memory speeds. Memory scaling is at 10:08.
Very interesting. Prior to the launch of Starfield I was second-guessing my decision to go with a 13700K, thinking perhaps I should have gone with a 7800X3D. It's a fairly marginal difference in Starfield, however, and on reflection I think I would have preferred the cooler-running AMD build.
 
HUB just posted a video testing over 40 CPUs, with a short snippet regarding memory speeds. Memory scaling is at 10:08.
I also saw the Gamer's Nexus CPU scaling video. This is a confusing game. I can't figure out what it scales with.
 
Star Wars Jedi: Survivor has released an official patch to support DLSS2.


Interesting timing, just after the recent AMD announcement... It also puts the total lack of fanfare about the DLSS implementation (which is a big deal) into context.
 