Starfield to use FSR2 only, exclude DLSS2/3 and XeSS: concerns and implications *spawn*

they won't be getting any money from me anymore; I've blacklisted AMD until the end of time.
That's what I meant. They do not help themselves.
And it's pretty hard to like any tech company, currently.

However, it's peanuts. Nothing compared to what's in front of us.
It looks like we're at Kurzweil's Singularity already. AI can code, and just scaling models up makes new abilities emerge. Artificial evolution might operate at an exponential rate very soon.
I doubt anybody can control something that is thousands, then millions, of times smarter than we are.
But if so, the next world war may be a civil war against those pretending to have control, and those are the tech megacorps.
I also doubt we could convince the artificial god that optimizing our standard of living is the priority.

Am I just crazy, or are bad sci-fi movies becoming real? Idk. Hopefully the former.
 
Hardware Unboxed blasts AMD on the subject of blocking DLSS, after asking AMD direct questions on the situation and receiving another "no comment" answer.

Hardware Unboxed also analysed the full list of AMD- and NVIDIA-sponsored titles from the last two years, and found that 67% of NVIDIA-sponsored titles get FSR support while only 25% of AMD-sponsored titles get DLSS; in their analysis, that stark difference points clearly to a malicious pattern on AMD's part.

 
That's what I meant. They do not help themselves.
And it's pretty hard to like any tech company, currently.

However, it's peanuts. Nothing compared to what's in front of us.
It looks like we're at Kurzweil's Singularity already. AI can code, and just scaling models up makes new abilities emerge. Artificial evolution might operate at an exponential rate very soon.
I doubt anybody can control something that is thousands, then millions, of times smarter than we are.
But if so, the next world war may be a civil war against those pretending to have control, and those are the tech megacorps.
I also doubt we could convince the artificial god that optimizing our standard of living is the priority.

Am I just crazy, or are bad sci-fi movies becoming real? Idk. Hopefully the former.
That's the biggest difference: the use of AI and how Nvidia has trained their AI models. I am upset with AMD 'cos I defended them in the past (though I was always an Intel person) and purchased the first Ryzen (a Ryzen 1500X in that case, along with the RX 570) and then the Ryzen 3700X.

Gotta admit that I am in love with technologies like DLSS. To me, anything that does smart things to save energy and gain performance is revolutionary. Heck, I'm now playing games on my 10-year-old 32" 1080p TV, which has superb image quality and color bit depth (no HDR or FreeSync, though) and consumes 40W at most. I can throw anything at the GPU at 1080p and it doesn't break a sweat. Maybe only games with path tracing give it trouble, but even ray-traced games run well without the GPU wattage going overboard (save for Diablo 2 Resurrected, which is an odd beast even though it doesn't have RT).
 
That's the biggest difference: the use of AI and how Nvidia has trained their AI models. I am upset with AMD 'cos I defended them in the past (though I was always an Intel person) and purchased the first Ryzen (a Ryzen 1500X in that case, along with the RX 570) and then the Ryzen 3700X.

Gotta admit that I am in love with technologies like DLSS. To me, anything that does smart things to save energy and gain performance is revolutionary. Heck, I'm now playing games on my 10-year-old 32" 1080p TV, which has superb image quality and color bit depth (no HDR or FreeSync, though) and consumes 40W at most. I can throw anything at the GPU at 1080p and it doesn't break a sweat. Maybe only games with path tracing give it trouble, but even ray-traced games run well without the GPU wattage going overboard (save for Diablo 2 Resurrected, which is an odd beast even though it doesn't have RT).
Nobody is suggesting you can't or shouldn't have preferences. It's just silly to announce a 'boycott' of AMD over this situation as if they are doing anything more egregious than Nvidia or Intel has done in the past in terms of anti-consumer moves. It's bad reasoning. If you just say, "I prefer Nvidia at the moment cuz they have better upsampling", absolutely nobody would have anything against such a stance. If you were to say, "I don't see myself buying AMD anytime in the foreseeable future cuz I think they will remain behind Nvidia technologically", that'd also be fairly reasonable (though we ARE ignoring pricing/value here...).
 
that'd also be fairly reasonable (though we ARE ignoring pricing/value here...).
I don't think AMD being competitive in the GPU space will benefit us consumers, going by how they priced up their CPUs after they had Intel by the balls. I don't think their prices would be any better than Nvidia's if the roles ever reversed, and if Intel managed to knock them both off the perch, I'm sure the price of their GPUs would no longer be the cheapest either. The days of AMD going for the throat via price, like with the HD 4000 series (RV770), seem to be a distant memory.
 
Nobody is suggesting you can't or shouldn't have preferences. It's just silly to announce a 'boycott' of AMD over this situation as if they are doing anything more egregious than Nvidia or Intel has done in the past in terms of anti-consumer moves.

Completely agree that a boycott over this is silly in the grand scope of 'corporations are not your friends', but I'm not so sure AMD's behavior here really has a direct comparison with Nvidia's past history - is there really a precedent for another IHV deliberately blocking developer choice in adopting a competitor's technology? I'm not so sure; can you give the closest comparison in your mind? This seems pretty unique, which is part of the reason it's got such traction.

We all expect egregious marketing and aggressive promotion of features with questionable value-adds tacked onto a game largely at Nvidia's behest with their partners, sure. That's different though: nobody gives a crap that AMD is working with EA; it's what they're preventing them from doing.

Part of the problem with looking at past history and trying to surmise whether Nvidia has done something similar is that the Radeon group has been hollowed out as a result of their dwindling market share; they simply haven't had the software/hardware engineering resources to actually offer developers much wrt custom software solutions like they did in the days of TressFX and TruForm. So I don't know, maybe Nvidia has baked something similar into their contracts in the past, but whenever I've seen the argument that "Nvidia has done similar stuff", so much of the reasoning ultimately seems to be "I see the Nvidia logo on games a lot".

I don't think AMD being competitive in the GPU space will benefit us consumers, going by how they priced up their CPUs after they had Intel by the balls. I don't think their prices would be any better than Nvidia's if the roles ever reversed, and if Intel managed to knock them both off the perch, I'm sure the price of their GPUs would no longer be the cheapest either. The days of AMD going for the throat via price, like with the HD 4000 series (RV770), seem to be a distant memory.

Those days of truly cheap components seem to be a distant memory for just about everyone at this point, unless you're an SSD manufacturer. AMD's competition in the CPU space has had a direct effect on Intel's CPU prices and core counts in general, however; I don't see how that can be denied. Ryzen may not be the bargain-basement option it was in the past, sure, but AMD lit a fire under Intel and got us out of the 4-core quagmire.

When people wish for competition in the GPU space, they don't want either company to have the other 'by the balls' - that situation is what you have in CPUs right now, because Intel just can't really compete cost-effectively. If/when they get their process nodes under control and start producing desktop CPUs with reasonable wattages again, you'll see fiercer competition. Regardless, even now this is leagues better than the GPU situation - there are numerous well-performing ~$250 CPU offerings from both camps. GPU pricing, relative to every other component in a modern gaming PC, is still the one area that has seen the most significant inflation.
 
Completely agree that a boycott over this is silly in the grand scope of 'corporations are not your friends', but I'm not so sure AMD's behavior here really has a direct comparison with Nvidia's past history - is there really a precedent for another IHV deliberately blocking developer choice in adopting a competitor's technology? I'm not so sure; can you give the closest comparison in your mind? This seems pretty unique, which is part of the reason it's got such traction.
Just because there isn't a very specific example of this exact same thing doesn't mean there haven't been similar enough situations of one company using deals to basically compromise the experience for those using the competitor's GPUs, in which case there are very much examples to bring up (which I won't here, to keep things on track).
 
Just because there isn't a very specific example of this exact same thing doesn't mean there haven't been similar enough situations of one company using deals to basically compromise the experience for those using the competitor's GPUs, in which case there are very much examples to bring up (which I won't here, to keep things on track).

There's a massive chasm in my mind between giving engineering resources to a studio to add an additional graphical effect that runs best on your GPU architecture and never would have existed in the game without that marketing deal, and putting in contract language that specifically restricts commonly used features, though.

For example, I highly doubt that WB Games was about to develop a GPU-agnostic, physics-based volumetric smoke effect for the PC version of Arkham Knight until Nvidia came along and offered their GameWorks version, which effectively locked Radeon GPUs out of that feature. That addition of Nvidia-optimized code did not prevent a well-established, easily integrated similar feature that everyone would normally have expected from being included. There's a big difference; it's literally the basis of this controversy, which AMD is fueling because they won't answer a direct question. Nobody was trying to get quotes from Nvidia on adding a higher-tier PCSS shadow effect to a game; there's no mystery there.

(mods: Maybe the whole discussion around IHV marketing deals can be spun off into a new thread?)
 
There's a massive chasm in my mind between giving engineering resources to a studio to add an additional graphical effect that runs best on your GPU architecture and never would have existed in the game without that marketing deal, and putting in contract language that specifically restricts commonly used features, though.

For example, I highly doubt that WB Games was about to develop a GPU-agnostic, physics-based volumetric smoke effect for the PC version of Arkham Knight until Nvidia came along and offered their GameWorks version, which effectively locked Radeon GPUs out of that feature. That addition of Nvidia-optimized code did not prevent a well-established, easily integrated similar feature that everyone would normally have expected from being included. There's a big difference; it's literally the basis of this controversy, which AMD is fueling because they won't answer a direct question. Nobody was trying to get quotes from Nvidia on adding a higher-tier PCSS shadow effect to a game; there's no mystery there.

(mods: Maybe the whole discussion around IHV marketing deals can be spun off into a new thread?)
Man, I really didn't want to get into this. But Nvidia pushing devs to implement specific graphics techniques that only work well on Nvidia hardware is really not that different a situation, and it seems like a huge stretch to suggest otherwise. Especially when they've done it for things that weren't just some specific Nvidia-special option, like with their tessellation mayhem. You're again getting super hung up on needing some example to be the EXACT SAME as this one, when that shouldn't be the main point.

I just rabidly hate when people try and push good guy/bad guy narratives about large corporations like this. And this forum in particular seems to have some fairly extreme Nvidia brand warriors who work overtime to try and downplay all the lousy things they do. I'm not even some Nvidia hater, either. My last three GPUs have been Nvidia and I think they make great products.

AMD is rightfully getting raked over the coals for the current situation, but Nvidia is not above this kind of crap whatsoever.
 
Man, I really didn't want to get into this. But Nvidia pushing devs to implement specific graphics techniques that only work well on Nvidia hardware is really not that different a situation,

Of course it is. The specifics of why it's different were spelled out in my post. It's not a 'hang-up'; it's actually recognizing the details of this situation and why it's stoking such controversy.

for things that weren't just some specific Nvidia-special option, like with their tessellation mayhem.

Give me an example of this 'tessellation mayhem'; what does this mean? OK, this is at least an attempt at a comparison, but I honestly don't know what it's referring to.

This isn't about what Nvidia is 'theoretically' capable of. It has to be framed that way because we don't have evidence of an actual parallel of Nvidia blocking a competitor's technology, one that is easily achievable and expected to be included in games. The specifics matter in this case.

I mean, you literally said the same thing in another thread my man!

Well yea, but the point is that Nvidia isn't trying to compromise the experience of others when doing these deals. I feel the main difference is simply that Nvidia wouldn't feel the need to when their solution is simply superior. Because don't think for a second they'd be above doing something like this if put in the same situation as AMD.

That's all my point is: there definitely is a distinction with a difference. Believing this is, in fact, 'more egregious' does not mean Nvidia isn't the amoral business entity they are, and yes, they may be 'capable' of such behavior; it's just recognizing why this story has gotten the legs it has in this moment. Saying what Nvidia might do in an alternate market reality is quite a different statement from saying this is actually just history repeating itself:

It's just silly to announce a 'boycott' of AMD over this situation as if they are doing anything more egregious than Nvidia or Intel has done in the past in terms of anti-consumer moves. It's bad reasoning.

Like I said, I don't think something like Streamline exists because Nvidia is benevolent - they're in a position where it simply benefits them; they're not scared of developers adding in FSR because they know they have the better technology. But there is a difference in degree in the here and now - like you just said.
 
Completely agree that a boycott over this is silly in the grand scope of 'corporations are not your friends', but I'm not so sure AMD's behavior here really has a direct comparison with Nvidia's past history - is there really a precedent for another IHV deliberately blocking developer choice in adopting a competitor's technology? I'm not so sure; can you give the closest comparison in your mind? This seems pretty unique, which is part of the reason it's got such traction.
Is there a game that uses Hairworks that also uses TressFX? What about PhysX and Havok/Bullet/Tokamak? The only thing that's really new here is that the technologies exist as mostly an interchangeable component that accepts the same inputs and outputs an upscaled result. I'm sure TressFX and Hairworks work completely differently, for comparison. But I don't think I've ever seen a game support both.

About the only vendor-sponsored features I can think of, other than the upscalers, that I've seen together in games are HBAO+ and CACAO.
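
To put the "same inputs, upscaled output" point in concrete terms, here's a minimal sketch of the idea (hypothetical names and types, not any vendor's actual SDK): DLSS2, FSR2, and XeSS all consume low-res color, depth, motion vectors, and camera jitter, and all emit an upscaled image, so a renderer can hide them behind one interface.

Code:
// Hypothetical sketch, not any vendor's real SDK: all three upscalers take
// the same per-frame inputs and produce an upscaled image, so they can sit
// behind one interface and be chosen at startup.
#include <memory>

struct UpscalerInputs {
    const void* color;          // low-res color target (API-specific handle)
    const void* depth;          // depth buffer
    const void* motionVectors;  // per-pixel motion vectors
    float jitterX = 0.0f;       // sub-pixel camera jitter used by the
    float jitterY = 0.0f;       // temporal accumulation step
};

class IUpscaler {
public:
    virtual ~IUpscaler() = default;
    virtual void evaluate(const UpscalerInputs& in, void* upscaledOut) = 0;
};

enum class Backend { DLSS2, FSR2, XeSS };

// A real backend would wrap its vendor SDK's init/evaluate calls; stubbed
// here so the sketch stands alone.
class StubUpscaler final : public IUpscaler {
    void evaluate(const UpscalerInputs&, void*) override { /* vendor SDK call */ }
};

std::unique_ptr<IUpscaler> createUpscaler(Backend /*which*/) {
    return std::make_unique<StubUpscaler>();  // real code branches per vendor
}

Hairworks and TressFX never shared an input/output contract like that, which goes a long way toward explaining why you don't see games toggling between them.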
 
The only thing that's really new here is that the technologies exist as mostly an interchangeable component that accepts the same inputs and outputs an upscaled result.

I disagree that this is the 'only thing', but even still - this is pretty much the lynchpin of the outrage in the first place! This is precisely why there was suspicion that it was management interference rather than just developer preference to begin with; the expectation of a feature being included is what touched this investigation off. Nobody is saying "Where the fuck is my TressFX?!" when a new game drops. This is precisely what separates something being hand-waved away as whiny fanboys from reasonable expectations. It's not a mere bullet point; it's the foundation.

I'm sure TressFX and Hairworks work completely differently, for comparison. But I don't think I've ever seen a game support both.

Well yeah, I mean the former goes a long way to explaining the latter; at the very least it makes it far more difficult to assume there is something contractual going on rather than the cold hard realities of development time. If there were, say, UE plugins for Hairworks/TressFX and we had 50+ games using both with distinct advantages for each architecture, and games sponsored by one vendor suddenly stopped giving the option to the detriment of that vendor's users, then a similar outrage would be justified. But it's just not really comparable: Hairworks/TressFX have been in just a handful of games over the years, they are not expected features, and there's no indication it's a reasonable expectation for developers to let users toggle them as distinct options. There's no history of that here.

As I've said, if implementing a competing reconstruction option were akin to adding one to a game that had none at all, where each is a month+ of work, then it would be far easier to assume games would eventually just default to the reconstruction tech that supports the widest number of users out of the gate. I'm sure some Nvidia users would still be annoyed and complain, but it would be far more understandable, and the announcement wouldn't have generated these kinds of investigations. The past history of AMD-sponsored games, especially from smaller developers, would look less like an incriminating breadcrumb trail and more like devs choosing the more prudent option wrt development time.

We don't have the same expectation for replacing physics engines/hair rendering because we have no evidence those things are easily swappable; we have ample evidence that this is true for reconstruction, though.

As a Nixxes developer said wrt integrating competing reconstruction technologies as they exist today: "There's really no excuse".
 
Nobody is saying "Where the fuck is my TressFX?!" when a new game drops.
When I had an AMD GPU, I did. Looking at both the graphics and performance of Witcher 3 on Nvidia cards with Hairworks on, I really wished there was something between Hairworks off and on, again in both performance and image quality. It's part of the reason I upgraded from an AMD card to Nvidia: PhysX and Hairworks. They were locked to a vendor, and there wasn't an option for something close when you had a card from the other vendor.
 
When I had an AMD GPU, I did. Looking at both the graphics and performance of Witcher 3 on Nvidia cards with Hairworks on, I really wished there was something between Hairworks off and on, again in both performance and image quality. It's part of the reason I upgraded from an AMD card to Nvidia: PhysX and Hairworks. They were locked to a vendor, and there wasn't an option for something close when you had a card from the other vendor.

Man, I'm glad I wasn't the only one thinking that. And even with a GTX 1070, I was wishing that that game offered TressFX, because it was far less demanding on the system than Hairworks. TressFX might have actually been usable in-game, versus the slideshow the game became with Hairworks (what a piece of dung, IMO).

I sometimes wonder if NV deliberately made Hairworks perform badly so that NV users would have to upgrade their graphics cards in order to use it.

Regards,
SB
 
About the only vendor-sponsored features I can think of, other than the upscalers, that I've seen together in games are HBAO+ and CACAO.

I'm not sure if you want to consider these vendor-sponsored or even vendor-specific, but -

FXAA (promoted by Nvidia at the time) and MLAA (promoted by AMD at the time) did appear in games together, along with SMAA (third-party). Albeit in this case none of these solutions specifically favoured either vendor, nor were they limited to either vendor, even though there was a marketing association with both.

DX11 Nvidia-specific CPU-overhead optimizations (e.g. command lists, deferred contexts) do appear in games that support Mantle and, later, early DX12. At that point there was still essentially a marketed vendor preference for each API.
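
For anyone who hasn't touched those APIs, a rough sketch of the D3D11 deferred-context pattern (error handling trimmed): worker threads record command lists, and the main thread replays them on the immediate context. Nvidia's DX11 driver was notably better at accelerating this path at the time, which is what made it a de facto vendor-preferred optimization.

Code:
// Rough sketch of D3D11 command lists / deferred contexts: record on a
// worker thread, replay on the main thread's immediate context.
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Worker thread: record commands on a deferred context instead of the
// immediate context, then hand back the finished command list.
ComPtr<ID3D11CommandList> RecordWork(ID3D11Device* device) {
    ComPtr<ID3D11DeviceContext> deferred;
    ComPtr<ID3D11CommandList> cmdList;
    if (SUCCEEDED(device->CreateDeferredContext(0, &deferred))) {
        // ... issue draw/state calls on `deferred` here ...
        deferred->FinishCommandList(FALSE, &cmdList);
    }
    return cmdList;
}

// Main thread: replay the recorded commands on the immediate context.
void Submit(ID3D11DeviceContext* immediate, ID3D11CommandList* cmdList) {
    if (cmdList)
        immediate->ExecuteCommandList(cmdList, TRUE);
}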

There have also been sponsored games that include the other vendor's technologies, and games in which both vendors' marketed technologies appear, though generally only when they don't overlap in functionality.

In terms of the above discussion, though, I think we do need context on how much implementing one vendor's specific technology precludes practically implementing the other's. The upscaling technology being discussed is really more akin to FXAA and MLAA than to, say, Hairworks and TressFX in terms of shipping both technologies.
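
To illustrate why that comparison holds (an illustrative sketch only, with made-up renderer helpers): post-process AA solutions like FXAA/MLAA/SMAA are each a single self-contained pass over the same resolved frame, so offering several side by side is nearly free, and that's also roughly the shape the upscalers take.

Code:
// Illustrative only, with hypothetical renderer helpers: each post-process
// AA is one self-contained pass over the same frame, so a user-facing
// toggle costs almost nothing - much like swapping upscalers.
#include <cstdio>

struct Texture {};  // placeholder for an API-specific render target

enum class AaMode { None, FXAA, MLAA, SMAA };

// Stand-ins for a real renderer's full-screen-pass and copy helpers.
void drawFullscreenPass(const char* shader, const Texture&, Texture&) {
    std::printf("running %s pass\n", shader);
}
void blit(const Texture&, Texture&) { std::printf("plain copy\n"); }

// Every mode consumes the same input and writes the same output.
void runPostAa(AaMode mode, const Texture& sceneColor, Texture& out) {
    switch (mode) {
        case AaMode::FXAA: drawFullscreenPass("fxaa", sceneColor, out); break;
        case AaMode::MLAA: drawFullscreenPass("mlaa", sceneColor, out); break;
        case AaMode::SMAA: drawFullscreenPass("smaa", sceneColor, out); break;
        case AaMode::None: blit(sceneColor, out); break;
    }
}

int main() {
    Texture scene, output;
    runPostAa(AaMode::SMAA, scene, output);  // swapping techniques = one value
}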
 