Nvidia DLSS 1 and 2 antialiasing discussion *spawn*

Interesting comparison. The DLSS shot looked better for most of the video IMO, but it was clearly sharpened heavily to achieve that look. I quite like sharpening, hence my preference, but it fell apart quite badly in that last scene with the trees, where the DLSS shot was flickering quite a bit.
For DLSS Performance mode it was pretty reasonable, though I'll likely play at a higher-quality DLSS mode since my preference leans towards visual quality over performance.
 
1. Not a single piece of proprietary Nvidia technology has ever become an industry standard feature.
2. When developers have access to a single solution that works on ALL platforms DLSS will be redundant.
...because it's not, and likely NEVER was, intended to become an industry-standard feature. DLSS is a feature meant to benefit Nvidia GPU owners first and foremost. Nvidia partners with studios to integrate its proprietary features, and customers will keep asking and pushing for it... until something truly better comes along.

FSR ain't it... not yet. Intel's XeSS might be? But at this point DLSS is already integrated into all of the biggest engines out there.

By the time the industry standardizes a "DLSS"-type feature, Nvidia will have moved on to the next big thing that everyone will want AMD to copy and open-source so they can have it too...

And absolutely none of that is happening within the next 12 months...
 
Is it though? All of the biggest?

And per my other reply, there are likely some dev teams already working on such a solution, and they likely started a while ago.
 
Yes. All of the biggest engines.

This has to be like 90% of games made today right here...
Unreal 4/5
Unity

Then there's:
Decima
RAGE (Rockstar)
Northlight
4A Engine
Crystal Engine
Disrupt Engine
Snowdrop
Anvil Next
etc., etc.

I could go on...
 
When developers have access to a single solution that works on ALL platforms DLSS will be redundant.
That doesn't make any sense.

First, even if there were a single solution that works on ALL platforms, it's not guaranteed to work best on all of them - more likely it would be inferior in both performance and quality, since DP4a is not as fast as dedicated matrix multiplies, nor as precise as FP16, nor as mature.
So it's not enough to just work on ALL platforms; it has to be the best on ALL platforms to win the market.
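
For anyone unfamiliar with the distinction, here's a rough CUDA sketch (purely illustrative - a real XeSS/DLSS kernel looks nothing like this). DP4a packs four 8-bit values into each 32-bit operand, so precision is lost up front when inputs are quantized to int8, while the FP16 path keeps 16-bit floats throughout - and dedicated matrix hardware runs even faster than that FP16 loop:

```cuda
// Illustrative sketch only. Compile with: nvcc -arch=sm_61 dots.cu
#include <cuda_fp16.h>
#include <cstdio>

// DP4a path: each int packs four int8 lanes; __dp4a (SM 6.1+) does a
// 4-way int8 dot product with exact int32 accumulation. The precision
// hit comes from quantizing weights/activations down to 8 bits first.
__global__ void dot_dp4a(const int* a, const int* b, int* out, int n)
{
    int acc = 0;
    for (int i = 0; i < n; ++i)
        acc = __dp4a(a[i], b[i], acc);
    *out = acc;
}

// FP16 path: 16-bit floats keep much more of the signal. Dedicated
// matrix units (tensor cores) run this kind of math far faster than
// either scalar loop - that's the speed gap mentioned above.
__global__ void dot_fp16(const __half* a, const __half* b, __half* out, int n)
{
    __half acc = __float2half(0.0f);
    for (int i = 0; i < n; ++i)
        acc = __hfma(a[i], b[i], acc); // FP16 fused multiply-add
    *out = acc;
}

int main()
{
    // Smoke test: one packed int = four int8 lanes of value 1 each.
    int ha = 0x01010101, hb = 0x01010101, hout = 0;
    int *da, *db, *dout;
    cudaMalloc(&da, sizeof(int));
    cudaMalloc(&db, sizeof(int));
    cudaMalloc(&dout, sizeof(int));
    cudaMemcpy(da, &ha, sizeof(int), cudaMemcpyHostToDevice);
    cudaMemcpy(db, &hb, sizeof(int), cudaMemcpyHostToDevice);
    dot_dp4a<<<1, 1>>>(da, db, dout, 1);
    cudaMemcpy(&hout, dout, sizeof(int), cudaMemcpyDeviceToHost);
    printf("dp4a dot = %d\n", hout); // four lanes of 1*1 -> prints 4
    return 0;
}
```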

Second, since these are similar technologies, why would devs want to hurt the majority of their audience by shipping only a single solution on ALL platforms, instead of simply adding another solution alongside it?
It's the same situation as replacing your standard in-game spatial upscaler with FSR: it requires a few lines of code changes, plus probably a slightly tweaked DLSS DLL so that its inputs match XeSS's.
Unless there is a contract explicitly preventing the addition of competing technologies, a single solution similar to DLSS would only improve DLSS adoption.
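
To make the "few lines of code" point concrete: these temporal upscalers all consume roughly the same inputs - low-res jittered color, depth, motion vectors, jitter offsets - so an engine can hide them behind one thin wrapper. Hypothetical sketch, with made-up names (this is not any real SDK's API):

```cuda
// Hypothetical engine-side wrapper - every name here is invented for
// illustration. The point: backend differences live behind one call.
struct UpscaleInputs {
    void* colorLowRes;    // jittered low-resolution color target
    void* depth;          // depth buffer
    void* motionVectors;  // per-pixel motion vectors
    float jitterX, jitterY;
    int   renderW, renderH, outputW, outputH;
};

enum class UpscaleBackend { DLSS, FSR2, XeSS };

void DispatchUpscale(UpscaleBackend b, const UpscaleInputs& in, void* out)
{
    switch (b) {
    case UpscaleBackend::DLSS:
        // DLSS evaluate call via NGX would go here.
        break;
    case UpscaleBackend::FSR2:
        // FSR2 context dispatch would go here.
        break;
    case UpscaleBackend::XeSS:
        // XeSS execute call would go here.
        break;
    }
}
```

Once the first temporal upscaler is wired up and producing those inputs, each additional backend is mostly one more case in that switch.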
 
1. Not a single piece of proprietary Nvidia technology has ever become an industry standard feature.
CUDA says hi.

And beyond that, pretty much every piece of "proprietary Nvidia technology" has become an industry-standard feature, from vertex and pixel shaders (yes, those were developed by Nvidia) to the hardware ray tracing acceleration approach. It's in fact harder to think of the pieces that didn't.
 
NVIDIA was first to get consumer hardware with vertex and pixel shader support to market, but that's not the same as those features being developed by NVIDIA.
3dfx was the only one making PS 1_0 hardware, and Rampage would probably have beaten the GF3 to market had 3dfx not gone under - which would have made it first with VS support too.
ATi's R100, aka the original Radeon, wasn't far from qualifying for D3D pixel shader support either.
 
GeForce 256 had pixel shading hardware sometimes referred to as PS 0.5. Further developments were based on that.
 
R100 released less than six months after the GF256, so it definitely wasn't based on it. Rampage had been in development long before the GF256 too, even in its "final iteration" feature set.
 
So why does it take them longer to copy designs today than back then???
 
They got more complex? Everyone "copies" designs, by the way, because you don't really have a lot of options if you want to be compatible with DX and VK/OGL. Remember S3TC? That was "copied" by everyone.
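
(For anyone who doesn't remember it: S3TC is the block-compressed texture format family later standardized in DirectX as DXT/BC. "Copied by everyone" meant every GPU had to decode the exact same block layout in hardware. A rough sketch of the BC1/DXT1 block, for illustration:)

```cuda
#include <cstdint>

// S3TC/BC1 (DXT1): every 4x4-texel block is exactly 8 bytes. Any GPU
// claiming DX compatibility has to decode this same layout, which is
// why everyone ended up implementing (and licensing) it.
struct BC1Block {
    uint16_t color0;   // endpoint 0, RGB565
    uint16_t color1;   // endpoint 1, RGB565
    uint32_t indices;  // 2 bits per texel: one of 4 palette entries
};
// With color0 > color1, the four palette entries are:
//   c0, c1, (2*c0 + c1)/3, (c0 + 2*c1)/3
```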
 