Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

I understood that NXG was trying to make the same point. He also said that CBR is another method with a similar output (though worse at 400% zoom), and that they're broadly close. I think this is a totally fair comment to make.

Broadly close in the same way a bilinear upscale is broadly close to native, perhaps. Not broadly close for anyone who actually cares about a sharp and stable image, though.

Check out the Digital Foundry CB vs DLSS analysis for this game for a proper comparison.
 
the problem here is that NXG is presenting his opinion that DLSS is worse than console native as a fairly unambiguous fact

Native resolution is objectively better than upscaled resolution regardless of the scaling technique. At best DLSS looks better in parts of the image and worse in others.

If I had the choice of DLSS Quality at 4k or native 4k, everything else the same, then it'd be native every single time. Wouldn't you?
 
As noted a couple of times already, the ghosting can be effectively eliminated in this game by simply using a newer DLSS dll. If it couldn't, then I might agree with you, but it can. So, putting that aside, I don't disagree that DLSS has some unambiguous quality compromises vs native. But it also has some unambiguous quality improvements over native, the primary one for me, amply demonstrated in NXG's own video, being a more stable image; image instability is by far the worst side effect of low resolution/poor AA in my opinion.

So the question isn't whether DLSS can match native point for point, but rather which set of compromises results in a better overall image. And once the ghosting issue is resolved the answer to that question is DLSS imo. Others may feel differently but the problem here is that NXG is presenting his opinion that DLSS is worse than console native as a fairly unambiguous fact, helped along by the fact that he's ignored the now widely known fix for DLSS ghosting.
Firstly, the DLLs don't completely fix the problem in Death Stranding, and it's not a fix for all DLSS-enabled games. From my testing, ghosting is still a problem regardless of the DLL used.

Secondly, but more importantly, don't view my response as a defence of NX Gamer or the PS5. I simply responded because I felt you were trying to hand-wave away the evident flaws of DLSS. As far as I'm concerned, the sooner we can get meaningful technological advances in GPUs, the sooner we can move away from these crutches.
 
Broadly close in the same way a bilinear upscale is broadly close to native, perhaps. Not broadly close for anyone who actually cares about a sharp and stable image, though.

Check out the Digital Foundry CB vs DLSS analysis for this game for a proper comparison.

I've watched it, they're broadly similar.
 
Crutch: “a source or means of support or assistance that is relied on heavily or excessively”

DLSS is a crutch.

In that case, all of processing-constrained graphics is a "crutch". Rasterisation is a crutch. Polygons are a crutch. Compressed textures are a crutch. Pixels are a crutch.

Bemoaning a "crutch" and talking about "meaningful technological advances in GPUs" is naively thinking those advances won't include better "crutches" (as you define them), and then patting yourself on the back for it.

Native resolution is objectively better than upscaled resolution regardless of the scaling technique.

In some cases DLSS is actually more representative of what a natural image would show than when seen with 1, 2, 4, etc. sample points per display device pixel.

You can't actually say that "Native resolution is objectively better than upscaled resolution regardless of the scaling technique", because in some cases with ML upscaling this is not accurate and "native resolution" is worse, including in the mathematical sense of per-pixel colour representing what should actually be behind that pixel.
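To make that concrete, here's a minimal toy sketch (my own illustration, not from any of the videos discussed) of how one sample per display pixel can land further from the "true" per-pixel colour than an accumulation of jittered samples, which is the kind of estimate temporal/ML reconstruction works from:

```python
import numpy as np

# Toy 1-D "scene": a fine stripe pattern near the pixel grid's frequency,
# the classic case where one sample per pixel aliases badly.
def scene(x):
    return 0.5 + 0.5 * np.sign(np.sin(2 * np.pi * 0.45 * x))

pixels = np.arange(64)

# "Ground truth": the average of the scene over each pixel's footprint,
# i.e. what a heavily supersampled image converges to.
offsets = np.linspace(0.0, 1.0, 64, endpoint=False)
truth = np.array([scene(p + offsets).mean() for p in pixels])

# "Native": a single sample at each pixel centre.
native = scene(pixels + 0.5)

# Stand-in for reconstruction: average 8 jittered samples per pixel, roughly
# what temporal accumulation approximates across frames.
rng = np.random.default_rng(0)
jittered = scene(pixels[None, :] + rng.random((8, pixels.size))).mean(axis=0)

print("mean abs error, 1 sample/pixel:    ", np.abs(native - truth).mean())
print("mean abs error, 8 jittered samples:", np.abs(jittered - truth).mean())
```

That's obviously not DLSS itself; it's just the sense in which a single sample per pixel can be further from the image "that should be there" than a reconstructed estimate.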
 
In that case, all of processing-constrained graphics is a "crutch". Rasterisation is a crutch. Polygons are a crutch. Compressed textures are a crutch. Pixels are a crutch.

Bemoaning a "crutch" and talking about "meaningful technological advances in GPUs" is naively thinking those advances won't include better "crutches" (as you define them), and then patting yourself on the back for it.
Nice slippery slope. Could go skiing on it. Seriously though, it’s okay to have different opinions. Your thoughts on the matter won’t change my views. I see DLSS as a crutch and by the very definition of crutch, it is a crutch.
 
Nice slippery slope. Could go skiing on it. Seriously though, it’s okay to have different opinions. Your thoughts on the matter won’t change my views. I see DLSS as a crutch and by the very definition of crutch, it is a crutch.

Unfortunately, unless there is some paradigm change in how CPUs and GPUs are made (likely involving something other than the silicon-on-insulator tech that drives current computing devices), these "crutches" are the primary way forward for increased graphics rendering quality and complexity.

I don't like DLSS because of the limited hardware it can run on and the fact that anything less than Quality mode looks like ass to me and even Quality mode doesn't always result in a better experience. But the premise (need) behind its creation isn't going to go away anytime soon.

Thus, things like VRS, Checkerboard Rendering (and all other temporal rendering techniques), ML upscaling techniques (which DLSS is part of), etc. are all useful tools in an era where silicon scaling is rapidly grinding to a halt.

So, unless there's a paradigm change, things like ML upscaling (DLSS) are going to become increasingly needed and more prevalent. The hope is that quality of implementation, ease of implementation, and ubiquity across various hardware will improve, as unfortunately there aren't any terribly promising technologies close to commercial deployment that will replace SOI.
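For anyone unfamiliar with what checkerboard rendering actually does, here's a minimal toy sketch of the spatial half of it (my own simplification; real CBR also reuses the previous frame's opposite field, with ID buffers and motion vectors to reject bad history):

```python
import numpy as np

# Shade only half the pixels in a checker pattern, then fill the missing
# half from their horizontal neighbours.
def checkerboard_field(full_image, phase=0):
    h, w = full_image.shape
    mask = (np.add.outer(np.arange(h), np.arange(w)) % 2) == phase  # rendered pixels
    return np.where(mask, full_image, np.nan), mask

def reconstruct(field, mask):
    out = field.copy()
    # Every missing pixel has rendered pixels immediately left and right of it.
    # (np.roll wraps at the edges; a real implementation would clamp.)
    filled = 0.5 * (np.roll(field, 1, axis=1) + np.roll(field, -1, axis=1))
    out[~mask] = filled[~mask]
    return out

# Toy "frame": a smooth horizontal gradient with a hard vertical edge.
frame = np.tile(np.linspace(0.0, 1.0, 16), (16, 1))
frame[:, 8:] += 0.5

field, mask = checkerboard_field(frame)
approx = reconstruct(field, mask)
print("mean abs reconstruction error:", np.abs(approx - frame).mean())
```

In this toy example the error concentrates around the hard edge, which is also where half-resolution techniques tend to show their artefacts in practice.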

Regards,
SB
 
Nice slippery slope. Could go skiing on it. Seriously though, it’s okay to have different opinions. Your thoughts on the matter won’t change my views. I see DLSS as a crutch and by the very definition of crutch, it is a crutch.

There is no "slippery slope" in rendering. That's the whole point. It's all about best compromises, and the industry (or even individual developers) trying to find its / their way towards the best set of compromises for a given goal.

I know I won't change your views. They aren't based on the reality of making something that runs on machines people use.
 
I know I won't change your views. They aren't based on the reality of making something that runs on machines people use.

Or will use, any time soon. The cost of doing anything interesting at native 4K is just vastly too high; being able to reconstruct it over a few frames is an absolute requirement. Some games run 'native' but still upscale certain effects -- some games use quite poor reconstruction techniques, some games design the camera movement and art direction around the constraints of reconstruction, and some games use DLSS, which is at the moment the best option. Regardless, I think 'DLSS is unstable and ghosts' is an uncontroversial claim, and some people are a little quick to defend it -- if you want a clean-looking game, it's never going to flawlessly deliver that. However, your clean-looking game is also going to be stuck in 2014-era tech for a very long time.
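As a rough illustration of "reconstruct it over a few frames" (my own simplification; real TAA/DLSS-style pipelines also reproject the history with motion vectors and clamp it, which is exactly where ghosting comes from when that fails):

```python
import numpy as np

rng = np.random.default_rng(1)
truth = np.linspace(0.0, 1.0, 256)   # idealised fully-converged image (1-D for simplicity)

def cheap_frame():
    # One noisy, low-sample-count frame: the per-pixel estimate is far from truth.
    return truth + rng.normal(0.0, 0.2, truth.shape)

history = cheap_frame()              # history buffer starts as the first frame
alpha = 0.1                          # how much of each new frame is blended in
for _ in range(31):
    history = (1.0 - alpha) * history + alpha * cheap_frame()

print("mean abs error of a single frame:", np.abs(cheap_frame() - truth).mean())
print("mean abs error after 32 frames:  ", np.abs(history - truth).mean())
```

The catch is the "over a few frames" part: when things move, the history is wrong, and how aggressively you reject it is the sharpness-vs-ghosting trade-off being argued about here.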

What I do take issue with is the idea that RTX cards 'aren't ready' for raytracing. You don't need 16 samples per pixel at 4K in order to use an effect, and RT effects make an enormous difference in actual shipped games, even sometimes on the much weaker AMD cards. It's a little bit of a "five steps forward, one step back" thing, but that's how people have always made progress in graphics.

He is only human, and

He is human and completely untrained in how renderers work. DF makes mistakes too, but at least they put in an effort to make their big mistakes and omissions somewhat rare.
 
DF makes mistakes too, but at least they put in an effort to make their big mistakes and omissions somewhat rare.

Not only that, they'll admit when they make a mistake.

And the biggest thing to me is that they equally praise the PS consoles, Xbox consoles, and PC if they deserve it.

They also equally bash the PS consoles, Xbox consoles, and PC if they deserve it.

Sometimes PS consoles come out ahead in head to head comparisons. Sometimes Xbox consoles come out ahead. Sometimes PC comes out ahead.

I'm always extremely skeptical of any site where one platform is almost always praised while the other platforms almost never get any praise. That's generally a clear sign of bias, either intended or unintended. This has never been more evident than this generation, when both console platforms are so close to each other in performance and capabilities.

I'm a PC gamer, but the experience in games on PC isn't always better than the experience on console. So, any site that would try to cater to me by saying PC is always better is immediately on my "take with a huge pile of salt" list. Same goes for any site that almost always says games on PS consoles are better or any site that almost always says that games on Xbox consoles are better.

Regards,
SB
 
There is no "slippery slope" in rendering. That's the whole point. It's all about best compromises, and the industry (or even individual developers) trying to find its / their way towards the best set of compromises for a given goal.

I know I won't change your views. They aren't based on the reality of making something that runs on machines people use.
No, your argument was a slippery slope logical fallacy. Rendering is about making compromises, and that was never in dispute. You took my comment where I called DLSS a crutch and then ran a mile with it to a pointless and unintended end, with the sole purpose of justifying your disagreement with my use of the word crutch. If you don't like the definition of the word, take it up with Oxford and Merriam-Webster.

Trying to spin my argument as not based on reality is disingenuous at best and malicious at worst. As GPUs have gotten more powerful, we have almost always moved away from crutch rendering techniques created to bridge the inadequate power of the hardware. We move to techniques with less compromise than before. We moved from bilinear/trilinear filtering to anisotropic filtering, for example, and in the same vein, the industry will move away from DLSS when the time is right.

Anyway, you've made it awfully clear that you have no interest in discussing actual rendering. Instead, you've chosen to dwell on semantics and fallacious arguments. Please refrain from quoting me in the future, as I have no interest in intellectually dishonest discussions.
 
Unfortunately, unless there is some paradigm change in how CPUs and GPUs are made (likely involving something other than the silicon-on-insulator tech that drives current computing devices), these "crutches" are the primary way forward for increased graphics rendering quality and complexity.

I don't like DLSS because of the limited hardware it can run on and the fact that anything less than Quality mode looks like ass to me and even Quality mode doesn't always result in a better experience. But the premise (need) behind its creation isn't going to go away anytime soon.

Thus, things like VRS, Checkerboard Rendering (and all other temporal rendering techniques), ML upscaling techniques (which DLSS is part of), etc. are all useful tools in an era where silicon scaling is rapidly grinding to a halt.

So, unless there's a paradigm change, things like ML upscaling (DLSS) are going to become increasingly needed and more prevalent. The hope is that quality of implementation, ease of implementation, and ubiquity across various hardware will improve, as unfortunately there aren't any terribly promising technologies close to commercial deployment that will replace SOI.

Regards,
SB
Technology advances in leaps, followed by periods of stagnation, and then the process repeats itself. In 100 years, the use of DLSS and many rendering techniques used today will be non-existent due to technological advances. We always move to techniques with less compromise as technology advances.

With regard to the prevalence of DLSS, as long as it's not open source, its period of relevance is drastically limited. It'll eventually be replaced by an open-source equivalent at some point, and we're already seeing evidence of that with Intel's proposed solution. I don't think DLSS is useless. It's quite useful, but it has its very evident flaws. I guess I take strong offence to people parading around spewing out the marketing speak of their favourite hardware manufacturer. I'm not saying you're doing that, but certain people here are quite guilty of it.

Finally, how we categorize what is and is not a crutch is subjective. It's fine to have disagreement, and I'm perfectly okay with that. What I won't tolerate is intellectually dishonest arguments which aim to mischaracterize the comments of users with the sole aim of winning internet points. It's why I called out function in my previous post, because it is the lowest form of argument. It's something people resort to when they don't understand how to have a proper cordial discussion. Those types of arguments belong to the realm of Facebook, Twitter and GameFAQs, not here.
 
I'm not sure why people are bashing him for his overclocked RTX 2070 vs RTX 2070 Super comments.

TechPowerUp has the RTX 2070 Super at

9% faster than an RTX 2070 at 1080p
12% faster than an RTX 2070 at 1440p
12% faster than an RTX 2070 at 2160p

An RTX 2070 at a 2 GHz core clock (like NXGamer says his is clocked to) is enough to bring it within 5-6% of an RTX 2070 Super.

So his claims are not far off in all honesty.
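Back-of-the-envelope, that maths roughly checks out, assuming performance scales with core clock (the stock boost figure below is my assumption, not something NXG stated):

```python
# Rough sanity check of the "within 5-6%" claim. The stock boost clock is an
# assumed typical real-world figure for an RTX 2070, and perfect scaling of
# performance with core clock is an optimistic simplification.
typical_stock_boost_ghz = 1.90   # assumed real-world boost of a stock RTX 2070
oc_ghz = 2.00                    # the ~2 GHz overclock NXGamer describes
super_lead = 0.12                # 2070 Super ~12% ahead of a stock 2070 (TechPowerUp, 1440p/4K)

clock_gain = oc_ghz / typical_stock_boost_ghz - 1.0            # ~5% more core clock
remaining_gap = (1.0 + super_lead) / (1.0 + clock_gain) - 1.0  # assumes perf ~ clock
print(f"extra clock from the OC:   {clock_gain:.1%}")
print(f"remaining 2070 Super lead: {remaining_gap:.1%}")       # roughly 6%
```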
 
As noted a couple of times already, the ghosting can be effectively eliminated in this game by simply using a newer DLSS dll.

There won't be many PC gamers dropping .dll files into games to increase the IQ.

I'm guessing the vast majority will be using the .dll that comes with the game, so it's fair for NXG to compare that rather than what is essentially a 'user mod'.
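For what it's worth, the "user mod" amounts to backing up the game's shipped DLSS library and dropping a newer nvngx_dlss.dll over it, something like this (the paths are hypothetical; the actual install location varies per game and store):

```python
import shutil
from pathlib import Path

# Hypothetical paths for illustration only; the real install directory varies.
game_dir = Path(r"C:\Games\Death Stranding")
newer_dll = Path(r"C:\Downloads\nvngx_dlss.dll")   # newer DLSS library obtained separately

shipped = game_dir / "nvngx_dlss.dll"
shutil.copy2(shipped, shipped.with_name("nvngx_dlss.dll.bak"))  # keep the original
shutil.copy2(newer_dll, shipped)                                 # drop in the newer version
print("Swapped DLSS library; restore the .bak to revert.")
```

Trivial for enthusiasts, but it's still a mod most players will never apply, which is the point being made above.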
 
I'm not sure why people are bashing him for his overclocked RTX 2070 vs RTX 2070 Super comments.

TechPowerUp has the RTX 2070 Super at

9% faster than an RTX 2070 at 1080p
12% faster than an RTX 2070 at 1440p
12% faster than an RTX 2070 at 2160p

An RTX 2070 at a 2 GHz core clock (like NXGamer says his is clocked to) is enough to bring it within 5-6% of an RTX 2070 Super.

So his claims are not far off in all honesty.

It's 20% faster at 4K and 18% faster at 1440p. Regardless, it's not complicated. If you have a 2070 in your computer, you have a 2070. There's really no reason to do mental gymnastics like he does and try to bump your card up to the next tier because you applied some OC. He's doing this because it's one of his numerous ways to downplay computers and inflate whatever Sony product he's testing. I mean, just recently, his conclusion for Horizon, a game which you can run on PC at higher fidelity, higher framerate and higher resolution, with mouse and keyboard, was that the best way to play was on PlayStation. He's completely unqualified to do what he pretends he does, and he's not even trying to hide the Sony cheerleading that he does at every step.
 
It's 20% faster at 4K and 18% faster at 1440p. Regardless, it's not complicated. If you have a 2070 in your computer, you have a 2070. There's really no reason to do mental gymnastics like he does and try to bump your card up to the next tier because you applied some OC. He's doing this because it's one of his numerous ways to downplay computers and inflate whatever Sony product he's testing. I mean, just recently, his conclusion for Horizon, a game which you can run on PC at higher fidelity, higher framerate and higher resolution, with mouse and keyboard, was that the best way to play was on PlayStation. He's completely unqualified to do what he pretends he does, and he's not even trying to hide the Sony cheerleading that he does at every step.

Where are you getting your data from? TechPowerUp's data disagrees with your numbers.

And NXG never claimed to have a 2070S in his machine, nor have I. He's claimed that an overclocked 2070 is close to a 2070S, which it is.

You're letting your dislike for NXG blind you.
 
Where are you getting your data from? TechPowerUp's data disagrees with your numbers.

And NXG never claimed to have a 2070S in his machine, nor have I. He's claimed that an overclocked 2070 is close to a 2070S, which it is.

You're letting your dislike for NXG blind you.


https://www.computerbase.de/2019-07...er-test/2/#abschnitt_benchmarks_in_2560__1440

As always, it depends on the suite of tests and the areas benchmarked. I like to use ComputerBase because their graphs let you choose which card you want as the baseline by clicking on it, and all the rest are automatically scaled around it.
 