Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

They (deservedly) get a lot of critique on this front, but OTOH I'm not aware of another channel that has devoted this much time to really examining DLSS/FSR2 in detail the way they have recently. Yeah, they should have done it sooner, but better late than never, especially if they back it up with actual evidence.
They have devoted more time downplaying DLSS than showing how it can help PC gamers. So, nope, they don't get any credit for telling us what we already knew since January 15th, 2020:
 
They have devoted more time downplaying DLSS than showing how it can help PC gamers.
That's just a straight up lie. Absolutely devoid of any truth at all.

I mean, here's another video from a year ago where in the very beginning, he says this:

"DLSS of course has many strengths and we recommend that Nvidia GPU owners use it more often than not".


Again, y'all are so eager to hate on them, you're literally inventing criticisms.
 
They have devoted more time downplaying DLSS than showing how it can help PC gamers. So, nope, they don't get any credit for telling us what we already knew since January 15th, 2020:

That's one game, and the vast majority of the video is focused on ray tracing and VRS, not DLSS.

Youngblood also has severe issues with motion blur and DLSS - not a significant regression, since you can disable it easily, but it gets back to my point about low-res buffer effects often being missed in these assessments. More to the point, motion artifacting in general wasn't analyzed thoroughly at all in these early DLSS comparisons, as they usually focused on still scenes - as is the case in this video, where one scene with the camera almost completely still is used to highlight the effectiveness of DLSS resolve. Of course, later DF videos greatly expanded this to compare motion resolve more thoroughly.

Every channel has looked at how DLSS performs in individual games from time to time; I'm talking about a comprehensive assessment of how it is progressing over a suite of games. I also said they have given it unreasonably short shrift at points in the past too, so your reply has little to do with my post.
 
It was the first game. It was better than anything else available on any other platform. It was a giant leap forward for PC gaming and introduced a mechanic which has been used going forward.
 
Going forward, I will pit 4K DLSS Perf more against native 1440p instead of 1080p, but you get the idea.

I definitely use 4K DLSS Perf over native 1080p every time too, it's no contest - but yeah, as your screenshots show, native 1080p has a significant performance advantage, and it can be greater in other games. DLSS has a cost, so it's only really relevant to compare it against bilinear upscaled modes that deliver the same performance; its internal rendering res really isn't that relevant.
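For reference, the internal resolutions behind these presets are easy to work out. A quick illustrative sketch (the helper and names are mine, not from any SDK; the scale factors are the widely documented per-axis ratios):

```python
# Illustrative helper: internal render resolution for common DLSS presets,
# using the widely documented per-axis scale factors.
DLSS_SCALE = {
    "Quality": 2 / 3,            # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "Performance"))  # (1920, 1080) - 4K DLSS Perf
print(internal_res(3840, 2160, "Quality"))      # (2560, 1440) - 4K DLSS Quality
```

So 4K DLSS Perf starts from the same pixel count as native 1080p, which is exactly why matched-performance comparisons are the fairer test.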

Also, to properly judge image quality you need to compare shots with some camera motion, as that's where the lower DLSS levels can break down. My gripe with DLSS at these lower settings isn't that its artifacts are merely similar to what you get at the native resolution it starts from - they can actually be worse. DLSS can amplify them at times. If it just produced them at the quality you'd get with native 1440p/1080p I wouldn't care much - a little blurrier, big whoop. It's only a problem when it produces artifacts you never see with regular upscaling, no matter what the starting res.

I'll check his fur later tonight using DLSS to down-sample and see how it looks.

It'll be a good test.

Also bear in mind these hair/fur artifacts only really occur with the High hair setting in Spiderman - DLSS just doesn't play well with its implementation. It even affects the PS5 in Quality mode too, albeit not nearly to the same degree, yet hair in Performance mode at a lower res is more stable than hair in Quality mode, precisely because Performance uses something close to PC's Medium setting vs. High (or slightly less) in Quality. You just get far more noticeable shimmer with it, to the point I don't think I'd ever go over Medium hair regardless of my GPU power. Medium's loss of fine detail is more than compensated for by its image stability imo.
 
It was the first game. It was better than anything else available on any other platform. It was a giant leap forward for PC gaming and introduced a mechanic which has been used going forward.

We're talking about Hardware Unboxed's coverage of DLSS in this thread, not the superiority of the PC platform, though I get that for some posters that's their raison d'être here.

Like, why are you trying to educate me about the history and benefits of DLSS? I've looked at it extensively in all games - hell, it's pretty much the only thing I ever post videos about on my channel - and I pretty much always use it when available. Outside of bitching about shader compilation, there isn't a single facet of PC rendering tech I devote more time to.
 
I definitely use 4K DLSS Perf over native 1080p every time too, it's no contest - but yeah, as your screenshots show, native 1080p has a significant performance advantage, and it can be greater in other games. DLSS has a cost, so it's only really relevant to compare it against bilinear upscaled modes that deliver the same performance; its internal rendering res really isn't that relevant.

Also, to properly judge image quality you need to compare shots with some camera motion, as that's where the lower DLSS levels can break down. My gripe with DLSS at these lower settings isn't that its artifacts are merely similar to what you get at the native resolution it starts from - they can actually be worse. DLSS can amplify them at times. If it just produced them at the quality you'd get with native 1440p/1080p I wouldn't care much - a little blurrier, big whoop. It's only a problem when it produces artifacts you never see with regular upscaling, no matter what the starting res.



Also bear in mind these hair/fur artifacts only really occur with the High hair setting in Spiderman - DLSS just doesn't play well with its implementation. It even affects the PS5 in Quality mode too, albeit not nearly to the same degree, yet hair in Performance mode at a lower res is more stable than hair in Quality mode, precisely because Performance uses something close to PC's Medium setting vs. High (or slightly less) in Quality. You just get far more noticeable shimmer with it, to the point I don't think I'd ever go over Medium hair regardless of my GPU power. Medium's loss of fine detail is more than compensated for by its image stability imo.
Thanks, but my comparisons are usually in movement as well (you can see Kratos running, or Ellie moving - that's why my comparisons are never properly aligned). I even did video comparisons. (Make sure to download them, as the Dropbox preview will be highly compressed.)
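If anyone wants to put a rough number on those motion comparisons, here's a crude sketch (Pillow and NumPy assumed; filenames are placeholders): the mean absolute per-pixel difference between two frames grabbed at the same point in matched clips. It's no substitute for eyeballing footage - intended camera motion inflates the number too - but on matched shots it can flag shimmer:

```python
# Crude temporal-stability check: mean absolute per-pixel difference between
# two frames. Real scene/camera motion also raises the score, so this is only
# meaningful when comparing matched clips. Filenames are placeholders.
import numpy as np
from PIL import Image

def mean_abs_diff(path_a: str, path_b: str) -> float:
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float32)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float32)
    return float(np.abs(a - b).mean())

print(mean_abs_diff("frame_0100.png", "frame_0101.png"))
```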

 
Also bear in mind these hair/fur artifacts only really occur with the High hair setting in Spiderman - DLSS just doesn't play well with its implementation. It even affects the PS5 in Quality mode too, albeit not nearly to the same degree, yet hair in Performance mode at a lower res is more stable than hair in Quality mode, precisely because Performance uses something close to PC's Medium setting vs. High (or slightly less) in Quality. You just get far more noticeable shimmer with it, to the point I don't think I'd ever go over Medium hair regardless of my GPU power. Medium's loss of fine detail is more than compensated for by its image stability imo.

I tried this last night (hair on the highest setting) with the DLSS 4K down-sample trick, and it matched the quality of native 1440p with the game's TAA, which I was a little surprised by, as normally it offers better AA than native. Still, it was a huge improvement over plain old DLSS Quality mode at 1440p.
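For what it's worth, the arithmetic lines up with that result, at least if the trick is run in Quality mode (an assumption on my part): a 4K DLSS Quality target reconstructs from an internal resolution equal to a 1440p display's native res.

```python
# Why 4K DLSS Quality downsampled to a 1440p screen can track native 1440p:
# with the standard 2/3 Quality ratio, the internal render resolution works
# out to exactly the display's native res.
out_w, out_h = 3840, 2160   # DSR/DLDSR output target
scale = 2 / 3               # DLSS Quality per-axis ratio
print(round(out_w * scale), round(out_h * scale))  # 2560 1440
```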
 
It was the first game. It was better than anything else available on every other plattform. It was a giant leap forward for PC gaming and introduced a mechanic which has been used going forward.
Not true, Youngblood was the second game to get DLSS 2.0. Deliver Us the Moon got it first in December 2019.

PS. the HUB hate here is absolutely ridiculous.
 
Not true, Youngblood was the second game to get DLSS 2.0. Deliver Us the Moon got it first in December 2019.

PS. the HUB hate here is absolutely ridiculous.
Both games got the update in December. And there is no hate for AMDunboxed. They have never considered DLSS a relevant feature for a GPU decision in the high end. Here is their conclusion from the 6800XT review:

2 1/2 years later, this "questionable" feature looks as good as or even better than native resolution in most cases.
 
These updates on how older cards are faring are nice but in reality no one should be considering a 3080 10GB today for $700 when the 4070 exists. Also isn’t the 3080 EOL and no longer being manufactured?
 
Yes, a new 3080 isn't relevant anymore. The 4070 is ~5% slower in 1440p and costs 150€ less in Germany... Unlike with AMD, it doesn't make sense to buy anything higher than a new 3060 on Nvidia's side if you don't want to buy a Lovelace GPU.
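To put rough numbers on that value argument (only the ~5% delta and the 150€ gap come from the post; the absolute prices are assumed examples):

```python
# Back-of-envelope perf-per-euro from the figures above. Absolute prices are
# assumed examples; only the 150 euro gap and ~5% perf delta come from the post.
price_3080 = 700.0              # assumed street price in euros
price_4070 = price_3080 - 150   # 150 euros cheaper per the post
perf_3080, perf_4070 = 1.00, 0.95

print(f"3080: {perf_3080 / price_3080:.5f} perf/euro")  # ~0.00143
print(f"4070: {perf_4070 / price_4070:.5f} perf/euro")  # ~0.00173
```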
 
Both games got the update in December.
Do you have a source for this?

Official Bethesda patch notes are from January 9th, 2020. SteamDB shows the exact same date for the game adding nvngx_dlss.dll.



Nvidia introduced the RTX features a couple of days earlier during their CES 2020 presentation. I don't see anything pointing to December 2019.
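On the DLL angle: if anyone wants to check which DLSS build a given game actually ships, one option is reading the version stamp out of nvngx_dlss.dll directly. A minimal sketch assuming the third-party pefile package and a placeholder path:

```python
# Print the file version embedded in a game's nvngx_dlss.dll.
# Requires `pip install pefile`; the path below is a placeholder.
import pefile

pe = pefile.PE(r"C:\Games\SomeGame\nvngx_dlss.dll")
fixed = pe.VS_FIXEDFILEINFO[0]   # a list in recent pefile releases
ms, ls = fixed.FileVersionMS, fixed.FileVersionLS
print(f"{ms >> 16}.{ms & 0xFFFF}.{ls >> 16}.{ls & 0xFFFF}")
```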

And there is no hate for AMDunboxed.
It's exactly this kind of childish bullshit that I mean. Hate, name-calling, call it whatever you want, doesn't really matter. The dogpiling on HUB on this forum has been going on for a good while and it's pretty distasteful imo.
 
And they have brought a certain amount of it on themselves with some of their testing and words.
Yeah, they flip-flopped a lot, and their messaging wasn't consistent at all.

First they claimed DLSS2 was good, but not that good; then they insisted it shouldn't factor into purchasing decisions; then they claimed FSR1 was close to DLSS2. Then, as DLSS was getting better and gaining wider support, they said DLSS2 was important and a key selling feature of RTX GPUs, then claimed FSR2 was also close to DLSS2 and a DLSS killer, then said they were only going to test games with FSR2, then backpedalled on that decision, and now they say DLSS2 is far superior.

They did the same with RT too: first refusing to acknowledge its importance, not factoring it into purchasing decisions, and ignoring testing it (outside of some curiosity tests); then featuring it bit by bit in their main reviews; and now they factor it into the overall results and conclusions.

They just need to be consistent, and not let the trends take them up and down.
 
These updates on how older cards are faring are nice but in reality no one should be considering a 3080 10GB today for $700 when the 4070 exists. Also isn’t the 3080 EOL and no longer being manufactured?
Yeah, they do have refurbished 3080s on Amazon (and on other sites for much less). The lowest price should be considered if price/performance comparison reviews are to be taken seriously.
 
It's exactly this kind of childish bullshit that I mean. Hate, name-calling, call it whatever you want, doesn't really matter. The dogpiling on HUB on this forum has been going on for a good while and it's pretty distasteful imo.
Maybe because they called something "questionable" which influenced a huge wave of new upscaling techniques that improve the life of a PC gamer. DLSS itself doesn't matter; what matters is the push towards better reconstruction techniques to make the most of the hardware. And we already learned in January 2020 that native rendering would no longer be the future.

HUB was one channel that was against upscaling because it would have given Nvidia a huge push. So they downplayed it as long as possible - see the 6800XT review.
 
HUB was one channel that was against upscaling because it would have given Nvidia a huge push.
The bolded part is blatant conjecture on your part, right? Or did they actually say they are against upscaling because they are anti-Nvidia?

Also, still waiting on your source for Youngblood DLSS 2.0 December 2019.
 