Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Hmm, that's an interesting data point.
8x the L2 gets you approximately the same performance with 50% of the memory bandwidth.

For each doubling of L2, Nvidia can effectively reduce the VRAM bandwidth to roughly 80% of the previous level.
It would be interesting to do some similar maths on AMD's Infinity Cache numbers and VRAM bandwidth, with the caveat that it's more like an L3 than an L2.

Although I strongly suspect it's not a fully linear relationship, and that some minimum VRAM bandwidth is still required.
"8GB L2 is limiting games" anyone? ;)

Related to the point made above, my initial thought was that no console will ever match the 4090 Ti,
but if we get a 5070 Ti with 384 MB of L2, then just maybe we could end up seeing a console with similar performance.
However, I suspect that would physically look a bit like an Epyc-X chip, and be priced accordingly.
 
How much does L2 cache matter? I hear they increased its amount by some crazy amount compared to the 3000 series.
It's a good supplement for bandwidth, but not a substitute for actual memory capacity. If the data ain't there due to limited memory size, more L2 (or L3) obviously isn't gonna help you.
 
It's a good supplement for bandwidth, but not a substitute for actual memory capacity. If the data ain't there due to limited memory size, more L2 (or L3) obviously isn't gonna help you.
This is pretty well illustrated by the relative placement of the new 4070 and the old 3080 10GB in the performance charts.
The 4070 is a few percent ahead of the 3080 at 1080p, pretty close to even at 1440p, and loses by a few percent, up to 10-12% in some titles, at 4K, in spite of the 3080's smaller VRAM pool.
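A toy model of why a big on-die cache substitutes for bandwidth but not for capacity (my own simplification; the hit rates and working-set size are made up for illustration):

```python
def vram_traffic_gb(bytes_touched_per_frame: float, cache_hit_rate: float) -> float:
    """GB per frame that actually have to cross the VRAM bus.

    A bigger L2 raises the hit rate, so less traffic reaches VRAM: that's the
    bandwidth substitution. But a miss can only be served if the data already
    sits in VRAM, so the hit rate does nothing for capacity pressure.
    """
    return bytes_touched_per_frame * (1.0 - cache_hit_rate) / 1e9

working_set = 2e9  # hypothetical 2 GB touched per frame
print(f"{vram_traffic_gb(working_set, 0.30):.1f} GB/frame over the bus (small cache, 30% hit rate)")
print(f"{vram_traffic_gb(working_set, 0.65):.1f} GB/frame over the bus (big cache, 65% hit rate)")
# If the assets exceed the VRAM pool entirely, they get paged over PCIe
# instead, and no amount of L2/L3 hides that.
```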
 
So it's basically memory bandwidth, hmm... is it cheaper or easier to add L2 cache than to simply add more raw memory bandwidth?
 
So it's basically memory bandwidth, hmm... is it cheaper or easier to add L2 cache than to simply add more raw memory bandwidth?
Eventually you need to go to VRAM, so I would say it's still pretty critical to have that.

Cost-wise, it gets more expensive the more lower-level cache there is.
 
Our understanding of technologies changes as the understanding of game technologies changes. They know the future is streaming, so they have to accommodate for that now. Previously, with HDDs being so slow, you couldn't count on them to perform what is being asked of them today.
A bit late to this, but streaming is not the future; it has been a cornerstone of AAA game technology for more than 20 years. RenderWare powering GTA III was the breakthrough middleware that demonstrated why streaming was so important. And it was doing this not with data read from HDDs, but from slow-arse DVDs.

You cannot blame Microsoft; they can only implement in software [Windows APIs] what the hardware supports and allows. There simply is no equivalent to the current-generation consoles' I/O design. And to implement that design in hardware would limit the PC.
 
You cannot blame Microsoft; they can only implement in software [Windows APIs] what the hardware supports and allows.
As one of the biggest organizations in the space, they're not 100% blameless — they definitely have a lot of influence on hardware and driver advancement (behind closed doors and in public committees) as the developers of both Windows and all of the Windows APIs — but I agree with your overall point.
 
A bit late to this, but streaming is not the future; it has been a cornerstone of AAA game technology for more than 20 years. RenderWare powering GTA III was the breakthrough middleware that demonstrated why streaming was so important. And it was doing this not with data read from HDDs, but from slow-arse DVDs.

You cannot blame Microsoft; they can only implement in software [Windows APIs] what the hardware supports and allows. There simply is no equivalent to the current-generation consoles' I/O design. And to implement that design in hardware would limit the PC.
Correct, sorry, I mean micro-streaming! (Thousands of small requests.) Sorry, yes, it's always been streaming since, as you say, slow-ass CDs.

All of our games have been streaming for some time.
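For anyone wondering what "micro-streaming" looks like in code terms, here's a minimal sketch: lots of small, overlapping read requests kept in flight rather than a few big sequential reads. The chunk size, file name and thread-pool approach are purely illustrative, not any engine's actual I/O path.

```python
import concurrent.futures

CHUNK = 64 * 1024  # hypothetical 64 KB tile / mip chunk

def read_chunk(path: str, offset: int) -> bytes:
    """One small request, e.g. a single texture tile or mip level."""
    with open(path, "rb") as f:
        f.seek(offset)
        return f.read(CHUNK)

def microstream(path: str, offsets: list[int]) -> list[bytes]:
    # Keep many small requests in flight at once; an HDD would thrash on the
    # seeks, while an NVMe SSD (and console I/O hardware) thrives on the
    # queue depth.
    with concurrent.futures.ThreadPoolExecutor(max_workers=32) as pool:
        return list(pool.map(lambda off: read_chunk(path, off), offsets))

# Usage (hypothetical asset pack):
# data = microstream("assets.pak", [i * CHUNK for i in range(4096)])
```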
 
Good video comparing upscaling vs native.


Tim makes a good point in the video that native's slightly better image quality isn't worth the huge performance cost over using DLSS in Quality mode for the average gamer.
 
Good video comparing upscaling vs native.


Tim makes a good point in the video that native's slightly better image quality isn't worth the huge performance cost over using DLSS in Quality mode for the average gamer.
The main takeaway for me from this video is that a lot of games have bad AA and that TAA is terrible, like I've said multiple times. With regards to whether the image quality is worth the performance cost, that really depends on the individual. For me, the better image quality is worth the cost.
 
I like that they showed some of the edge cases where DLSS/FSR break down (albeit a minority, considering there are far more in the games they tested), and not surprisingly it usually involves post-process effects. The massive aliasing you get in the GoW cutscene with the vegetation in the background, for example, is due to DoF being applied as a low-res effect that hasn't been compensated for with scaling. This is so often missed: DLSS Performance can have issues with moiré patterns, sure, but more often than not it's the cases where post-process effects scale disproportionately that stand out.

It can be done. Metro Exodus and Deathloop are games I play at 4K/DLSS Performance on my 3060, and they rarely suffer from these kinds of glaring artifacts that other games have. I'm sure there's a performance hit if you don't scale post-process effects, so maybe there should be a reconstruction-specific option in games going forward, but when they're not taken into account they really give a bad impression of reconstruction that could be remedied.
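A sketch of what that "reconstruction-specific option" could amount to: choosing whether resolution-sensitive post like DoF runs before or after the upscale. Pass names, resolutions and ordering here are illustrative, not any engine's real pipeline.

```python
# Two ways of ordering the post-process chain around a temporal upscaler.
# Resolutions are (width, height); the numbers are just examples.

def render(internal, output, post_at_output_res):
    passes = [("geometry+lighting", internal)]
    if post_at_output_res:
        # Upscale first, then run DoF at full output res: costs more,
        # but avoids the low-res DoF aliasing called out above.
        passes.append(("DLSS/FSR2 upscale", output))
        passes.append(("depth of field", output))
    else:
        # Cheaper, but DoF is now a low-res effect layered onto a 4K image.
        passes.append(("depth of field", internal))
        passes.append(("DLSS/FSR2 upscale", output))
    return passes

for name, res in render((1920, 1080), (3840, 2160), post_at_output_res=True):
    print(f"{name:>20} @ {res[0]}x{res[1]}")
```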

Minor quibble with the video, as it's somewhat outside of its scope, but I would have liked to see the equivalent performance setting for traditional scaling in some of those games too, to show what DLSS actually brings to the table vs. older methods at the same performance target.

The main takeaway for me from this video is that a lot of games have bad AA and that TAA is terrible, like I've said multiple times.

They found only one game (Death Stranding) where DLSS provided better image quality than native, so I'm not sure how you came to that conclusion (and even then, they didn't touch on motion blur or DoF in that game, where DLSS breaks down compared to native TAA). Their critiques focusing on image stability would be magnified substantially when using SMAA/MSAA, with the latter having a huge performance penalty in modern engines.

I mean, you can test it right now: enable SMAA in Spider-Man. It's just a sea of blinking pixels and moiré patterns on skyscrapers as you swing by; there's just too much sub-pixel detail that those AA methods can't touch. You need the temporal component to deal with shader aliasing.

And it took them only three years to reach this conclusion. Everyone who has used DLSS 2 since January 2020 knows that...

DLSS only recently, with DLSS 2.5+, made significant strides with ghosting, and as the video showed, there are still plenty of cases where it falls short of native. As they said, it can vary significantly per game, per DLSS setting used, and per output resolution. DLSS has improved quite a bit since 2020, and the implementations in games have improved too.

HUB have definitely had some dumb takes with regards to DLSS, either not factoring it into GPU reviews at all or downplaying it (especially when comparing it to FSR 1), but it's simply not some truism that it's been near-imperceptible from, or superior to, a game's TAA for years. There are just too many variables in settings and implementation quality to imply that's some obvious fact that some people have just chosen to ignore. It depends.

They (deservedly) get a lot of critique on this front, but otoh I'm not aware of another channel that has devoted this much time to really examining DLSS/FSR2 in detail like they have recently. Yeah, they should have done it sooner, but better late than never, especially if they back it up with actual evidence.
 
Don't know where else to put these, as I'm back on a 1440p monitor now (don't ask).

Using DLSS at 4K to provide down-sampling to 1440p is proving to be very effective, and so far I've yet to find a single game where it doesn't produce a better image than native 1440p in every way.

And as it's using 1080p as the base resolution, performance is way higher too.

My 1440p monitor natively accepts a 4K signal and then down-samples it to the panel's 1440p resolution; it does this so BenQ could advertise the monitor as being PS5 compatible (this monitor was released before PS5's 1440p firmware update).

Now, I'm not sure if you could use CRU to create a custom 4K resolution for a 1080p or 1440p monitor and then down-sample from there, or if you could enable 4K via Nvidia's Dynamic Super Resolution and down-sample that using DLSS.

These three screenshots show the difference and how much better using DLSS to down-sample from can look compared to native, while offering a crap-load more performance.

Open in a new browser tab and click between them.
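For reference, the pixel arithmetic behind that combo (DLSS Performance's standard 50%-per-axis scale factor; the 1440p figure is just the panel's native resolution):

```python
# DLSS Performance renders at half the output resolution per axis.
output = (3840, 2160)                        # 4K signal sent to the monitor
internal = (output[0] // 2, output[1] // 2)  # 1920x1080 internal render
panel = (2560, 1440)                         # the monitor then downsamples to this

px = lambda r: r[0] * r[1]
print(f"internal render: {internal}, {px(internal) / 1e6:.1f} MP")  # ~2.1 MP
print(f"native 1440p:    {panel}, {px(panel) / 1e6:.1f} MP")        # ~3.7 MP
print(f"{px(panel) / px(internal):.2f}x fewer pixels shaded than native 1440p")
```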
 

[Attachments: three A Plague Tale: Requiem screenshots]
That's always been the experience: native 1440p (or 1440p + DLSS) is largely inferior to 4K/DLSS Performance, despite the latter internally rendering at so-called "1080p".


It just works. And it needs VRAM. For me, 4K DLSS Performance by itself makes rendering at 1440p altogether obsolete (not the monitor, the render resolution itself). There really is no point wasting compute power on native 1440p or 1440p upscaling when 4K DLSS Performance exists. 9 out of 10 times this provides much, much better image quality, LODs, higher-quality textures, and much better clarity in motion, plus the added benefit of DLSS and its reconstruction over native TAA, while offering similar or better performance.

Also, you can use DSR/DLDSR with DLSS. You don't need the native monitor feature to utilize the combo.

Many on forums will brush off these comparisons by saying "it's just sharpening, bro". Well, it's not. There is literally extra texturing detail that is not even visible at native 1440p. It's not reconstruction either; the engine simply loads higher-quality assets when it is upscaling/outputting to 4K.

Look at Ellie's face; most of her facial detail is simply non-existent at native 1440p. It comes alive with 4K DLSS Performance.

The TLOU one was at 1440p, but I also did a lot of comparisons to show people how the "1080p" rendering of DLSS has nothing to do with actual 1080p at all.


This one is one of the most brutal.


Going forward, I will pit 4K DLSS Performance more against native 1440p instead of 1080p, but you get the idea.
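One plausible mechanism for that extra texture detail (hedging here, since the exact behaviour is per-engine): temporal upscalers typically ask the engine to apply a negative texture mip bias relative to the output resolution, so a 4K output target samples sharper mips than a native 1440p target does. A small illustration:

```python
import math

def upscaler_mip_bias(render_width: int, output_width: int) -> float:
    # Common guideline for temporal upscalers: bias texture sampling so mips
    # are selected as if rendering at the output resolution.
    return math.log2(render_width / output_width)

print(upscaler_mip_bias(1920, 3840))  # 4K DLSS Performance: -1.0 (one mip sharper)
print(upscaler_mip_bias(2560, 2560))  # native 1440p:          0.0 (no bias)
# So the 4K DLSS path samples texture mips as a 4K target would, while native
# 1440p samples them for a 1440p target; that alone is extra texel detail.
```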
 
I get both better IQ than native in pretty much all areas with DLSS and higher performance 👍
I still often deactivate DLSS because of the unstable image quality in movement. DLSS is perfect for screenshots, but in movement it can often be a flickering hell, where details are visible in one frame but disappear in the next.
I'm also not a fan of temporal AA, as it also introduces artifacts into a scene, but it is so far the only other solution for reducing shader aliasing and flickering.
 
Good video comparing upscaling vs native.


Tim makes a good point in the video that native's slightly better image quality isn't worth the huge performance cost over using DLSS in Quality mode for the average gamer.

Really good video. It's great that they are finally tackling this topic head-on, even though it does seem quite late given their past skepticism of upscaling. DLSS seems to mostly struggle with extremely fine detail where there's no substitute for increased resolution, like in Spidey Miles' shirt and the fur in his coat. This could be a bigger problem as textures become even more detailed in the future. For now, though, this should reassure folks who depend on upscaling, even in performance mode, that they aren't really losing much in IQ in return for a massive performance boost.
 
Good video comparing upscaling vs native.


Tim makes a good point in the video that native's slightly better image quality isn't worth the huge performance cost over using DLSS in Quality mode for the average gamer.
I would certainly always use DLSS Quality at the very minimum if it was available to me. Can't argue with that.
 
Really good video. It's great that they are finally tackling this topic head-on, even though it does seem quite late given their past skepticism of upscaling. DLSS seems to mostly struggle with extremely fine detail where there's no substitute for increased resolution, like in Spidey Miles' shirt and the fur in his coat. This could be a bigger problem as textures become even more detailed in the future. For now, though, this should reassure folks who depend on upscaling, even in performance mode, that they aren't really losing much in IQ in return for a massive performance boost.

I'll check his fur using DLSS to down-sample later tonight and see how it looks.

It'll be a good test.
 
HUB have definitely had some dumb takes with regards to DLSS, either not factoring it into GPU reviews at all or downplaying it (especially when comparing it to FSR 1), but it's simply not some truism that it's been near-imperceptible from, or superior to, a game's TAA for years. There are just too many variables in settings and implementation quality to imply that's some obvious fact that some people have just chosen to ignore. It depends.

They (deservedly) get a lot of critique on this front, but otoh I'm not aware of another channel that has devoted this much time to really examining DLSS/FSR2 in detail like they have recently. Yeah, they should have done it sooner, but better late than never, especially if they back it up with actual evidence.
HUB have literally always been positive about DLSS 2. And they've probably talked about/done videos about reconstruction stuff like this more than any other channel. I really feel like people are just inventing reasons to criticize them at this point.
 