davis.anthony
But the PS5 looks to outpace a 2080 when using RT here, which is rarely, if ever, seen.
RT reflections on the PS5 are of lower spatial resolution, checkerboarded, and temporally rendered; they also come from a very low resolution BVH structure, so indeed it's a very light form of RT reflections that is more suitable for console hardware. I suspect what's happening here is that the RT implementation is very light, considering it's almost certainly a very similar implementation to the one in Spider-Man.
DLAA (not DLSS) is not superior to PS5 ITGI
DLSS renders at a lower resolution, then upscales the image using AI and applies AA using AI as well, to produce an image that is near, equal to, or better than native resolution + TAA, depending on the situation. DLSS is designed primarily to boost fps while maintaining image quality, or at least minimizing the loss.
DLAA renders the game at native resolution, then applies AA using AI to boost the quality to near supersampled levels, so it's universally better than native + TAA. DLAA is designed to boost image quality at the expense of fps. DLAA is the ultimate form of DLSS.
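For anyone unclear on the difference, the internal render resolution is really the whole story. Here's a rough sketch using the commonly cited DLSS 2.x scale factors at a 4K output (illustrative figures only, not taken from this game's implementation):

```python
# Rough illustration of internal render resolution per mode (not game-specific).
# Scale factors are the commonly cited DLSS 2.x presets; DLAA always renders natively.
OUTPUT = (3840, 2160)  # target output resolution

SCALE = {
    "DLAA":             1.0,    # native render + AI anti-aliasing
    "DLSS Quality":     2 / 3,
    "DLSS Balanced":    0.58,
    "DLSS Performance": 0.5,
    "DLSS Ultra Perf.": 1 / 3,
}

for mode, s in SCALE.items():
    w, h = round(OUTPUT[0] * s), round(OUTPUT[1] * s)
    print(f"{mode:17} renders {w}x{h} -> outputs {OUTPUT[0]}x{OUTPUT[1]}")
```

DLAA pays the full native-render cost and spends the AI pass purely on anti-aliasing, which is why it costs fps instead of gaining it.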
In R&C, even DLSS is providing a superior image to ITGI, so naturally DLAA provides an even better image than DLSS, at the cost of some performance (typically around 10% of the fps).
No, 16GB is not enough if you don't also have the VRAM on your GPU.
No, it's not loading small chunks of levels but the levels themselves, apart from the jump in the futuristic city on the flying dinosaur, which is not from any level in the game; that's why this small portion also loads faster on an HDD.
And it was always a given that with enough RAM you could achieve similar results, just at a higher price.
Where are you getting all this information from? You are introducing a flawed comparison within Alex's video. As you said, DLAA is rendering a native 4K image, while the performance mode is 1080p-1440p. That isn't showing the strength of DLAA; it's showing the inherent quality of the higher native pixel count.
And if you followed my back and forth with davis, you'd see that we are comparing the non-RT performance mode, which typically renders at 1440p-1800p (DF says it stays at 1800p most of the time) and, uncapped, goes up to 100fps, vs the 3070 at a locked native 1440p. At high settings, the 3070 achieves 40-50fps. At medium settings, it struggles to hit 60.
The PS5 is outperforming the 3070. Claiming DRS or DLAA as the reason for the PS5 outperforming it is ridiculous.
12 GB RAM + 9 GB VRAM?
For PC or PS5? Does anyone know how resolution works in uncapped VRR modes?
DLAA is actually superior to DLSS but costs a lot more performance. Yeah, you don't belong here typing all this nonsense.
"DLAA (not DLSS) is not superior to PS5 ITGI"
He says the PS5 runs 20% faster than his 2070. How does running 20% faster than an RTX 2070 mean it's closer to a 2080 Ti? The 2080 Ti is 60% faster than a 2070, and the 3080 is 2x as fast as a 2070; his math is all wrong.
"PS5 running much closer to a 2080TI and maybe even a 3080"
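Plugging that post's own relative-performance figures into a quick sanity check makes the gap obvious. A rough sketch, normalized to the 2070, using only the ratios quoted above (not benchmark data):

```python
# Relative performance normalized to an RTX 2070 = 1.0,
# using the figures quoted in the post above (the poster's numbers, not benchmarks).
r2070, ps5, r2080ti, r3080 = 1.0, 1.2, 1.6, 2.0

print(f"PS5 vs 2080 Ti: {ps5 / r2080ti:.0%}")   # ~75% of a 2080 Ti
print(f"PS5 vs 3080:    {ps5 / r3080:.0%}")     # ~60% of a 3080
```

On those figures, the PS5 lands at roughly three quarters of a 2080 Ti and 60% of a 3080, which is nowhere near "much closer to a 2080 Ti".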
PS5. Since there is no target, what happens? Does it just lock at the bottom of the DRS range?
"For PC or PS5?"
I'm not 'making excuses', I'm explaining that the situation we have right now is ultimately still pretty good for us as consumers. And again, calling this a 'terrible port' is just straight hyperbole. With such standards, you're unlikely to ever be happy with 95% of future demanding games. It's just not realistic, quite frankly.
I'm just a fan of having some perspective, that's all. I've got no special love for Sony or anything.
This is a rare example where we get to see a more accurate GPU comparison. The PS5's non-RT Performance Mode runs native 1440p at 80-100fps with VRR, and is obliterating this 3070 coupled with a Ryzen 3600.
Have you tried deleting the DirectStorage dll from the game files to force it to not use GPU Decompression (if indeed that works)?
(posting here instead of the DF thread where you originally asked due to that thread's current infestation)
Ok, here you go. And of course, after rendering out the second video after 40 minutes, I realized... wait, why in the hell did I record it with vsync? Duh. Well, regardless, the difference is still apparent, even if the absolute framerate would have been a more sensible metric. This at least has the advantage of showing the disparity in CPU usage when trying for the same peak framerate, I guess.
Here are two scenes, with and without those DS DLLs. The PC was rebooted after each run to ensure nothing was cached. Very High textures, High everything else, no RT. First mostly combat, then the familiar Rift sequence; very surprisingly, even in that sequence, DirectStorage only helped loading times very marginally, if at all.
Now here's a much smaller area, with High textures; I wanted to see if there was still a difference in an enclosed section, using a texture setting that still engages the GDeflate-packed textures but not the largest ones. Surprisingly, there's still a constant fps difference in favour of non-DirectStorage.
Now, these aren't the most exhaustive tests, and it's only on one GPU, without using RT, so I'm not VRAM-pressured with my 12GB 3060. As TechReport notes, cards with less VRAM suffer, but they do so in a linear fashion. If you run out of VRAM in this game, your performance will drop, but it doesn't really get those massive stutters that other games do, which is ideally how it should behave. Bear in mind this is not something entirely new that DirectStorage has brought; Doom Eternal also behaved this way. But it could mean that if I did this test on an 8GB GPU, I might see more stutters when not using DS.
Make of it what you will. Ideally, as I expected before launch, the option to switch between CPU and GPU DS decompression would be a visible toggle in the game, to take advantage of precisely this scenario: where you have the CPU headroom at your chosen settings to let the CPU fully handle decompression, but want to free up a few GPU cycles from the task. Hopefully this option will come in a future patch, or they (or MS, or Nvidia/AMD) may just optimize the GPU path further.
(This all of course assumes removing these DLLs actually removes DirectStorage support from the game entirely; it's possible an earlier version of DS without GPU decompression is sitting in the Windows system folders and the game just defaults to that? Dunno.)
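One way to sanity-check that rather than guessing: look at which DirectStorage DLLs the running game process actually has mapped. A rough sketch, assuming psutil is installed; the process name is my guess (adjust it to whatever Task Manager shows), and the DLL names are the standard redistributable ones:

```python
# Rough check of which DirectStorage DLLs a running process actually has loaded.
# Assumes psutil is installed (pip install psutil) and the script runs as the same user as the game.
import psutil

TARGET = "RiftApart.exe"                          # guessed process name, change as needed
DS_DLLS = ("dstorage.dll", "dstoragecore.dll")    # standard DirectStorage redistributable DLLs

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == TARGET.lower():
        try:
            paths = [m.path for m in proc.memory_maps()]
        except psutil.AccessDenied:
            print("Access denied - try running elevated.")
            break
        hits = [p for p in paths if p.lower().endswith(DS_DLLS)]
        if hits:
            print("DirectStorage DLLs loaded from:")
            for p in hits:
                print("  ", p)   # a System32/WinSxS path would suggest a system-folder fallback
        else:
            print("No DirectStorage DLLs mapped into the process.")
        break
```

If the game still maps a dstorage.dll from somewhere outside its own folder after you delete the local copies, that would point to the fallback scenario; if nothing is mapped at all, the DLL deletion really did disable it.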
He says the PS5 runs 20% faster than his 2070. How does running 20% faster than an RTX 2070 mean it's closer to a 2080 Ti? The 2080 Ti is 60% faster than a 2070, and the 3080 is 2x as fast as a 2070; his math is all wrong.
Shut all the noise above really quick haha gotta love it!
"PS5 running much closer to a 2080TI and maybe even a 3080"