Next gen lighting technologies - voxelised, traced, and everything else *spawn*

Red Dead Redemption 2 running at max settings 1440p, with no MSAA or TAA, and dropping to 45fps on an insanely OC'ed 2080Ti (2.1GHz) and a 9900K!

Meet the new Crysis of our time.


Max settings add a lot of luxuriously extreme graphical features, such as increased resolutions for AO, reflections, shadows, GI, and ray-marched light shafts, yet the fps cost is too high and there isn't a really significant visual upgrade over consoles. I am wondering whether replacing all of this with ray tracing would have benefited the image quality more at the same hit to fps?

IMO, this really doesn't bode well for pure Rasterization compared to hybrid Ray Tracing/Rasterization.
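
To put the cost of something like the ray-marched light shafts in perspective, here is a minimal generic sketch of such a pass (just an illustration, NOT RDR2's actual implementation). The per-ray sample count and the resolution of the buffer it runs over are exactly the knobs a max preset cranks, and the cost scales roughly linearly with both.

Code:
# Minimal sketch of ray-marched light shafts (volumetric in-scattering).
# Generic illustration only, NOT RDR2's implementation. 'num_samples' and the
# resolution of the buffer this runs over are the usual quality knobs.
import math

def light_shaft(ray_dir, sun_dir, max_depth, num_samples, density=0.02):
    """March along the view ray, accumulating in-scattered sunlight."""
    step = max_depth / num_samples
    transmittance = 1.0
    scattered = 0.0
    for i in range(num_samples):
        # A real pass would test a shadow map at the sample position here;
        # assume the whole ray is lit for this sketch.
        lit = 1.0
        transmittance *= math.exp(-density * step)
        scattered += lit * transmittance * density * step
    # Simple phase term: shafts are brightest looking toward/away from the sun.
    cos_theta = sum(r * s for r, s in zip(ray_dir, sun_dir))
    return scattered * 0.75 * (1.0 + cos_theta * cos_theta)

# Cost per frame ~ buffer_width * buffer_height * num_samples, so doubling the
# buffer resolution (4x pixels) and the sample count (2x) is ~8x on this pass.
print(light_shaft((0.0, 0.0, 1.0), (0.0, 0.3, 0.95), max_depth=100.0, num_samples=64))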
 

This is not necessarily an example of rasterization not scaling visually at reasonable performance costs. It may well be that, as with most PC ports, the devs just allow you to scale the effects that can be increased by simply ratcheting up the resolution, without much engineering effort. I'd say we will know better after we start seeing some of the games built from the ground up for the new consoles, since they will almost surely be primarily rasterized.
 
Yea, sort of shows why consoles can keep up. Ultra settings are really shit performers. Medium is clearly where it's at for bang for buck.
I think you showed some videos of that in the past.

Looking forward to the DF comparison entry, undoubtedly X1X will be the baseline comparison.
 
While I agree with your point about PC ports just letting you ratchet up effect resolutions, it seems like a chicken-and-egg scenario. If it were optimized to run well on ultra, it wouldn't be ultra; it would be medium, and something higher would be ultra. By definition, the point of maximum optimization is where the best performance intersects the best graphical quality, but that does not mean it is the best graphical quality at all. Ultra is, by definition, the best graphical quality, so you should by definition be taking heavy performance hits to get there.

Without knowing the visual/performance curves for each feature, it also seems slightly misleading to say that ultra could have run better, when really all we're saying is that ultra could have been pared back from further graphical quality to obtain better performance. So it's not that ultra isn't optimized; it's just that the developers set that area further out of reach.

Having said that, where I agree with you is that how a feature is implemented will affect its visual/performance curves. The implementation will change the optimization points, some skewing better towards low and perhaps others towards ultra.

I think ray tracing is a good example of this: its performance curves (when hardware accelerated) are certainly better in the uber-ultra range compared to competing algorithms.
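
A throwaway sketch of that curve argument, with completely invented numbers (not measurements of any real game or technique):

Code:
# Toy version of the visual/performance-curve argument. All numbers are
# invented purely to show the shape of the comparison; they are not
# measurements of any real technique or game.
quality_levels = ["low", "medium", "high", "ultra"]

# Hypothetical frame cost (ms) for a raster/screen-space effect whose cost
# ramps steeply as resolutions and sample counts are cranked for
# diminishing visual returns...
raster_cost_ms = {"low": 0.5, "medium": 1.0, "high": 2.5, "ultra": 7.0}

# ...versus a hardware-accelerated ray-traced equivalent with a higher entry
# cost but a flatter slope toward the top end.
rt_cost_ms = {"low": 2.0, "medium": 2.5, "high": 3.5, "ultra": 4.5}

for q in quality_levels:
    cheaper = "raster" if raster_cost_ms[q] < rt_cost_ms[q] else "RT"
    print(f"{q:>6}: raster {raster_cost_ms[q]:.1f} ms vs RT {rt_cost_ms[q]:.1f} ms -> {cheaper} cheaper")
# Where the crossover lands (here, between high and ultra) is exactly why a
# preset can look well or badly "optimized" depending on where you sit on the curve.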

There needs to be a heavy paradigm shift in the PC space, as PC gamers have gotten way too used to thinking ultra is the baseline. This may have been caused by the last generation lasting too long, coupled with this generation starting too low in the power spec; but I recall that back in the earlier days it was expensive to hit those taxing ultra numbers.
 
This whole discussion of "My gpu should be able to run this game on high/ultra" reminds me of this:


Low, medium, high, and ultra are relative. One game's low may be another game's medium. That's pretty much the case with RDR2. You can't look at a game's performance numbers for high and decide whether it's well optimized unless you really understand what high means.
 

Has anything been published on the lighting and shadow implementations in RDR2?

It won the GDC award for “best technology” but I can’t find anything on the tech itself.

https://www.spieltimes.com/news/red-dead-redemption-2-wins-best-technology-award-at-gdc-2019/
 
Been playing Dreams and the lighting there is accumulated over about a second. Disable a light and its influence gradually fades.
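
Media Molecule haven't detailed the exact scheme as far as I know, but a simple exponential blend of each frame's lighting into a history buffer would behave exactly like that. A minimal sketch, assuming a blend factor of 0.05 at 60fps:

Code:
# Sketch of temporal light accumulation via an exponential moving average.
# Not Media Molecule's actual code; it just shows why a disabled light's
# influence would fade out over roughly a second instead of vanishing.
def accumulate(history, current, blend=0.05):
    """Blend this frame's lighting estimate into the running history."""
    return history * (1.0 - blend) + current * blend

lighting = 1.0           # converged value while the light is on
for frame in range(60):  # light switched off; ~60 frames at 60fps = ~1 second
    lighting = accumulate(lighting, current=0.0)
print(f"residual influence after ~1s: {lighting:.3f}")  # (1 - 0.05)**60 ~= 0.046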
 
Crytek's Neon Noir RT demo has been benchmarked ahead of its public release. The demo is running purely on compute for now. Once more, Vega cards are at a considerable disadvantage in RT workloads compared to Pascal and Navi (mirroring what happened in World of Tanks), while Turing remains superior to all of them.

-GTX 1080 is 20% faster than Vega 64
-RX 5700XT is 40% faster than Vega 64
-RTX 2060 Super is 15% faster than RX 5700XT

https://wccftech.com/neon-noir-crye...ic-ray-tracing-demo-tested-and-available-now/
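
For context, chaining those percentages (each quoted against a different baseline) gives roughly the following ratios. This is just back-of-the-envelope arithmetic on the figures above, not extra benchmark data:

Code:
# Back-of-the-envelope chaining of the quoted deltas (Vega 64 = 1.00):
vega64   = 1.00
gtx1080  = vega64 * 1.20     # "GTX 1080 is 20% faster than Vega 64"
rx5700xt = vega64 * 1.40     # "RX 5700 XT is 40% faster than Vega 64"
rtx2060s = rx5700xt * 1.15   # "RTX 2060 Super is 15% faster than RX 5700 XT"

print(f"RX 5700 XT vs GTX 1080  : {rx5700xt / gtx1080:.2f}x")  # ~1.17x
print(f"2060 Super vs Vega 64   : {rtx2060s / vega64:.2f}x")   # ~1.61x
print(f"2060 Super vs GTX 1080  : {rtx2060s / gtx1080:.2f}x")  # ~1.34x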
 

I think the demo will be using RTX eventually:

https://www.techspot.com/news/80004-performance-details-behind-crytek-rtx-free-neon-noir.html

Crytek says that once RTX support is implemented on the Neon Noir demo, the visuals can be boosted even further, all the way up to full-screen (perhaps black bar-free?) 4K rendering. We don't know what sort of performance RTX cards would allow for, but it should be notably better than what the Vega 56 was capable of considering raw throughput alone.
 
Hmmm. It seems like they're getting very usable results without needing RTRT hardware.
Remember, the demo is running medium reflections, which are quite laggy and blurry. Crytek said their implementation will be hardware accelerated through RT cores next year, which will enable them to increase the quality of reflections to ultra levels.
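
That fits the usual pattern for RT reflection presets, where lower tiers trade ray count and internal resolution for temporal reuse, which is where the lag and blur come from. A hypothetical preset table to illustrate the trade (invented numbers, not Crytek's actual settings):

Code:
# Hypothetical reflection-quality tiers (invented numbers, not Crytek's):
# fewer rays and lower internal resolution get compensated with more temporal
# reuse, which shows up as the lag and blur mentioned above.
presets = {
    "medium": {"rays_per_px": 0.25, "res_scale": 0.50, "history_frames": 8},
    "high":   {"rays_per_px": 0.50, "res_scale": 0.75, "history_frames": 4},
    "ultra":  {"rays_per_px": 1.00, "res_scale": 1.00, "history_frames": 2},
}

def relative_ray_cost(p):
    # Rays traced per output pixel ~ rays/px at the internal res * (res scale)^2
    return p["rays_per_px"] * p["res_scale"] ** 2

base = relative_ray_cost(presets["medium"])
for name, p in presets.items():
    print(f"{name:>6}: ~{relative_ray_cost(p) / base:.1f}x ray cost, "
          f"{p['history_frames']} frames of history reuse")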
 
It's odd to see parity between an RX 5700 XT and a GTX 1070; I wonder what that code does... (which bottleneck it hits).
 