Red Dead Redemption 2 running at max settings 1440p, with no MSAA or TAA, and dropping to 45fps on an insanely OC'ed 2080Ti (2.1GHz) and a 9900K!
Meet the new Crysis of our time.
Max settings present a lot of luxurious, extreme graphical features, such as increased resolutions for AO, reflections, shadows, GI, and ray-marched light shafts. Still, the cost in fps is too high, and without a really significant visual upgrade over consoles. I am wondering whether replacing all of this with ray tracing would have benefited the image quality more at the same fps hit?
IMO, this really doesn't bode well for pure Rasterization compared to hybrid Ray Tracing/Rasterization.
Yea, sort of shows why consoles can keep up. Ultra settings are really shit performers. Medium is clearly where it's at for bang for buck.
While I agree with your statement, it seems like a chicken-and-egg type scenario. If it was optimized to run well on ultra, it wouldn't be ultra, it would be medium, and something higher would be ultra. By definition, the maximum optimization is where the best performance intersects the best graphical quality. That does not mean it is the best graphical quality at all. Ultra is the definition of best graphical quality, so you should by definition be taking heavy performance hits to get there.

This is not necessarily an example of rasterization not scaling visually at reasonable performance costs. It may well be that, as with most PC ports, the devs just let you scale the effects that can be increased by simply ratcheting up the resolution, without much dev engineering effort. I'd say we will know better once we start seeing games built from the ground up for the new consoles, since they will almost surely be primarily rasterized.
Looking forward to the DF comparison entry; undoubtedly the X1X will be the baseline comparison.
There's this: https://forum.beyond3d.com/posts/2087411/
Has anything been published on the lighting and shadow implementations in RDR2?
I think that link doesn't work for people with a different posts-per-page setting. Could you link directly to the message in question, using the link on #xxxx in the bottom right corner? Like https://forum.beyond3d.com/posts/2087411/ for your post.
Crytek's Neon Noir RT demo has been benchmarked ahead of its public release. The demo is running purely on compute for now. Once more, Vega cards are at a considerable disadvantage in RT workloads compared to Pascal and Navi (mirroring what happened in World of Tanks). Turing reigns supreme over them all.
- GTX 1080 is 20% faster than Vega 64
- RX 5700XT is 40% faster than Vega 64
- RTX 2060 Super is 15% faster than RX 5700XT
https://wccftech.com/neon-noir-crye...ic-ray-tracing-demo-tested-and-available-now/
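Chaining the listed ratios gives a rough sense of the overall spread (a quick sanity check; only the three quoted ratios come from the benchmark, the chained figures are derived from them):

```python
# Relative Neon Noir performance, normalized to Vega 64 = 1.00.
# The three multipliers are the ratios quoted above; everything else is derived.
vega64 = 1.00
gtx1080 = vega64 * 1.20      # "GTX 1080 is 20% faster than Vega 64"
rx5700xt = vega64 * 1.40     # "RX 5700XT is 40% faster than Vega 64"
rtx2060s = rx5700xt * 1.15   # "RTX 2060 Super is 15% faster than RX 5700XT"

print(f"RTX 2060 Super vs Vega 64: {rtx2060s:.2f}x")            # ~1.61x
print(f"RX 5700XT vs GTX 1080:     {rx5700xt / gtx1080:.2f}x")  # ~1.17x
```

So by these numbers the 2060 Super lands roughly 60% ahead of Vega 64 in this workload, even without its RT cores being used.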
Crytek says that once RTX support is implemented on the Neon Noir demo, the visuals can be boosted even further, all the way up to full-screen (perhaps black bar-free?) 4K rendering. We don't know what sort of performance RTX cards would allow for, but it should be notably better than what the Vega 56 was capable of considering raw throughput alone.
I think the demo is using RTX
https://www.techspot.com/news/80004-performance-details-behind-crytek-rtx-free-neon-noir.html
No, it's not; it says so in the press release. Besides, if it were using the RT cores, the 2060 would be significantly faster than the 5700XT, instead of just 15%.
It's not using RTX.
Remember, the demo is running medium reflections; they are quite laggy and blurry in the demo. Crytek said their implementation will be hardware accelerated through RT cores next year, which will enable them to increase the quality of reflections to Ultra levels.
Hmmm. It seems like they're getting very usable results without needing RTRT hardware.