I think it's more likely RT sees minimal use. I suspect shadows will be the most common application.

Current or previous gen? Current gen will get more DXR-enabled games as time goes on. Previous gen without DXR will fare much better on RDNA2.
> I think it's more likely RT sees minimal use. I suspect shadows will be the most common application.

Ya, there will be the rare game that Nvidia pushes extra RT effects in. Consoles aren't performant enough for devs to devote much RT, IMO.

That would be ideal for AMD, as their DXR solution doesn't really perform well (usably) outside of DXR shadows.

But NVIDIA will push for more... and you can, as some games have already implemented, have various DXR fidelity settings.

This kinda reminds me of the tessellation debate a few years ago... here's hoping we don't get driver toggles that reduce image quality, but that games will implement DXR settings instead, so it is up to the end user.
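For illustration, a tiered DXR setting like that can be as simple as a preset-to-toggles mapping. A minimal C++ sketch, where every name and threshold is invented for the example rather than taken from any real engine:

```cpp
#include <cstdio>

// Hypothetical in-game DXR fidelity tiers; all names are made up
// for illustration, not from any actual engine.
enum class DxrQuality { Off, ShadowsOnly, ShadowsReflections, Full };

struct DxrConfig {
    bool  rtShadows;
    bool  rtReflections;
    bool  rtGI;
    float rayScale;  // fraction of rays cast per pixel, hypothetical knob
};

// Map the user-facing preset to the concrete toggles the renderer consumes.
DxrConfig resolve(DxrQuality q) {
    switch (q) {
        case DxrQuality::Off:                return {false, false, false, 0.0f};
        case DxrQuality::ShadowsOnly:        return {true,  false, false, 0.5f};
        case DxrQuality::ShadowsReflections: return {true,  true,  false, 0.75f};
        case DxrQuality::Full:               return {true,  true,  true,  1.0f};
    }
    return {};
}

int main() {
    DxrConfig c = resolve(DxrQuality::ShadowsOnly);
    std::printf("shadows=%d reflections=%d gi=%d rayScale=%.2f\n",
                c.rtShadows, c.rtReflections, c.rtGI, c.rayScale);
}
```

Consoles would then just ship with the lower tiers as defaults, while the PC version exposes the whole range to the end user.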
I expect that it will be a layered/hybrid approach.

> Ya, there will be the rare game that Nvidia pushes extra RT effects in. Consoles aren't performant enough for devs to devote much RT, IMO.

I think it's very unlikely there will be widespread use of RT in the PC space above and beyond what the console version incorporates. You'll be able to dial up the resolution/ray count with ultra settings, but expecting more is likely setting yourself up for disappointment. 15 years of multiplatform games, with only the super-rare Nvidia-funded GameWorks title having extra tech above the console versions. Game development costs are only going up. There's no reason for this trend to change.

> I expect that it will be a layered/hybrid approach.

So I expect RT to be used on more than just shadows.

So you could benchmark with the settings dialled down to make it a totally apples-to-apples comparison, sure, but I don't think that's fair or useful either.

Unlike HairWorks etc., it's not proprietary, so it is fair to benchmark with it on. So I'm also expecting more work from benchmarkers in the short term, with RT effects on and off.

When is RDNA3 due? I expect RT to be more performant on it.

The thing consoles have done is make it necessary to have a wide range of flexibility and settings in this respect, from the XSS to an RTX 3080.
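If benchmarkers really do run everything with RT effects on and off across that whole hardware range, the extra work is just a cross product of presets and RT toggles. A toy sketch of enumerating those runs, with made-up preset names and nothing from any real benchmark suite:

```cpp
#include <cstdio>

int main() {
    // Hypothetical benchmark matrix: every quality preset crossed with
    // RT enabled/disabled -- the on/off runs described above.
    const char* presets[] = {"Low", "Medium", "High", "Ultra"};
    const bool  rtModes[] = {false, true};

    for (const char* preset : presets)
        for (bool rt : rtModes)
            std::printf("run: preset=%s rt=%s\n", preset, rt ? "on" : "off");
}
```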
> Ya, there will be the rare game that Nvidia pushes extra RT effects in. Consoles aren't performant enough for devs to devote much RT, IMO.

Consoles don't and haven't held back progress for ages, and they won't in regards to the use of RT either. Cyberpunk is Nvidia funded. RDNA2 will just have to use lower settings etc.

I do not hope that consoles will hold back progress for the next 6-7 years.

Cyberpunk 2077 is doing the opposite though... maximum fidelity on the PC... scaled-down fidelity on consoles... the consoles seem "obsolete" at launch this time.

Again, a simple in-game DXR setting would solve the problem of the consoles' poor DXR performance.

I wonder what will be an issue first... 10 vs. 16 GB of GPU RAM... or DXR performance.
> Consoles don't and haven't held back progress for ages.

Not true, The Witcher 3 suffered due to the consoles... they even stated so publicly:

CD Projekt tackles The Witcher 3 downgrade issue head on • Eurogamer.net

I know it is "dangerous" to criticize consoles, but they do not add to progression the way their fans would like them to, IMHO. Quite the opposite, as I see it.
PC-exclusive games exist and look vastly worse than console games. Consoles and their large userbase are what allows so much money to be poured into AAA games and the visuals they deliver. The addressable market with a system more powerful than these new consoles will be minuscule for several years. The Witcher 3 would never have looked like the early footage even if it had been a PC exclusive; it likely would have looked far worse than what it ended up as. No developer is going to design a game around $700+ GPUs. That ended with Crysis.
> I think it's very unlikely there will be widespread use of RT in the PC space above and beyond what the console version incorporates.

We're already seeing RT used for shadows and reflections in cross-gen console games. There's no reason to think both effects won't be widespread in next-gen exclusives.

Also, once you're already doing all the work to build the RT pipeline and BVH for shadows, the incremental work to cast reflection or GI rays shouldn't be that daunting. I fully expect RT to see widespread use on consoles and expanded usage on PC hardware, especially as the PC upgrade cycle rolls on.
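A rough way to picture that "incremental work" argument: the expensive, shared piece is the acceleration structure and its traversal, while each extra effect is mostly just a different batch of rays fed through the same query. A toy CPU-side C++ sketch, with a plain sphere list standing in for a real BVH/TLAS and nothing here being actual DXR API code:

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Toy stand-in for an acceleration structure: a real engine would build a
// BVH/TLAS once per frame, then let every RT effect traverse it.
struct Vec3 { float x, y, z; };
static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Sphere { Vec3 center; float radius; };
struct Ray    { Vec3 origin, dir; };  // dir assumed normalized

// Shared traversal: distance to closest hit, or -1 if none. Shadow,
// reflection, and GI rays would all funnel through this one routine.
float trace(const std::vector<Sphere>& accel, const Ray& r) {
    float closest = -1.0f;
    for (const Sphere& s : accel) {
        Vec3  oc   = sub(r.origin, s.center);
        float b    = dot(oc, r.dir);
        float c    = dot(oc, oc) - s.radius * s.radius;
        float disc = b * b - c;
        if (disc < 0) continue;
        float t = -b - std::sqrt(disc);       // near root of the quadratic
        if (t > 1e-4f && (closest < 0 || t < closest)) closest = t;
    }
    return closest;
}

int main() {
    // "Build" the scene once; both effects below reuse it unchanged.
    std::vector<Sphere> accel = {{{0, 0, 5}, 1.0f}, {{0, 3, 4}, 0.5f}};

    // Shadow ray: only needs a boolean "is anything between point and light".
    Ray  shadowRay = {{0, 0, 4}, {0, 1, 0}};
    bool occluded  = trace(accel, shadowRay) > 0;

    // Reflection ray: needs the closest hit so it can shade what it sees.
    Ray   reflRay = {{0, 0, 0}, {0, 0, 1}};
    float hitT    = trace(accel, reflRay);

    std::printf("shadow occluded=%d, reflection hit t=%.2f\n", occluded, hitT);
}
```

The per-effect code is just different ray generation and hit handling; the structure build and traversal cost is paid once, which is the whole point being made above.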
I expect GI to use other methods, like in Unreal Engine 5 and CryEngine (SVOGI) or Demon's Souls, and to use HW acceleration for non-triangle-based RT. I think most games will use shadows or reflections, or no triangle-based RT at all, and it is logical that AMD did not dedicate the same die area as Nvidia to triangle RT. The reason is simple: performance.

EDIT: Reflections will stay in games set in a city, for example.
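For context on why the voxel route can be cheaper: SVOGI-style GI marches cones through prefiltered voxel mips instead of casting per-triangle rays, so far-field lighting collapses into a few coarse lookups. A heavily simplified sketch of that core loop, using a 1D occupancy line instead of a real 3D voxel grid and entirely invented parameters:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    const int N = 64;                      // mip 0 resolution over [0,1)
    std::vector<std::vector<float>> mips;  // mips[k][i]: average occupancy
    mips.emplace_back(N, 0.0f);
    for (int i = 40; i < 44; ++i) mips[0][i] = 1.0f;  // an occluder

    // Prefilter: each coarser mip averages two finer cells.
    for (int n = N / 2; n >= 1; n /= 2) {
        const auto& fine = mips.back();
        std::vector<float> coarse(n);
        for (int i = 0; i < n; ++i)
            coarse[i] = 0.5f * (fine[2 * i] + fine[2 * i + 1]);
        mips.push_back(std::move(coarse));
    }

    // Cone march: step size and mip level grow with distance, so distant
    // occlusion is a handful of cheap, prefiltered lookups, not many rays.
    const float halfAngleTan = 0.3f;
    float occ = 0.0f, d = 1.0f / N;
    while (d < 1.0f && occ < 0.99f) {
        float diameter = std::max(2.0f * halfAngleTan * d, 1.0f / N);
        int   level    = std::clamp((int)std::log2(diameter * N),
                                    0, (int)mips.size() - 1);
        int   n        = N >> level;
        int   cell     = std::min((int)(d * n), n - 1);
        occ += (1.0f - occ) * mips[level][cell];  // front-to-back blend
        d   += 0.5f * diameter;
    }
    std::printf("accumulated occlusion along cone: %.2f\n", occ);
}
```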
Yeah I’ll believe that when I see it. We’ve been hearing about SVOGI for 8 years now with no uptake. RT has seen much faster adoption in comparison.
A quick scan through also highlights that there wouldn't even be a game without a console version.
A Hitman 3 trailer came out yesterday. Seems IOI and Intel's approach is similar to Nvidia's, and IMO that is a good thing.

We don't even know what the performance of Intel's RT implementation is yet.

> A Hitman 3 trailer came out yesterday. Seems IOI and Intel's approach is similar to Nvidia's, and IMO that is a good thing.

How do you conclude this from a purely gameplay trailer?