> Do consoles feature RTGI in The Witcher 3?

Yes, but it's unclear if the quality is the same. Given how expensive it is on PC, I don't see how it could be.
> Warhammer got an update today. Runs a little bit better. Performance drop with Raytracing is in the normal range for GI and reflections - around 40%.

That makes more sense. If I remember, Control and Cyberpunk are in the 40-45% range with all RT effects at 4K.
> Warhammer got an update today. Runs a little bit better. Performance drop with Raytracing is in the normal range for GI and reflections - around 40%.

That's good to hear. Someone should test the performance impact on a Turing/Ampere GPU (on low settings and high).
> Witcher 3 and Callisto Protocol are in the 65-70% range which is a bit crazy.

Witcher 3 with just RTXGI on is in the over 200% range. From 119 fps (GPU limit) to 28 fps.
> Witcher 3 with just RTXGI on is in the over 200% range. From 119 fps (GPU limit) to 28 fps.

The Witcher 3 doesn't have any kind of advanced GI system, so the performance drop will always be much bigger than in other games: https://imgsli.com/MTQwMzcx
> Is there any noticeable change to the visuals? I wonder if they just reduced the RT quality (ray count/resolution) as opposed to making what they had run better. Any idea if the frame drops improved? I was originally using RT on low for GI and reflections, and it managed to stay around 60 fps with small hordes, but once a big one came the framerate dropped heavily and was basically unplayable.

I don't see a difference. But the RTGI doesn't have that huge an impact anyway.
> Portal RTX:
> 6950XT: 0.4
> 7900XTX: 6
> 3090Ti: 10
> 4080: 30
> 4090: 45

Correct me if I'm wrong, but isn't that jump kind of insane? Not even 1 fps to 6 fps average?! That's more than 10 times the RT performance between architectures.
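Taking the quoted averages at face value, the ratios work out as below (a quick sketch; the fps values and card names are just the ones from the quote above, and the test scene is not stated):

```python
# Ratios implied by the quoted Portal RTX averages (fps values from the quote above).
fps = {"6950XT": 0.4, "7900XTX": 6, "3090Ti": 10, "4080": 30, "4090": 45}

print(fps["7900XTX"] / fps["6950XT"])   # 15.0 -> RDNA 2 to RDNA 3
print(fps["4090"]   / fps["3090Ti"])    # 4.5  -> Ampere to Ada flagship
print(fps["4090"]   / fps["7900XTX"])   # 7.5  -> Ada vs. RDNA 3
```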
> That's good to hear. Someone should test the performance impact on a Turing/Ampere GPU (on low settings and high).

I've got a 3080 and just went and had a quick look. These were done at 1440p with DLSS on Quality (so 1080p render res). With RT off I was getting 120 fps in the hub area, looking across the round room back down towards the spawn area. With RT reflections on low and RTGI on low that dropped to 80ish; with RTGI changed to high it dropped to 70. I didn't get to try reflections set to high as well because the game crashed, and I don't have enough time at the moment to jump back in, try again, and actually get into a mission to see if it's actually playable (it used to drop to the 30s on me).

I did quickly run around the circular room that Alex used to show the stutters in his video and it seemed improved. I saw a couple of frame-time spikes, but they were not like the ones in Alex's video, and I think they were more to do with people loading into the hub. That was with RT reflections on low and GI on low.

I'll try to do a better job later tonight in a mission. I used to get about ~70 fps in the hub with RT low/low before this patch, so there has been an improvement; not sure if it will be enough to keep it playable in a mission for me, though.

> I've got a 3080 and just went and had a quick look. These were done at 1440p with DLSS on Quality (so 1080p render res). With RT off I was getting 120 fps in the hub area...

Thank you for these tests! What I am interested in is the impact just RTXGI has on the GPU, without reflections (preferably RTXGI on low, like you tested!). Also, were the 120 fps without RTXGI in the hub area in a GPU or CPU limit? (If it's a CPU limit, your GPU usage will be below 90-100%.)
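For anyone wanting to log that GPU-vs-CPU-limit check outside an overlay, here is a minimal sketch using the pynvml bindings. It assumes an NVIDIA card and that the nvidia-ml-py package is installed; the 30-sample loop length is an arbitrary choice:

```python
# Minimal GPU-utilization logger using pynvml (pip install nvidia-ml-py).
# Sustained GPU utilization well below ~90-100% while in-game suggests a CPU limit.
import time
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetUtilizationRates)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)      # first GPU in the system
try:
    for _ in range(30):                     # sample once a second for 30 s
        util = nvmlDeviceGetUtilizationRates(handle)
        print(f"GPU: {util.gpu}%  memory controller: {util.memory}%")
        time.sleep(1)
finally:
    nvmlShutdown()
```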
> The Witcher 3 doesn't have any kind of advanced GI system, so the performance drop will always be much bigger than in other games: https://imgsli.com/MTQwMzcx

That's actually a good point. In a game without a dynamic GI system, the performance impact of adding RTXGI will be more drastic than in a game that already has a demanding GI system in place, if RTXGI replaces it entirely. Either way, it still runs super poorly with RTXGI, even without considering how the game ran before turning RTXGI on.
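Rough back-of-the-envelope math on that point (a sketch: only the 119 -> 28 fps figures come from this thread; the 60 fps baseline and the 5 ms raster-GI pass of the hypothetical comparison game are made-up illustrative values, and the RTXGI cost is assumed to carry over unchanged):

```python
# Why replacing a cheap GI system hurts more, in frame-time terms.
# Only the 119 -> 28 fps figures come from the thread; the rest are illustrative guesses.
witcher_base_ms = 1000 / 119                       # ~8.4 ms with the cheap stock GI
witcher_rt_ms   = 1000 / 28                        # ~35.7 ms with RTXGI on
rtxgi_cost_ms   = witcher_rt_ms - witcher_base_ms  # ~27.3 ms added by RTXGI

other_base_ms = 1000 / 60                          # hypothetical game running at 60 fps
other_rt_ms   = other_base_ms - 5 + rtxgi_cost_ms  # RTXGI replaces a 5 ms GI pass

print(f"Witcher 3 drop:    {(1 - witcher_base_ms / witcher_rt_ms) * 100:.0f}%")  # ~76%
print(f"Hypothetical drop: {(1 - other_base_ms / other_rt_ms) * 100:.0f}%")      # ~57%
```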
> Thank you for these tests! What I am interested in is the impact just RTXGI has on the GPU, without reflections (preferably RTXGI on low, like you tested!). Also, were the 120 fps without RTXGI in the hub area in a GPU or CPU limit? (If it's a CPU limit, your GPU usage will be below 90-100%.)

I actually didn't pay attention to the GPU or CPU usage; it was sitting in the overlay, but I was kinda rushing. I should have time in a few hours to do a better job. I'll do it without RT reflections, make note of what the rest of my settings are, and get some framerates from an actual mission. I will also test the hub area with no/low/high GI so you can see just the RTGI performance impact.
> Correct me if I'm wrong, but isn't that jump kind of insane? Not even 1 fps to 6 fps average?! That's more than 10 times the RT performance between architectures.

Portal performance is dictated mostly by registers, not RT performance, at least on AMD, if the early estimates are to be believed.
> Correct me if I'm wrong, but isn't that jump kind of insane? Not even 1 fps to 6 fps average?! That's more than 10 times the RT performance between architectures.

It's insane indeed, and not only for AMD but for all architectures. The 4090 is 4.5x the 3090Ti, while the 3090Ti itself is ~2x the 2080Ti. I think it's natural the 7900XTX is the way it is here. And according to PCGH and their usual analysis of RT heavy hitters, the 7900XTX did reach almost 2x over the 6950XT in two titles (Metro and Dying Light 2). Here is their full analysis.
PurePC.pl has tested the Witcher 3 NG - https://www.purepc.pl/wiedzmin-3-dz...wymagania-sprzetowe-z-ray-tracingiem?page=0,7:
At 1440p the 4090 is twice as fast as the 7900XTX, and the 4080 is 46% faster. At 1080p the 7900XTX is 42% faster than the 6900XT, which is less than its performance increase with Uber+ settings and without raytracing (57%).
> Witcher 3 with just RTXGI on is in the over 200% range. From 119 fps (GPU limit) to 28 fps.

That's a 76% performance drop, though. A 200% drop would result in a negative frame rate lol.
> That's a 76% performance drop, though. A 200% drop would result in a negative frame rate lol.

Yeah, it's a stupid world we live in where even manufacturers regularly use wrong terms like that, like "power consumption dropped 2.5 times lower". No, it didn't. It would be producing power.
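For anyone who wants the two conventions side by side, a minimal sketch using the 119 -> 28 fps numbers from earlier in the thread:

```python
# Two ways of expressing the same 119 -> 28 fps result.
def drop_percent(fps_off: float, fps_on: float) -> float:
    """Drop as a percentage of the original frame rate; can never exceed 100%."""
    return (fps_off - fps_on) / fps_off * 100

def slowdown_factor(fps_off: float, fps_on: float) -> float:
    """How many times slower it got; unbounded, unlike a percentage."""
    return fps_off / fps_on

print(f"{drop_percent(119, 28):.1f}% drop")       # 76.5% drop
print(f"{slowdown_factor(119, 28):.2f}x slower")  # 4.25x slower -- likely what "over 200%" meant
```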
> Correct me if I'm wrong, but isn't that jump kind of insane? Not even 1 fps to 6 fps average?! That's more than 10 times the RT performance between architectures.

When the fps is that low, it doesn't mean much. The most shocking part is the 3090 Ti at 10 fps and the 4090 at 45.