GPU Ray Tracing Performance Comparisons [2021-2022]

Not officially, just YouTube BF streamers reporting their performance. Here's one.


Lol, AMD 3700X and AMD 3950X. This is what happens when review sites do crappy GPU-limited testing in their CPU reviews. These CPUs lose to the top end by a landslide. I have a 3600X. It's a bad gaming CPU, period. They're great productivity CPUs that can also game. If you're just casual and only care about getting 60 fps, you're fine. If you're looking for high frame rates, especially in modern games, you probably bought the wrong CPU, as I did (technically I bought a 1700, and the 3600X was just one I could use with my motherboard). Many games become CPU-limited by a single thread, or they have difficult memory access patterns where Zen 2's memory latency just kills performance and leads to high variability. The best you can do is highly tune your RAM to try to get latency down, or upgrade to a Zen 3 CPU if you can. Zen 3 was AMD's first really good gaming CPU.

You'll get fooled by sites that benchmark using games that are easy to test and highly repeatable rather than the ones people actually play, which can be demanding on CPUs (Fortnite, Warzone, Apex, Valheim).

Like, this guy turns on DLSS and wonders why fps doesn't go up... because it's the CPU, probably paired with 3200 RAM. Lol. Also, RT is DOA because he dropped to 50 fps. Again, RT has large CPU overheads and he's already CPU-limited.

This game is going to be just plain brutal on CPUs. It's 128 players with vehicles, deforming terrain, destruction, and all kinds of weird physics from weather, explosions, abilities, etc. Welcome to being disappointed by old CPUs that were overhyped as gaming CPUs.
 
He isn't completely CPU-limited, as his performance with his 2080 Ti demonstrates. A 3950X is still a very capable gaming CPU. Just because the absolute best CPU money can buy is some 20-30% faster doesn't negate that. Many other BF streamers are all reporting the same thing across a wide range of CPUs, some better, some worse.
 

Zen 2 just has big disadvantages for gaming. Clock speed isn't so much the issue as memory latency. The new consoles will not run BF 2042 at 120 fps, even at lower resolutions, because it's CPU-limited by nature of being 128 players, etc. The CPU hit is not for nothing. What you're going to find is that CPUs with big architectural weaknesses for gaming will show it more. Unfortunately for this guy, his 3090 isn't going to help him if he pairs it with a poor gaming CPU. I'm in the same boat: I have a 3600X and a 3080. It's not a good combo.
 
Is a 5950X a good enough CPU?

 
Many games become CPU-limited by a single thread, or they have difficult memory access patterns where Zen 2's memory latency just kills performance and leads to high variability. The best you can do is highly tune your RAM to try to get latency down, or upgrade to a Zen 3 CPU if you can. Zen 3 was AMD's first really good gaming CPU.
I would assume that, given the hardware in the consoles, if the above are real issues, they're things that need to be optimized for, even on PC. It would be odd for developers to ship problematic memory access patterns and single-thread bottlenecks knowing full well the consoles are Zen 2 machines with significantly more memory latency than a PC would have.

It may just be a case of waiting for release code before making further observations here. Optimization happens in the last leg of the race for most developers; being content-complete is the most critical milestone.
 

The threading model on the consoles, from what I understand, is different. Many games still use a threading model where one "long" thread can bottleneck a frame. Some games, like Doom Eternal and Naughty Dog titles, don't have a main thread and try to distribute work across all cores. I know Doom Eternal did that at the expense of cache performance, which could be a huge tradeoff in a game like Battlefield 2042. It's still VERY common to see games fail to scale across cores or take particularly good advantage of multi-threading. It's not an easy problem to solve.
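To make that contrast concrete, here's a minimal illustrative sketch (not actual engine code from DICE or id) of the two models: a serial "main thread" frame, where one long system stalls everything, versus jobs fanned out across cores so the frame only waits on the slowest one.

```cpp
// Minimal sketch contrasting a serial "main thread" frame with the same
// work spread across cores via std::async. Illustrative only.
#include <cstdio>
#include <future>
#include <vector>

// Stand-in for per-system frame work (physics, animation, AI, ...).
static long simulate(long items) {
    long acc = 0;
    for (long i = 0; i < items; ++i) acc += i % 7;
    return acc;
}

int main() {
    // One system is 4x heavier than the others.
    const std::vector<long> systems = {5'000'000, 20'000'000, 5'000'000, 5'000'000};

    // "Main thread" model: the one long system bottlenecks the whole frame.
    long serial = 0;
    for (long s : systems) serial += simulate(s);

    // Job model: each system is a task; the frame ends when the slowest
    // task ends, so other cores hide all but the longest job.
    std::vector<std::future<long>> jobs;
    for (long s : systems)
        jobs.emplace_back(std::async(std::launch::async, simulate, s));
    long parallel = 0;
    for (auto& j : jobs) parallel += j.get();

    std::printf("serial=%ld parallel=%ld\n", serial, parallel);
}
```

The catch, as noted above, is that fanning jobs out this way can scatter data access across cores and hurt cache hit rates.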

In terms of the CPUs in the consoles themselves, they may have done some work to address memory latency. Zen 3 is much better than Zen 2 in that regard.

Ultimately we won't know until people look at the game in depth. I'm not ready to say it's horribly optimized until people look into it more. I can't think of a single game that has the same scale in terms of players, vehicles, physics etc. I'm not sure what we'd compare it against to know whether performance is good or bad.
 
Is a 5950X a good enough CPU?


This is definitely more concerning. The 5950X is very new and traded blows with the 10900K until Alder Lake came out. The 3700X and 3950X are probably more in line with an 8700K or 9600K in CPU-limited games.

Is it possible the CPU demands are about double Battlefield V's, with larger maps, double the player count, more physics and animation, more complex sound, etc.?

[Chart: Battlefield V CPU benchmarks at 1280x720]



Edit: I'll stress that these benchmarks are average fps, and we don't know the test scene (probably single-player).

My experience with the 3600X is that it's great until it isn't. The lows are lower than my friends' on older Intel CPUs. We all have the same GPU, but they're running Intel and I'm running a 3600X. In games like Control, there are certain rooms I'll navigate where my frame rate just tanks because of my CPU. Same with Apex, same with Warzone, etc. Generally our average fps is the same, but my worst case is worse. I expect that difference to grow as newer, more demanding games come out.
 
This is perhaps one of the worst takes I've had the displeasure of coming across on this forum. I really dislike engaging in debates with people who only operate at the extremes, but this take is so bad that I couldn't ignore it. The Zen 2 lineup is made up of good gaming CPUs, period. Are they the best gaming CPUs? Nope, but they deliver above-mid-tier gaming performance. First of all, this game looks worse than both BFV and BF1 while delivering almost no destruction and worse optimization. When you look at the frequency of destruction during a match, you experience destruction more often in COD Vanguard than you do in 2042. It's, imo, a very mediocre Battlefield game, and on the technical side it's very poor: one of the worst deployments of Frostbite in recent history. Yes, they're pushing for 128 players, but that's hardly an excuse. Warzone runs on the PS4 with 100 players. They have vehicles? Yeah, Warzone has vehicles as well. Even MAG on the PS3 had 256 players. Don't bother mentioning destruction, because it's basically non-existent. The issue is not SkillUp's 3950X or 3700X, it's the poor work by DICE. Most of the veterans left the team a long time ago, and the lead for 2042, I believe, is a former COD dev. It's just a poor product technically.
 
The frame rates are roughly a third of those in BFV multiplayer on the same hardware. The maps are larger and the player count has doubled, yes, but destruction and visuals have seen notable cutbacks. So far it's just looking like a poorly developed game. More demanding RT features are the last thing it needs.
 

Cool, thanks.

Warzone will get you maybe 140 fps average on a 3700X? What do you think the worst-case areas of the map are going to give you with a cluster of players? That would be fully CPU-limited at 1080p low or something like that. My 3600X with tuned memory will hover in the 100-120 range if I'm playing one of those deathmatch-type modes where there are a lot of players in the same area. Is it really crazy to think a 3700X could get 80 fps in Battlefield 2042 without ray tracing? I'm assuming these videos are showing the worst case. I've seen people generally running around at 120 fps in Battlefield on similar CPUs, but worst-case scenarios dropping into the 80s or 90s don't seem crazy to me. And if ray tracing costs you another 1 or 2 ms on top of that, you're not going to be getting the best performance.
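For the frame-time math behind that last sentence (80 fps and 2 ms are the hypothetical numbers from above), a fixed millisecond cost hurts more the higher your frame rate is:

```cpp
// Quick frame-time arithmetic: fps is just 1000 ms divided by the frame
// time, so a fixed per-frame CPU cost eats a bigger share at high fps.
#include <cstdio>

int main() {
    const double base_fps   = 80.0;              // hypothetical CPU-limited fps
    const double rt_cost_ms = 2.0;               // hypothetical RT CPU cost
    const double base_ms    = 1000.0 / base_fps; // 12.5 ms per frame
    const double new_fps    = 1000.0 / (base_ms + rt_cost_ms);
    std::printf("%.0f fps + %.1f ms -> %.1f fps\n", base_fps, rt_cost_ms, new_fps);
    // 80 fps (12.5 ms) + 2 ms = 14.5 ms -> ~69 fps
}
```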
 
70-80 fps is the average in 2042 at 1440p, with drops into the 50s on the best GPUs and CPUs money can buy. That's without enabling RTAO.
 
I'm still on a Zen 2 (3900X) and I think it's quite a capable CPU for gaming. It's just that Zen 3 is a whole lot of an improvement for gaming.
What scott_arm is saying I have seen in quite a few places (forums, analyses, etc.). I'm not going to upgrade for this, but had I known it I would have waited a bit and gone for Zen 3.
 
Just took a quick look. On the map called Renewal I was getting 80-100 fps while CPU-limited at ultra settings with DLSS balanced. It really depended on the density of the area. Turning on RTAO dropped me to 60 fps with dips into the 50s. I'll take more of a look when my friends are on to play; I'm not wasting my 10-hour trial solo.

Ryzen 3600X with pretty highly tuned DDR4-3600 and an RTX 3080. I don't have dual-rank memory, so there's probably some fps to be gained there.

Ray tracing, from a cursory glance, is costing me 4-6 ms on the CPU.
 
I know Doom Eternal did that at the expense of cache performance, which could be a huge tradeoff in a game like Battlefield 2042
I didn't realize that the job system for either would impact cache hits. I thought there was a shared L3 over Infinity Fabric for this purpose? Then again, I suppose it depends on how you send out jobs.

I can't think of a single game that has the same scale in terms of players, vehicles, physics etc
I was under the assumption that the majority of this should be handled by the server, with the CPU responsible for rendering everything being transmitted. It seems a good fit for AVX instructions, handling so many at once, if this is true. You're given back a matrix of vectors for all the items and players in the game for the next frame and you just add it to everything.
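For what it's worth, here's a toy sketch of the kind of AVX batch update being described (purely illustrative; assumes flat float arrays and an AVX2-capable CPU, compiled with -mavx2):

```cpp
// Toy AVX2 sketch of batch-updating entity positions (pos += vel * dt),
// eight floats per instruction. Assumes AVX2 hardware; illustrative only.
#include <immintrin.h>
#include <cstdio>

void update_positions(float* pos, const float* vel, float dt, int n) {
    const __m256 vdt = _mm256_set1_ps(dt);
    int i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 p = _mm256_loadu_ps(pos + i);
        __m256 v = _mm256_loadu_ps(vel + i);
        p = _mm256_add_ps(p, _mm256_mul_ps(v, vdt));
        _mm256_storeu_ps(pos + i, p);
    }
    for (; i < n; ++i) pos[i] += vel[i] * dt;  // scalar tail
}

int main() {
    float pos[16] = {0}, vel[16];
    for (int i = 0; i < 16; ++i) vel[i] = 1.0f + i;
    update_positions(pos, vel, 0.016f, 16);
    std::printf("pos[15] = %f\n", pos[15]);  // 16 * 0.016 = 0.256
}
```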
 
I didn't realize that the job system for either would impact cache hits. I thought there was a shared L3 over Infinity Fabric for this purpose? Then again, I suppose it depends on how you send out jobs.

I don't think id Software has really explained their threading model yet. I just know they don't have a main thread. I remember a question being asked about how it works, and all they said was that they decided it was best to keep all cores busy with work rather than optimizing for cache hit rate, but they used data-oriented design as much as possible.

I was under the assumption that the majority of this should be handled by the server, with the CPU responsible for rendering everything being transmitted. It seems a good fit for AVX instructions, handling so many at once, if this is true. You're given back a matrix of vectors for all the items and players in the game for the next frame and you just add it to everything.

I believe Battlefield V was a pretty good overclock check because it used AVX heavily, and if your overclock was unstable you'd find out fast. I'm sure anything critical to gameplay fairness is done on the server, but there's a ton of animation and physics particles that are just cosmetic and are done on the client. Trying to synchronize every swaying branch, spark, ragdolled body, or dirt particle doesn't make a lot of sense. Ultimately I don't know what's done on the client CPU vs. the server or the GPU. I'm sure there might be some info out there, but I'm not super inclined to look it up.
 
I believe Battlefield V was a pretty good overclock check because it used AVX heavily, and if your overclock was unstable you'd find out fast
So is the issue possibly that AVX2 performance on Zen 2 is very harsh, resulting in a massive downclock?
Or, better put, did some gamers find a way to disable AVX on their CPUs (in the BIOS), resulting in poor performance on PC?

In particular, if your PC is crashing under AVX2 loads, the power supply is likely inadequate.
 

I think the issue with Zen 2 is that there's a limit to how much L3 each CCX can access directly, so any core can hit half the L3 very fast and reaches the other half only "slowly" over the fabric. And in the end, latency to RAM through the cache hierarchy is relatively high. I don't think there are any particular issues with AVX itself. Memory latency is a huge waste of clock cycles.
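As a sketch of why that latency is so expensive: a dependent pointer chase pays close to the full load-to-use latency on every step once the working set spills out of L3 (illustrative, not a rigorous benchmark):

```cpp
// Illustrative pointer-chase microbenchmark: every load depends on the
// previous one, so each step stalls for the full memory latency once the
// array no longer fits in L3. Not rigorous; numbers vary by machine.
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

int main() {
    const size_t n = size_t{1} << 23;   // 8M entries = 64 MB, larger than L3
    std::vector<size_t> next(n);
    std::iota(next.begin(), next.end(), size_t{0});

    // Sattolo's algorithm: one full n-cycle, so the chase never short-loops.
    std::mt19937_64 rng{42};
    for (size_t i = n - 1; i > 0; --i) {
        std::uniform_int_distribution<size_t> pick(0, i - 1);
        std::swap(next[i], next[pick(rng)]);
    }

    auto t0 = std::chrono::steady_clock::now();
    size_t idx = 0;
    for (size_t i = 0; i < n; ++i) idx = next[idx];   // serialized loads
    auto t1 = std::chrono::steady_clock::now();

    double ns = std::chrono::duration<double, std::nano>(t1 - t0).count();
    std::printf("~%.1f ns per dependent load (end=%zu)\n", ns / n, idx);
}
```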

On top of cross-CCX latency being a huge bummer, the Windows scheduler doesn't help. In Windows you have many threads in flight, and threads can become blocked and then active again. If you have threads that need to coordinate, or you simply block and then reactivate, the data you need could suddenly be across that CCX boundary, where you pay a higher latency penalty (because your thread may resume on a different core). This is why a 3950X or 3900X has gaming performance like an 8700K, or maybe even an 8600K sometimes. In productivity apps where cache behavior is highly optimized, Zen 2 will trounce similar Intel CPUs. But in highly dynamic workloads like games, where data demands change on a ms or ns timescale, it's not ideal.
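One mitigation people experiment with (a hedged sketch of the general technique, not something BF2042 or Windows does for you): pin a hot thread to one CCX's logical processors so its working set stays in that CCX's L3 slice. The 0x3F mask assumes logical processors 0-5 map to one CCX, which varies by SKU and SMT layout.

```cpp
// Hedged, Windows-only sketch: pin a worker thread to an assumed CCX
// (logical processors 0-5) so the scheduler can't migrate it across the
// CCX boundary. The core-to-CCX mapping here is an assumption.
#include <windows.h>
#include <cstdio>
#include <thread>

int main() {
    std::thread worker([] {
        std::printf("worker running with constrained affinity\n");
        // ... latency-sensitive work that benefits from stable locality ...
    });
    const DWORD_PTR one_ccx = 0x3F;  // bits 0-5 -> logical processors 0-5
    SetThreadAffinityMask(static_cast<HANDLE>(worker.native_handle()), one_ccx);
    worker.join();
}
```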

Edit: I'd also like to mention that with a game like BF2042 it can be hard to know how well optimized it is. Multi-threading is a non-trivial, unsolved problem. What Unity is trying to do with DOTS has a lot of naysayers who rightly point out that not everything is trivially parallelizable. Battlefield probably has a lot of legacy code that makes moving to a pure ECS, data-oriented design difficult. I imagine it uses ECS memory layouts selectively. Unreal Engine would be a similar case.
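To illustrate what an ECS-style memory layout buys (names and layout are illustrative, not Unity DOTS or Frostbite internals): an array-of-structs update drags cold fields through the cache, while a struct-of-arrays layout keeps the hot loop on contiguous, tightly packed data.

```cpp
// Minimal AoS vs SoA sketch of the data-oriented layout being described.
#include <cstdint>
#include <cstdio>
#include <vector>

// AoS: updating positions also streams health/state/name bytes into cache.
// (Shown for contrast; not used below.)
struct EntityAoS {
    float    x, y, z;
    float    health;
    uint32_t state;
    char     name[48];
};

// SoA / ECS-style: each component is its own tightly packed array.
struct World {
    std::vector<float> x, y, z;     // position component
    std::vector<float> vx, vy, vz;  // velocity component
};

void integrate(World& w, float dt) {
    for (size_t i = 0; i < w.x.size(); ++i) {  // contiguous reads/writes only
        w.x[i] += w.vx[i] * dt;
        w.y[i] += w.vy[i] * dt;
        w.z[i] += w.vz[i] * dt;
    }
}

int main() {
    World w;
    for (int i = 0; i < 1000; ++i) {
        w.x.push_back(0);  w.y.push_back(0);  w.z.push_back(0);
        w.vx.push_back(1); w.vy.push_back(2); w.vz.push_back(3);
    }
    integrate(w, 0.016f);
    std::printf("entity 0 at (%.3f, %.3f, %.3f)\n", w.x[0], w.y[0], w.z[0]);
}
```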
 
Zen(2) doesn't underclock itself due to AVX like Intel does. It might boost lower than it would under other loads, but Intel is the only one that actually has separate clock offsets for AVX loads.
 