Value of Hardware Unboxed benchmarking *spawn

<snip> Particularly because an added complication is that future games may leverage/stress the different pros/cons of various architectures.
I mean, if we're going to dig to the bottom of that particular pedanticism hole, we could just say "futureproofing doesn't exist", because there's literally no way for us to truly know what the next interesting technology will depend upon. The next big thing might somehow use all those int4 operations in NPUs to accelerate, I dunno, game AI? That seems the most apt, right? So if you didn't buy an NPU, then your fancy-schmancy new CPU is basically half speed now, right?

However, to get back to the real world, we can make inferences about what's likely to continue happening in the near future. And that future is going to keep bringing faster GPUs in both raster and raytracing, which will need to be fed by faster CPUs. Which is actually a great segue to:
Does the 5800X3D increasingly pull ahead of the 5800X in 2024 games with an RTX 4090 at 1440p/4K, as reflected by 720p/1080p RTX 3090 tests in 2022?
It does! Check out the GN review of the 9800X3D, where they include prior X3D parts and their non-X3D brethren:

I linked specifically to the Stellaris portion of the review, because simulation time actually matters in that game and is largely independent of GPU power. And in Stellaris, the 5800X3D variant is 13% faster than the 5800X, despite the X3D's 400MHz base / 200MHz turbo clock disadvantage.

Even at 1440p or 4K, there will be scenes in modern games where even the fastest GPUs will still be waiting on the CPU to send data. And in those scenarios, a faster CPU, such as the X3D line, will matter. It probably isn't going to increase the maximum framerate; it will instead drag up the averages at the low end.
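As a side note on how a figure like that 13% falls out of a simulation-time benchmark: these tests report seconds to simulate a fixed span of in-game years, so "faster" is just the ratio of the two times. A minimal sketch with made-up placeholder times (not GN's actual numbers):

```python
# Hypothetical simulation-time comparison (placeholder numbers, not GN's data).
# Lower time is better; "X% faster" is the slower time over the faster time, minus 1.
t_5800x = 34.0    # seconds to simulate the test span on the 5800X (hypothetical)
t_5800x3d = 30.0  # seconds on the 5800X3D (hypothetical)

speedup = t_5800x / t_5800x3d - 1
print(f"5800X3D is {speedup:.1%} faster")  # ~13.3% with these placeholder times
```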
 
I mean, if we're going to dig to the bottom of that particular pedanticism hole, we could just say "futureproofing doesn't exist", because there's literally no way for us to truly know what the next interesting technology will depend upon. The next big thing might somehow use all those int4 operations in NPUs to accelerate, I dunno, game AI? That seems the most apt, right? So if you didn't buy an NPU, then your fancy-schmancy new CPU is basically half speed now, right?

However, to get back to the real world, we can make inferences about what's likely to continue happening in the near future. And that future is going to keep bringing faster GPUs in both raster and raytracing, which will need to be fed by faster CPUs. Which is actually a great segue to:

We can just focus on the specific scenario mentioned without going into a broader discussion, though. Is there enough data to conclusively say that running multiplatform, graphics-driven SP games at 720p/1080p today translates into meaningful real-world differences for future games at 1440p/4K?

What specific games of that type, in real-world usage terms, actually play that much better on the 5800X3D vs the 5800X at 1440p/4K?

It does! Check out the GN review of the 9800X3D, where they include prior X3D parts and their non-X3D brethren:

I linked specifically to the Stellaris portion of the review, because simulation time actually matters in that game and is largely independent of GPU power. And in Stellaris, the 5800X3D variant is 13% faster than the 5800X, despite the X3D's 400MHz base / 200MHz turbo clock disadvantage.

Even at 1440p or 4K, there will be scenes in modern games where even the fastest GPUs will still be waiting on the CPU to send data. And in those scenarios, a faster CPU, such as the X3D line, will matter. It probably isn't going to increase the maximum framerate; it will instead drag up the averages at the low end.

I don't think you're referring to the same thing. What I'm referring to was the comment that running the predominantly multiplatform, graphics-driven SP games used in GPU reviews at 720p/1080p would showcase how future similar titles would make use of those faster CPUs at resolutions like 1440p/4K with high (or maxed-out) graphics. That is, for example, that the 5800X3D would play, say, Alan Wake 2 much better than a 5800X because it ran Far Cry 5 (2018) something like 30% faster at 720p in TechPowerUp's 2022 test. However, TPU's 2024 9800X3D review instead showed them at basically the same fps using a 4090 at 1440p.

Using Stellaris, if anything, would further my last point that you can already test real usage scenarios to illustrate CPU differences. Stellaris is an old game at this point. You did not need to bench those multiplat, graphics-driven SP games at 720p to show how CPUs would make a difference in Stellaris in 2024 at 1440p/4K; you could've just tested Stellaris at 1440p/4K from the start. Games like Stellaris exist, so you can just use those. Using GPU-driven games from your GPU test suite at 720p as a stand-in instead seems like a strange choice.
 
I'm not sure what you're getting at; I was answering your direct question here:
Does the 5800X3D increasingly pull ahead of the 5800X in 2024 games with an RTX 4090 at 1440p/4K, as reflected by 720p/1080p RTX 3090 tests in 2022?

And I believe we showed that yes, the 5800X3D is pulling ahead of the 5800X in resolution-bound cases. So, if that's the answer you wanted, then... Cool I suppose? I hadn't picked a "side" either way, I thought it was an interesting question.
 
That is, for example, that the 5800X3D would play, say, Alan Wake 2 much better than a 5800X because it ran Far Cry 5 (2018) something like 30% faster at 720p
This is not the point though. The matter is that the 5800X3D would feed the future RTX 5090 and RTX 6090 with 30% more fps than the 5800X even at 4K resolution.

Why? Because Far Cry 5 at 4K for the future RTX 6090 is like 1080p for the current RTX 4090.
 
This is not the point though. The matter is that the 5800X3D would feed the future RTX 5090 and RTX 6090 with 30% more fps than the 5800X even at 4K resolution.

Why? Because Far Cry 5 at 4K for the future RTX 6090 is like 1080p for the current RTX 4090.

Not the point I was referring to though, which was this one -

I think everyone understands that lower resolutions are needed to show the raw impact of faster CPUs. The disconnect seems to be that people want to know if upgrading their CPU today will improve frame rates, and Steve is claiming he wants to show the real performance differences because the faster CPU will provide benefits in the future on more CPU-demanding games.

The thing is, those future, more demanding graphics-driven SP games are also more demanding on the GPU. Now, TPU didn't retest everything with the 5800X in their 9800X3D review, but the 5900X is close enough, if not even slower at gaming in some cases. Going with the 1% lows in the 5800X3D and 9800X3D reviews, let's look at how the 5800X3D stacks up against the non-X3D counterparts -

Far Cry 5 (2018) in their 5800X3D review at 720p - 161 fps vs 120.5 fps (5800X), a difference of 33.7%. The largest in the test suite, on an RTX 3080.

So we're saying that this can forecast real-usage performance in demanding future games at 4K with future GPUs?

5800X3D vs 5900X with an RTX 4090

Alan Wake 2 at 4K - 63.5 vs 63.7
Alan Wake 2 at 4K with RT - 40.2 vs 39.8



So yes, we've already established that benching at low resolutions shifts the load onto the CPU (with some caveats), which in turn exposes more of the CPU differences. What we haven't really established is how much relevance that has for real-world usage, i.e. for forecasting future games.
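To put the gap being described into numbers, here's a quick back-of-envelope calculation using only the figures quoted above (TPU's 1% lows); just a sketch of the argument, not the full dataset:

```python
# Relative fps differences from the figures quoted in this post (1% lows).
def delta(a, b):
    """Percentage advantage of a over b."""
    return (a / b - 1) * 100

# 720p, RTX 3080 (TPU 5800X3D review): Far Cry 5, 5800X3D vs 5800X
print(f"Far Cry 5, 720p:      {delta(161.0, 120.5):+.1f}%")  # ~+33.6%

# 4K, RTX 4090 (TPU 9800X3D review): Alan Wake 2, 5800X3D vs 5900X
print(f"Alan Wake 2, 4K:      {delta(63.5, 63.7):+.1f}%")    # ~-0.3% (parity)
print(f"Alan Wake 2, 4K + RT: {delta(40.2, 39.8):+.1f}%")    # ~+1.0% (parity)
```

The ~34% gap at 720p simply doesn't show up once the 4K GPU bottleneck takes over in that title.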
 
I'm not sure what you're getting at; I was answering your direct question here:


And I believe we showed that yes, the 5800X3D is pulling ahead of the 5800X in resolution-bound cases. So, if that's the answer you wanted, then... Cool I suppose? I hadn't picked a "side" either way, I thought it was an interesting question.

I'm not seeing how, though? Your example was Stellaris, a game released in 2016. Stellaris is a CPU-demanding game, but it was already a CPU-demanding game in 2022 when the 5800X3D launched, and back when the game itself launched as well. If you wanted to show the real-world differences between CPUs, you could've just tested Stellaris, is what I'm saying. You did not need to test some sort of graphics-driven SP game at 720p.

Because, again, the proposal is that graphics-driven SP games are tested at 720p because that's the most suitable predictor for future games of that type at real-usage resolutions and settings. It's not about CPU-bound games like Stellaris; you could just test those directly, as they already exist.
 
I'm not seeing how, though? Your example was Stellaris, a game released in 2016. Stellaris is a CPU-demanding game, but it was already a CPU-demanding game in 2022 when the 5800X3D launched, and back when the game itself launched as well. If you wanted to show the real-world differences between CPUs, you could've just tested Stellaris, is what I'm saying. You did not need to test some sort of graphics-driven SP game at 720p.
Whether the game is new or old doesn't really matter; it's the CPU consumption that drives the result. I'm going to point to 40+ years of gaming benchmarks and analysis which clearly demonstrate that newer games use more CPU than older games. Do you disagree with this assertion? Do you believe newer games somehow aren't using more CPU than older games? Do you have some sort of example data to demonstrate this?

I get you don't like my example, yet it's a valid example nonetheless. If you don't like my example, find one of your own :)
 
Steve did a little reality check on how the 5800X3D compares to the 5800X in modern games at the end of his video a few days ago.

[Attached screenshot: benchmark chart from the video]

Seems about right. Obviously the 5800X3D dominates all other AM4 stuff.
 
Whether the game is new or old doesn't really matter; it's the CPU consumption that drives the result. I'm going to point to 40+ years of gaming benchmarks and analysis which clearly demonstrate that newer games use more CPU than older games. Do you disagree with this assertion? Do you believe newer games somehow aren't using more CPU than older games? Do you have some sort of example data to demonstrate this?

I get you don't like my example, yet it's a valid example nonetheless. If you don't like my example, find one of your own :)

The CPU isn't operating in isolation. Newer games of the type we're talking about use more CPU, but they also use proportionally more GPU. That means the limiting factor always shifts back to the GPU, especially at 4K.

As for an example, I posted one specifically with Alan Wake 2. Suddenly all the CPUs perform at near parity in the real world at 4K. If you bought a 5800X3D over a 5800X hoping to play future games like Alan Wake 2 much faster because of their CPU demands at 4K, did you benefit from that? Not based on the data I posted from TechPowerUp's test set. You'll also find that's the case for other games of that type as well.

I still think you're avoiding the real-world usage angle here. The 5800X3D does show real-world differences in genuinely CPU-heavy games like Stellaris, but that's independent of the age of the game. What we are asking is whether testing existing (and older) games at 720p forecasts real-world usage differences for future games at 4K (or 1440p), as per the original line I quoted.
 

Pretty good video; he also tested on medium or low raster settings at various RT quality levels, so he gave it a fair chance.

There are a couple of games that run well, but other than that, RT is not viable anymore on this kind of hardware.
 

Pretty good video; he also tested on medium or low raster settings at various RT quality levels, so he gave it a fair chance.

There are a couple of games that run well, but other than that, RT is not viable anymore on this kind of hardware.

Some crazy mental gymnastics going on in that video to try and justify that GPU not being viable for RT use. The 2060 offers roughly premium-console-level RT capability, so at a minimum it should be able to offer a roughly console-level RT experience, which should absolutely be considered viable - and as far as I can see from this video, it does.

Dismissing 30fps as unplayable, despite that being the way pretty much every current-gen console gamer will be playing RT-enabled games in the vast majority of cases, is bizarre. And restricting the absolute bottom-end RT-capable product to Quality upscaling settings without looking at DLSS Performance is equally bizarre. Yes, this is at 1080p so you'd be talking about a 540p base resolution, but with DLSS that's still better image quality than you'll get in many console RT titles using FSR with a higher base resolution, and given the age and performance tier of this GPU, expecting some image quality compromises if you want to use RT should be a given.
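For reference, that 540p figure is just the DLSS render-scale arithmetic. A quick sketch assuming the commonly cited per-axis scale factors (exact values can vary slightly per title):

```python
# Internal render resolution for DLSS modes at a 1080p output, assuming the
# commonly cited per-axis scale factors (these can vary slightly per title).
SCALE = {
    "Quality":           2 / 3,   # ~0.667x per axis
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

out_w, out_h = 1920, 1080
for mode, s in SCALE.items():
    print(f"{mode:<18} -> {round(out_w * s)}x{round(out_h * s)}")
# Quality            -> 1280x720
# Performance        -> 960x540   (the "540p base resolution" mentioned above)
```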

The video is happy to set games to their lowest quality preset (which generally looks horrible) but doesn't use DLSS Performance mode, which to many would have a much smaller visual impact :/

Yes, you can potentially claim that the performance hit and associated compromises to other graphical settings might make activating RT not worth it, but that's a user choice and totally different to saying RT isn't viable. Plus, the difference in graphics settings, resolution and/or framerate from turning RT on on a 2060 should be smaller than the differences between RT and non-RT modes on consoles, which have higher raster performance than the 2060 but take a much bigger performance hit from RT. Yet the consoles still offer RT modes, which further validates the viability of its use on the 2060.

EDIT: I mean, just look at the nonsense of this summary chart:

[Attached summary chart from the video]

Every game, with the exception of Cyberpunk and Alan Wake 2, can offer a MINIMUM of 30fps here with RT enabled... at DLSS QUALITY. And with Cyberpunk the lowest RT setting they use is Ultra (WTF?), while Alan Wake 2 doesn't even use RT on the PS5 and XSX (yet I would expect that using DLSS Performance, or worst case Ultra Performance, here would still net you 30+ fps).
 
Yes, you can potentially claim that the performance hit and associated compromises to other graphical settings might make activating RT not worth it, but that's a user choice and totally different to saying RT isn't viable.
Exactly. Steve's personal vendetta against RT has been taking him to some weird places. One of these places is him claiming that RT isn't viable as if you're somehow forced to use it - or as if there was a better option for RT at this price than the 2060 he's looking at.

The 2060 launched in Jan 2019 at $350; its competition at the time was a mix of cheaper, obsolete Polaris cards and a more expensive Vega 56, and none of them did any better than the 2060 *in games without RT* in Steve's own launch benchmark. Now it's revision time, apparently.
 
Looks like Gamers Nexus answered your question comprehensively. The X3D clearly dominated the X of the same generation when playing at 4K.

Gee, sounds like something we've talked about very recently.

This is why I hate YouTube videos, but I parsed through the GN video you linked, and all the games seem to use 1080p, including 1080p low.

Secondly, the other issue is that GN, at least in that video, does lean towards CPU-driven games. But we weren't talking about that specific GN video, were we? The premise again was using low resolutions on GPU-driven games, primarily SP games that are graphics-driven (and typically FPS/TPS), to showcase future performance. Games like Alan Wake 2, Cyberpunk, etc.

So I'm not seeing what that GN video is supposed to show. It didn't include any of the latest GPU-driven games, nor did it test at 4K. So how can it be showing the X3D dominating the X CPUs at 4K?

The TPU review I linked actually has tests at both 4K and 1440p. That one shows plateauing numbers for CPUs, especially at 4K. Which still seems to support the idea that, going forward, if you're looking at 4K max settings, those games typically plateau on anything other than low-end CPUs due to how GPU-demanding they are at that resolution.
 
Exactly. Steve's personal vendetta against RT has been taking him to some weird places. One of these places is him claiming that RT isn't viable as if you're somehow forced to use it - or as if there was a better option for RT at this price than the 2060 he's looking at.

The 2060 launched in Jan 2019 at $350; its competition at the time was a mix of cheaper, obsolete Polaris cards and a more expensive Vega 56, and none of them did any better than the 2060 *in games without RT* in Steve's own launch benchmark. Now it's revision time, apparently.
Epic points about the vendetta Steve totally has.👌

It's Tim's video though.
 
The 2060 offers roughly premium-console-level RT capability, so at a minimum it should be able to offer a roughly console-level RT experience, which should absolutely be considered viable
Couldn't have said this better myself.

The video is happy to set games to their lowest quality preset (which generally looks horrible) but doesn't use DLSS Performance mode, which to many would have a much smaller visual impact :/
They couldn't be more absurd with their standards or weirder with their logic! Unbelievable!
 
Your claims are seriously bordering on the ridiculous.
Some crazy mental gymnastics going on in that video to try and justify that GPU not being viable for RT use. The 2060 offers roughly premium-console-level RT capability, so at a minimum it should be able to offer a roughly console-level RT experience, which should absolutely be considered viable - and as far as I can see from this video, it does.
I understand why you'd want to use consoles as the benchmark for PCs, since according to the Steam hardware survey, consoles are more powerful than ~85% of PCs. However, console players do not view 30 fps as a viable way to play games. We have information directly from Sony saying that 75% of users choose to play at 60fps over 30fps with improved graphics and raytracing. So no, the 2060 is not viable for raytracing, because raytracing at 30 fps is not viable to the majority of players on consoles.
Dismissing 30fps as unplayable, despite that being the way pretty much every current-gen console gamer will be playing RT-enabled games in the vast majority of cases, is bizarre.
See above, this is very much false.
And restricting the absolute bottom-end RT-capable product to Quality upscaling settings without looking at DLSS Performance is equally bizarre. Yes, this is at 1080p so you'd be talking about a 540p base resolution, but with DLSS that's still better image quality than you'll get in many console RT titles using FSR with a higher base resolution, and given the age and performance tier of this GPU, expecting some image quality compromises if you want to use RT should be a given.
The bolded claim is false. DLSS from 540p might be better than FSR from 540p, but it's not even better than native 1080p, let alone games that use FSR with a base input resolution > 1080p. I've tested this myself on multiple Nvidia GPUs and monitor combinations. What's tolerable may differ from person to person; however, one thing that is very obvious is how DLSS performs when the input resolution is less than 1080p. Its flaws are very visible; I don't even need to put on my glasses to see them.
The video is happy to set games to their lowest quality preset (which generally looks horrible) but doesn't use DLSS Performance mode, which to many would have a much smaller visual impact :/
DLSS Performance is only viable when the output resolution is >= 4K, so an input resolution >= 1080p. Below that, its flaws are way too obvious to ignore.
Yes, you can potentially claim that the performance hit and associated compromises to other graphical settings might make activating RT not worth it, but that's a user choice and totally different to saying RT isn't viable. Plus, the difference in graphics settings, resolution and/or framerate from turning RT on on a 2060 should be smaller than the differences between RT and non-RT modes on consoles, which have higher raster performance than the 2060 but take a much bigger performance hit from RT. Yet the consoles still offer RT modes, which further validates the viability of its use on the 2060.
Imagine how biased one would have to be to suggest playing at 30 fps on PC with a mouse? Completely ridiculous, really. On PC, the minimum expected framerate is 60fps, as 30fps usually has poor frame pacing and mouse responsiveness. Furthermore, if you play on an OLED screen, 30fps is not viable at all. It looks like a slideshow due to OLED's fast pixel response time and near-total lack of persistence blur. The 2060 has never been and will never be viable for raytracing. It's a card you use to run benchmarks to see what raytracing looks like, but not to play raytraced games. I owned a 2060 at launch and I quickly got rid of it once I realized how useless it was for raytracing. I went from 2060 -> 2070 Super -> 3080 -> 6800 XT -> 3080 -> 4070 -> 4090 -> 4080 Super. Of the 2000-series RTX cards, only the 2080 Ti and 2080 were "viable" for ray tracing.
 
Mod mode: very yes. The personal attack snipes of "ridiculous claims" and "how biased" need to stop.

There is no hard answer to "what is better, framerates or graphics?", nor is there any hard answer to "is 30fps acceptable?" The answer to both is "it depends", and for very good reason. 60FPS isn't necessary to enjoy a flight sim, or a turn-based strategy game, or most real-time strategy games, or most top-down scrollers. However, higher framerates matter a lot more in twitch shooters, although further to that point, graphics quality is rather pointless there -- all the pro / league players of those games would enable 256-color untextured flat-shaded polygons if the sliders allowed it.

As such, let's treat the topic with the nuance it deserves, and the nuance we expect from Beyond3D forum participants.
 
Personally, I think a hardware enthusiast website or YouTube channel should take a "pro new tech" stance, so I generally don't read or watch a site that's critical of new techs without actually understanding the nuances. Of course, I'm not saying they shouldn't exist; it's just that I'm not going to read/watch them. The problem with many of today's hardware websites and YouTube channels is that they go for sensationalism, either out of their own strong personal opinions, or because they want to attract viewers through controversy. I don't like either of these.

The hard truth is that it's difficult for a new tech to gain market share, and the more "different" the tech is, the more difficult it gets. This is why, even though we know ARM CPUs are more efficient, most people are still happily using x86 CPUs. So when there are new techs, I'm more willing to cut them some slack. Obviously, vendors are always going to oversell their special new techs; that's to be expected. However, I do think a hardware website or YouTube channel should be able to analyze fairly, otherwise why do we need them? Furthermore, in many situations a new tech has to start somewhere. It won't be perfect immediately. If the market just can't tolerate an imperfect new tech, it's more likely that we won't have new tech at all, or will get it at a much slower pace, because the vendors who don't innovate on new techs tend to win on cost and efficiency.

Take this "2060 is not good at raytracing" video, for example. I didn't watch it and I don't intend to watch it, but saying "2060 is not good at RT so it's 'misleading customers'" is just bad. Is 2080 Ti's RT good enough? If it is, then what do you expect NVIDIA to do? Make 2060 with no RT? Is that good for the industry? I don't think so. Not to mention that people'd just say "NVIDIA is deliberately keeping RT to very expensive cards" or something like that.

Note that I'm not saying people should buy into unproven new techs. They should not, in general. But as people mentioned, the 2060 was a good GPU for its time even without RT. So it's really a question of "do you want to pay a little more for a card that's faster and also has this new tech?" I think it's quite reasonable for people to make that decision; even if the new tech didn't pan out as expected, people still got a reasonably good GPU.
 
I understand why you'd want to use consoles as the benchmark for PCs, since according to the Steam hardware survey, consoles are more powerful than ~85% of PCs

This isn't really accurate. Ignoring mobile parts, around 37% of PC GPUs in the Steam Hardware Survey are capable of delivering a broadly equivalent experience. That's taking the slowest such parts to be the 3060 and 2070Ti, which may sometimes lag behind in raster at matched input resolutions but can often be ahead with RT and/or when matching output image quality thanks to DLSS.

However, console players do not view 30 fps as a viable way to play games. We have information directly from Sony saying that 75% of users choose to play at 60fps over 30fps with improved graphics and raytracing

So 25% choose to play at 30fps. Meaning that option is perfectly viable? Not only that, but the overwhelming majority of games from the previous 2 generations of consoles were 30fps. I'm pretty sure no-one would claim the PS3 and PS4 generations of consoles were not viable gaming machines.

The bolded claim is false. DLSS from 540p might be better than FSR from 540p, but it's not even better than native 1080p, let alone games that use FSR with a base input resolution > 1080p.

This isn't the comparison that was made. A 2060 would generally not need to be running DLSS Performance mode at 1080p to match a console's performance at native 1080p in an RT-enabled game. The real-world scenario here is all those console games that use internal resolutions in the 700-800p range and upscale with FSR2. DLSS upscaling from 540p should produce an image comparable to or better than that.

I've tested this myself on multiple Nvidia GPUs and monitor combinations. What's tolerable may differ from person to person; however, one thing that is very obvious is how DLSS performs when the input resolution is less than 1080p. Its flaws are very visible; I don't even need to put on my glasses to see them.

No-one claimed it would be a flawless output. In fact, I said the exact opposite in the quote that you were responding to: "and given the age and performance tier of this GPU, expecting some image quality compromises if you want to use RT should be a given". The question here is whether those compromises are considered "viable", which is a matter of personal preference, and thus by definition the GPU is viable for this scenario to anyone who considers those compromises acceptable. And the guidepost for whether people would consider this viable is the console market, where, as you say, around 25% of people game with this or even worse image quality thanks to FSR.

DLSS Performance is only viable when the output resolution is >= 4K, so an input resolution >= 1080p. Below that, its flaws are way too obvious to ignore.

This is pure personal opinion which does not tally with the reality that many console games ship with much poorer image quality than this and are still enjoyed by millions.

As a counter personal opinion, I routinely game at 3840x1600 using DLSS Performance with an input resolution of 1920x800... on a 38" monitor, which is likely much larger than what your average 2060 gamer is using. And I find it more than acceptable, even against the expectations of 4070 Ti-tier gaming.

Imagine how biased one would have to be to suggest playing at 30 fps on PC with a mouse? Completely ridiculous, really.

Ah yes, you've got me there. If only it were possible to play PC games with a control pad.

On PC, the minimum expected framerate is 60fps, as 30fps usually has poor frame pacing and mouse responsiveness.

You might want to tell that to the tens of millions of PC gamers still playing on Pascal or lower tiers of hardware who are very likely not playing at 60fps on a regular basis. And if they are, they're very likely not doing so with greater-than-4K-DLSS-Performance levels of image quality, which you also claim to be the minimum bar for viability.

Furthermore, if you play on an OLED screen, 30fps is not viable at all. It looks like a slideshow due to OLED's fast pixel response time and near-total lack of persistence blur.

Which I'm sure is far more relevant to all the RTX 2060 gamers out there rocking OLED monitors than it is to the console gamers playing at 30fps on TVs.
 