Value of Hardware Unboxed benchmarking

B3D is a special place :). All the sentiment I've seen outside of here is that the game looks fine and runs well. Either the graphics aren't mentioned or it's stuff like this:

I'd rather go back to Morrowind water than look at this SSR abomination. Get that shit outa here. If they don't wanna do RT then fine but SSR is not suitable for bodies of water.
 
In a game that looks like this it’s not good at all.

I think the game looks fine. It doesn’t look cutting edge or super “next gen” but it’s not bad. I think it has a pretty realistic art direction and colour palette that makes it look much better than Dragon Age which likely has much better tech but terrible art design.

I think mainly it seems people are impressed with how it scales down to run on older systems. It seems that in the reviewer community people really don’t like upscaling. I’ll pretty much always use it to increase performance. There’s no such thing as good enough when it comes to fps.
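As a side note on what upscaling actually buys: the per-axis scale factors below are the commonly documented defaults for DLSS/FSR-style quality presets (actual games and upscaler versions may differ), and the function names are illustrative, not from any real API. A minimal sketch:

```python
# Hypothetical illustration: internal render resolution for common upscaler
# quality presets. Per-axis scale factors are the commonly documented
# defaults for DLSS/FSR-style upscalers; real titles may deviate.
SCALE = {"quality": 0.667, "balanced": 0.58, "performance": 0.5}

def internal_resolution(out_w, out_h, preset):
    """Return the (width, height) the GPU actually renders before upscaling."""
    s = SCALE[preset]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(3840, 2160, "quality"))      # ~2561 x 1441
print(internal_resolution(3840, 2160, "performance"))  # 1920 x 1080
```

So "4K Performance mode" shades roughly a quarter of the pixels of native 4K, which is why the fps gains are so large.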
 
The fact that only the 4090 and 5090 manage more than 60 fps at 4K with the "experimental" preset is a fun thing though, considering the performance hit it carries for what is pretty much zero improvement to the graphics. You'd think this is where some RT would be beneficial, but apparently this is somehow better according to Steve.
This gets us back to the old subject of high-end GPUs not achieving high fps even without ray tracing. The 4090 is barely able to do 45 fps at native 4K in many UE5 titles (Immortals of Aveum, Fort Solis, Lego Horizon Adventures, and many others), yet I don't see Hardware Unboxed calling those titles bad. Whereas if any of them had ray tracing, they would be screaming from the rooftops about ray tracing costing so much performance.
 
I don’t think it looks bad. It just doesn’t look like anything that should demand that sorta performance. It is very much last gen.
 
This is what I find fascinating about this issue.

Roughly 10-15 years ago, the predominant sentiment was that PC gaming performance demands were way too low and that there was negligible separation based on spend. People wanted developers to optimize for the PC by pushing the hardware further, so there was a reason to buy newer and more expensive GPUs. The feeling was that more spend just meant more resolution/fps without any real change in fidelity.

Now the sentiment is that performance requirements are too high and that there is too much separation based on spend. People want developers to optimize so that as much hardware as possible can run the game, avoiding the need to upgrade, with the only difference for higher-end hardware really being more resolution/fps.

Well, the vocal community in enthusiast circles, anyway. I think, as always, the constant is that the larger actual gaming population just keeps chugging along and is actually playing the games.
 
I don’t see those as opposing positions; both are true. PC requirements in the past were largely a matter of running at native resolution vs consoles, plus maybe a few upgrades here and there: anti-aliasing, AO, etc. It’s also true that PCs typically don’t gain much in visuals (aside from resolution) for the additional performance cost vs consoles.

I play mostly older games, and while I love the crisp IQ that PCs offer, the underlying art assets, shading, particle effects, etc. are decidedly mediocre. So when I see the same things in a 2025 game, it’s disappointing.
 
Sorry, I don't understand this word. You mean benchmarking? Or is this a shorthand for pixel counting?
I've noticed something about some gamers. They'll run a PC hardware benchmark like Kingdom Come: Deliverance 2 and not even pay attention to the performance metrics or image quality. They don't even know what a bounding volume hierarchy is. They "play" the benchmark and find some entertainment in this. They'll run around in the world and interact with NPCs, complete quests, engage in combat, and generally do a bunch of stuff that doesn't involve 99th-percentile frametimes or deducing the native rendering resolution. I don't get it.
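For anyone outside the benchmarking bubble, the "99th-percentile" metric joked about above is easy to illustrate. A minimal sketch (function names are my own, not from any benchmarking tool): given a frametime trace in milliseconds, it computes average fps and the "1% low" fps implied by the 99th-percentile frametime.

```python
# Minimal sketch of common benchmark metrics: average fps and "1% low" fps,
# i.e. the fps implied by the 99th-percentile frametime.
def fps_metrics(frametimes_ms):
    """Return (average_fps, one_percent_low_fps) for a frametime trace."""
    times = sorted(frametimes_ms)
    avg_ms = sum(times) / len(times)
    # 99th-percentile frametime: a frame slower than 99% of all frames
    p99_ms = times[min(len(times) - 1, int(len(times) * 0.99))]
    return 1000.0 / avg_ms, 1000.0 / p99_ms

# One 33.3 ms hitch among 99 smooth 16.7 ms frames barely moves the average
# but dominates the 1% low:
avg_fps, low_fps = fps_metrics([16.7] * 99 + [33.3])
print(round(avg_fps), round(low_fps))  # 59 30
```

This is why reviewers lean on percentile lows: a single traversal stutter barely dents the average fps but shows up immediately in the 1% figure.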
 
I'm sure it plays great but I'm too much of a graphics snob to subject myself to this in 2025.
So an aside: I don’t really like the wider AMD-sphere online but I remember a particular Radeonhead saying something in a Reddit comment that went something like “fans of Nvidia and RTX don’t buy GPUs to play games, they buy games to use their new graphics cards”.

The point being, there is more to a game than being a tech demo for whatever tech Nvidia is pushing, and frankly closing yourself off to games that might look a bit outdated means you are going to miss most of the good ones.
 
That's ok. There's plenty on my "to be played" (Hogwarts, Alan Wake 2, Indy in progress atm) list that meet my baseline for quality.

This game is miles away from worrying about needing RTX to look better. They could start with the textures and assets. There are so many games these days that there's something for everyone and their criteria. I'll assume, in keeping with the dark ages, this game doesn't even support HDR?
 
It always amuses me when people comment on this forum about graphics not being important.

On this forum. Lol.

I think it's more that some can appreciate a project that seems to have largely fulfilled a well-defined scope. It scales extremely well, is relatively stable, and has no shader/traversal stutters or VRAM concerns to boot; these are not exactly attributes as commonplace in PC releases as we would all like these days.
 
All good things for sure. It’s also fine to point out that all those great things aren’t being delivered at the same fidelity as other modern games. I don’t think that should translate into “cutting edge graphics don’t matter”.
 
I want someone to convince me that cutting edge graphics matter, because they matter to me but seemingly not at all to most gamers. A couple of days ago I was watching a massive Twitch streamer play KCD2, and he commented that the game looks really good. The overwhelming consensus is that the game looks good and runs well.
 
It is really hard to ignore the shadow and reflection artifacting unless you are blind in this game 🤷‍♂️
I like the gameplay, but the rendering artifacts are killing the desire to play this game for me :runaway:
 
How did you manage to play games before the year 2021 then?

It always amuses me when people comment on this forum about graphics not being important.

On this forum. Lol.
I didn't say they weren't important, just that if using the latest graphics tech was your main filter for which games you'd play, you'd probably be playing mostly junk.
 
It is really hard to ignore the shadow and reflection artifacting unless you are blind in this game 🤷‍♂️
I like the gameplay, but the rendering artifacts are killing the desire to play this game for me :runaway:
IDK how people look past SSR artifacting without even seeming to notice it. It is a nonsense effect used to make screenshots look good. In gameplay it does more harm than good.
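For readers unfamiliar with why SSR artifacts the way it does: the technique marches the reflection ray through the already-rendered depth buffer, so anything not visible on screen simply has no data to reflect. A toy sketch of that limitation (all names are illustrative, not engine code):

```python
# Toy sketch of screen-space reflection (SSR) ray marching. The ray walks
# through screen space comparing its depth against the depth buffer; when it
# leaves the viewport, the reflection data simply does not exist, which is
# the source of the edge artifacts and disappearing reflections complained
# about above.
def ssr_trace(px, py, pz, dx, dy, dz, depth_buffer, steps=64):
    """March a reflection ray in screen space; return hit pixel or None."""
    for _ in range(steps):
        px, py, pz = px + dx, py + dy, pz + dz
        if not (0 <= int(px) < len(depth_buffer[0])
                and 0 <= int(py) < len(depth_buffer)):
            return None  # ray exited the viewport: nothing to reflect
        if depth_buffer[int(py)][int(px)] <= pz:
            return (int(px), int(py))  # hit: reuse that pixel's colour

    return None  # ran out of steps without hitting geometry

# A 1-row scene with a "wall" at x=3 (depth 1) in front of a far background:
depth = [[10, 10, 10, 1] for _ in range(4)]
print(ssr_trace(0, 0, 0, 1, 0, 0.5, depth))   # (3, 0): wall is on screen
print(ssr_trace(0, 0, 0, 0, -1, 0, depth))    # None: ray left the screen
```

This is why large, flat water is SSR's worst case: it reflects the sky and shoreline above the screen edge, exactly the region the buffer never contains, so the reflection smears or vanishes as the camera tilts.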
 