Value of Hardware Unboxed benchmarking

A bit of history about APIs... During Half-Life 2's development, Valve had to create multiple rendering systems spanning three DirectX versions: DX7, DX8 and DX9. The differences between these three APIs were vast, each requiring different data storage, programming languages and rendering approaches. In the end Valve had to create nine rendering systems and spent a very long time making sure all of them looked consistent.

Contrast this with today, where Hardware Unboxed seems to think that ray tracing is bad because it forces developers to maintain two rendering paths in their games (one for raster and one for RT). Someone needs to brush up on their history.

It was necessary. From 2000 to 2004 there were three different DX versions and huge advancements in rendering. And Nvidia is the reason DX7 was still a factor in 2004, thanks to the GeForce 4 MX...
Today we have Epic, who think UE5 should run on 10-year-old GPUs like Maxwell, which keeps them from building a proper engine around modern features. 4A Games, by contrast, developed a full DXR renderer for Metro Exodus EE.
 

The crusade continues.
Considering that it's id tech you could probably run maxed out textures with low texture streaming pool even on 8GB cards without issues.
The quoted source even says as much:
Yes, Indiana Jones and the Great Circle actually has not only an Ultra preset, but also a Giga and a Hyper preset, the latter being the maximum. The actual graphics options between Ultra and Hyper are the same except for anisotropic filtering, which only increases from 8× to 16×. What actually changes with Giga and Hyper is the texture pool size, which has long been a quirk of the id Tech engine.

So the game reserves, and presumably allocates, a fixed amount of graphics card memory (the "pool"), and if this size is not enough, the game stutters. It doesn't matter whether that much VRAM is actually needed at the time. This has the disadvantage that the game sometimes runs worse than it theoretically could when memory is short, but it has the advantage that memory allocation and occupancy barely change while playing. So if the graphics card memory is enough at the beginning of a game, it will probably also be sufficient at the end.

Unfortunately, it is virtually impossible to say how much VRAM the game really needs. The game allocates around 14 GB in the benchmark with the Hyper preset, 13 GB with Giga and a little more than 12 GB with Ultra - so the differences are small. With a 16 GB graphics card, there were no problems with the Hyper preset in Ultra HD, even after playing for a long time. It is unclear whether this also applies to a 12 GB graphics card.
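To make the quoted pool behaviour concrete, here is a minimal sketch of a fixed-size, LRU-evicting texture pool. This is only an assumption about how such a pool might behave, not id Tech code; every name and number below is made up for illustration.

```cpp
// Toy fixed-size texture pool with LRU eviction (illustrative only).
#include <cstddef>
#include <cstdint>
#include <list>
#include <unordered_map>

struct Texture { uint64_t id; size_t bytes; };

class TexturePool {
public:
    explicit TexturePool(size_t poolBytes) : capacity(poolBytes) {}

    // Returns true if the texture is already resident. On a miss the caller
    // has to stream it in, and we evict least-recently-used textures until it
    // fits; with an undersized pool this constant evict/re-stream churn is
    // what shows up as stutter or texture pop-in.
    bool request(const Texture& tex) {
        auto it = resident.find(tex.id);
        if (it != resident.end()) {
            lru.splice(lru.begin(), lru, it->second); // hit: mark as recently used
            return true;
        }
        while (used + tex.bytes > capacity && !lru.empty()) {
            const Texture& victim = lru.back();       // evict coldest entry
            used -= victim.bytes;
            resident.erase(victim.id);
            lru.pop_back();
        }
        if (used + tex.bytes > capacity) return false; // larger than the whole pool
        lru.push_front(tex);
        resident[tex.id] = lru.begin();
        used += tex.bytes;
        return false;                                  // caller streams the data in
    }

private:
    size_t capacity;
    size_t used = 0;
    std::list<Texture> lru;  // most recently used at the front
    std::unordered_map<uint64_t, std::list<Texture>::iterator> resident;
};
```

The point is that shrinking the pool doesn't reduce what the scene wants to display; it just forces more eviction and re-streaming, which is where the stutter comes from when the pool is undersized.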
But hey you gotta peddle the agenda which you've been peddling. We're almost in 2025 now and games where 8-10-12GBs aren't enough are still hard to come by!
 

The crusade continues.
Considering that it's id tech you could probably run maxed out textures with low texture streaming pool even on 8GB cards without issues.
The quoted source even says as much:

But hey you gotta peddle the agenda which you've been peddling. We're almost in 2025 now and games where 8-10-12GBs aren't enough are still hard to come by!
Nah, there is a point where the texture streaming pool does affect texture quality in this engine, but it's mostly in the distance; that's how it was in Doom. We have yet to see a tester compare texture quality across the different texture pool options. Computerbase didn't write anything about texture quality.

There's a chance that texture quality itself doesn't change much even on the lowest texture pool setting, but given how the game integrates ray tracing (which needs VRAM), I don't think that's likely. Perhaps on the low option texture quality will be significantly degraded but fine at medium and above.

We will have to see. But please don't defend 8 GB cards; it really gets tight, and buyers who buy new cards with 8 GB will have a card with a severe Achilles' heel that won't last long. And no, we have many, many titles where 8 GB of VRAM is simply not enough even on moderate settings. The problem is that in many games the issues arise after a certain amount of playtime, so you won't see them in benchmarks; let me tell you that as a user of a 6 GB card. The difference between benchmarking and actual play is extreme.
 
But please don't defend 8 GB cards; it really gets tight, and buyers who buy new cards with 8 GB will have a card with a severe Achilles' heel that won't last long.
A. No one is saying that 8 GB is enough forever. However, it is still enough now and will likely be for some time. People like Steve are distorting a reality in which the majority of PC GPUs actually have 8 GB of VRAM or less - and yet people somehow still play all these new games.

B. If you stopped demanding that games have to run on such cards, you'd be doing all these people a huge disservice. This is what Steve is doing with his constant peddling of cards with more VRAM - making it sound like developers shouldn't care about VRAM usage because apparently it's impossible to play on 8 GB anyway. Guess what - this may actually happen faster if people like Steve keep asking for it.

C. Note that this is about 8GB cards. I've been listening to Steve say how bad these are since at least 2020. We're now in the fifth year of these cards supposedly not being suitable for gaming according to Steve - and yet I still can't think of more than a dozen titles that actually run into issues on them when you blindly set maximum settings. In this game in particular you can lower the texture pool (as in all recent id Tech titles) and get playable framerates with minimal issues - according to the PCGH bench you would be in the 40s to 50s with such settings at 1440p. This seems like a solid enough bottom line for performance here, no need to play a "spot the VRAM size" game at all.

Also - you can actually play this one on a 2060 6GB at 30 fps.
 
Ha, who could've guessed that Steve would rejoice that ultra mega supreme (literally) settings don't work on 8GB cards.

8GB isn't enough, but come on dude, at least try to have some kind of journalistic integrity.
 
Note that even if a game seems fine in benchmarks, playing for a reasonable amount of time (more than a benchmark run) often yields a very different experience. I've seen benchmarks showing 12GB being enough for some games only to find out that's not exactly the case when I'm actually playing the game.
 
For my video I made sure to play the game for 2 hours on each GPU type: 8, 10 and 12 GB. No frame gen though.
There you can see that later levels are more VRAM-heavy than earlier ones. The Vatican is about 3 hours into the game, I think, and it is the first hub area that stresses VRAM more and thus reveals extra bottlenecks.
 
B. If you stopped demanding that games have to run on such cards, you'd be doing all these people a huge disservice. This is what Steve is doing with his constant peddling of cards with more VRAM - making it sound like developers shouldn't care about VRAM usage because apparently it's impossible to play on 8 GB anyway. Guess what - this may actually happen faster if people like Steve keep asking for it.
It's perfectly consistent to claim both that games should be able to run on 8GB cards and that you shouldn't buy a GPU with that amount of memory today if you want to avoid compromises.
 
It's perfectly consistent to claim both that games should be able to run on 8GB cards and that you shouldn't buy a GPU with that amount of memory today if you want to avoid compromises.
So which one would you buy then at, say, $300?
Arc A, Radeon and GeForce all have 8 GB of memory at this price point right now.
Arc B will have 12, but if it has similar issues to Arc A, would you actually buy it just for the VRAM, and would that actually guarantee a lack of compromises?
Did buying a 16GB card in 2020 lead to many titles running on it without compromises these days?

It's completely misleading to use VRAM comparisons as if VRAM alone decides what compromises one will have to make in the future.
 
There's nothing wrong with buying a card with a higher amount of VRAM. But the texture streaming setting just allocates VRAM without any benefit, and it's a static number across all resolutions.
I did the test from Low -> Supreme:
1080p: 7.3 GB -> 10.3 GB
2160p: 8.9 GB -> 12 GB

The texture quality always seems to be the same. This game doesn't even have a texture quality setting...
 
I think a larger VRAM allocation helps when there's an unexpectedly heavy streaming load, e.g. in some unfortunate rare case where you suddenly need a texture that happened to have just been purged. However, even in such a situation it's probably just a few frames of lower texture resolution. A good engine should not block just because a higher-resolution MIP level was not loaded yet.
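A minimal sketch of that "never stall on a missing MIP" idea, purely as an assumption about how streaming fallback is commonly handled, not any specific engine's code:

```cpp
// Pick the best currently-resident MIP instead of blocking (illustrative only).
#include <cstdint>
#include <optional>

constexpr int kMipCount = 12;

struct StreamedTexture {
    // Bit i of residentMask set == MIP level i is in VRAM (0 = full resolution).
    uint32_t residentMask = 0;

    // The renderer asks for `wantedMip` (derived from the screen-space footprint).
    // Instead of stalling, return the closest resident level that is equal or
    // coarser, and let the wanted level stream in asynchronously.
    std::optional<int> selectMip(int wantedMip) const {
        for (int mip = wantedMip; mip < kMipCount; ++mip) {
            if (residentMask & (1u << mip)) {
                return mip;            // render this now, possibly a bit blurrier
            }
        }
        return std::nullopt;           // nothing resident yet: use a placeholder
    }
};
```

The worst case on a miss is then a blurrier texture for a few frames while the finer level streams in, rather than a hitch.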
 
So which one would you buy then at, say, $300?
Arc A, Radeon and GeForce all have 8 GB of memory at this price point right now.
Arc B will have 12, but if it has similar issues to Arc A, would you actually buy it just for the VRAM, and would that actually guarantee a lack of compromises?
Did buying a 16GB card in 2020 lead to many titles running on it without compromises these days?

It's completely misleading to use VRAM comparisons as if VRAM alone decides what compromises one will have to make in the future.
The claim is not that having enough VRAM is sufficient to avoid compromises. The claim is that it is a necessary condition. For example you can have enough VRAM and your games can still be plagued with graphical issues due to terrible drivers. The purpose of a review is to evaluate all factors, not just VRAM, and come to an overall conclusion. I don't think Steve's idea is that we replace all quantitative and qualitative analysis with the question "is there enough VRAM?".

However, at a given price point and target resolution there will be an expectation of playing with certain level of settings, and whether that is achievable will depend on the amount of VRAM.
 
Note that even if a game seems fine in benchmarks, playing for a reasonable amount of time (more than a benchmark run) often yields a very different experience. I've seen benchmarks showing 12GB being enough for some games only to find out that's not exactly the case when I'm actually playing the game.
No no, 8 GB is clearly enough for everyone; you'll know when that changes. Based on leaks it won't be next gen yet, though.
 
Does anyone know what video @Dictator was talking about? The one with 2 hour test runs. I only see the Indy video about Xbox. Is this an onlyfans exclusive or something?
 
Why is the streaming falling down on smaller caches? What assumption behind the virtualised textures is wrong, or at least what worst-case scenario makes the standard caching fail? Virtual textures should hit a perfect set of predictable operating parameters.
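To put rough numbers on that intuition: the resident working set of a virtual texture system is tied to what's on screen, not to scene size, so it should indeed be fairly predictable. A back-of-the-envelope estimate follows; every number below is an illustrative guess, not a measurement from this game.

```cpp
// Rough working-set estimate for a virtual texture page cache (guesses only).
#include <cstdio>

int main() {
    const double screenPixels  = 3840.0 * 2160.0;  // 4K output
    const double pageTexels    = 128.0 * 128.0;    // a typical VT page size
    const double texelsPerPix  = 3.0;              // unique texels sampled per pixel (guess)
    const double layers        = 3.0;              // albedo + normal + roughness, say
    const double bytesPerTexel = 1.0;              // roughly BC7-class compression

    const double pagesNeeded = screenPixels * texelsPerPix * layers / pageTexels;
    const double cacheBytes  = pagesNeeded * pageTexels * bytesPerTexel;

    std::printf("~%.0f resident pages, ~%.1f MB for the visible set\n",
                pagesNeeded, cacheBytes / (1024.0 * 1024.0));
    return 0;
}
```

With guesses in that ballpark the visible set comes out to well under 100 MB, a tiny fraction of a multi-gigabyte pool, which is exactly why it's surprising that smaller pool sizes cause trouble; presumably the pool also has to hold prefetch margin, mip chains and other data beyond the strictly visible pages.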
 