Value of Hardware Unboxed benchmarking

Yeah, once FSR and XeSS came out. Before then it was "screw RT and upscaling, you shouldn't need it, 4K native or go home". Unfortunately for them, eventually you do have to come to terms with reality.

To be fair, you were possibly right, because I did stop watching them after the Ampere/RDNA 2 releases. So I went and looked at their recommended GPU video from two months ago to see whether you're actually right or not. In the $1,000 segment they say the 4080 Super is the superior pick, and that if you don't care about Nvidia's features the 7900XTX is an OK buy because it's around 50 USD less. We can call that a draw, although if you're dropping $1k on a GPU, is 50 bucks really a deal breaker?

In the next segment he mentions the 7900XT being faster at native 4K as a win, then mentions that overall the 4070Ti is 15% more expensive but 16% faster across all games, then says he leans towards the 7900XT here because it provides more value. I didn't watch any further, because I think that confirmed I'm not out of touch with reality. Don't worry, I don't expect an apology.
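As a rough sanity check on that value claim, here is a minimal perf-per-dollar sketch. The absolute prices below are placeholders picked for illustration; only the "15% more expensive, 16% faster" relationship comes from the video as described above.

```python
# Perf-per-dollar sketch using the figures quoted above.
# The $700 baseline price is a placeholder, not a figure from the video;
# only the 15% price / 16% performance deltas come from the post above.

xt_price, xt_perf = 700.0, 100.0            # 7900XT as baseline (hypothetical price)
ti_price, ti_perf = xt_price * 1.15, 116.0  # 4070Ti: 15% pricier, 16% faster

for name, price, perf in [("7900XT", xt_price, xt_perf),
                          ("4070Ti", ti_price, ti_perf)]:
    print(f"{name}: {perf / price:.4f} performance points per dollar")

# Both land within roughly 1% of each other, with the marginal edge going to
# the 4070Ti, so a "more value" call here rests on features and rounding.
```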
Prior to FSR2's release it was still early days for DLSS; we're talking 2020 and 2021, before RT was popular (it still isn't, really, but it's a lot more ubiquitous than back then).

The 4070ti was a bad product so I’m not sure what you mean in your final paragraph. I wouldn’t recommend either of those GPUs but if I had to pick I would also go with the 4070ti. It had far too little VRAM for the price.
 
The 4070ti was a bad product so I’m not sure what you mean in your final paragraph. I wouldn’t recommend either of those GPUs but if I had to pick I would also go with the 4070ti. It had far too little VRAM for the price.
Here's a hint: if you buy 2 socks for 5 bucks and 4 socks for 10 bucks, the 5-buck option is not better value. BTW, you just said you would buy the 4070Ti, but HWU said he is leaning towards the 7900XT because it's better value, so you don't agree with him either. I think we are done here.
 
However, that binary bucketing of real-time graphics into RT and non-RT parts is such an overly simplified way of thinking about it, to the point where it clearly actively harms reasoning (and here on Beyond3D, discourse) about the whole. We could do with just moving past that to think and talk about the much more complicated interplay of everything that constitutes real-time graphics together.
It's a language issue. However, it's a difficult one to solve, I think, because a full conversation requires specifying which aspects of image generation one is talking about, and which techniques are being used to solve them. We have model representation and deconstruction, geometry drawing, material shading, lighting, and optimisations like LOD. And even ML inference. We can solve some of these by mathematically rasterising, or by tracing rays, and/or using acceleration structures. A BVH can be used for more than just ray-tracing geometry.
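As a toy illustration of that last point (a minimal sketch with made-up names, not anyone's actual engine code): the same BVH built over object bounding boxes can answer non-ray queries, e.g. "which objects overlap this box?" for region culling or a collision broad-phase.

```python
# Toy illustration only: a BVH over object AABBs answering a non-ray query
# ("which objects overlap this box?"), as used for region culling or a
# collision broad-phase. No rays involved anywhere.

from dataclasses import dataclass
from typing import Any, List, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class BVHNode:
    lo: Vec3                           # AABB min corner
    hi: Vec3                           # AABB max corner
    left: Optional["BVHNode"] = None
    right: Optional["BVHNode"] = None
    payload: Any = None                # set on leaves: the object in this box

def aabb_overlap(a_lo: Vec3, a_hi: Vec3, b_lo: Vec3, b_hi: Vec3) -> bool:
    return all(a_lo[i] <= b_hi[i] and b_lo[i] <= a_hi[i] for i in range(3))

def query_region(node: Optional[BVHNode], q_lo: Vec3, q_hi: Vec3, out: List[Any]) -> None:
    """Collect every leaf object whose bounds intersect the query box."""
    if node is None or not aabb_overlap(node.lo, node.hi, q_lo, q_hi):
        return
    if node.payload is not None:       # leaf
        out.append(node.payload)
        return
    query_region(node.left, q_lo, q_hi, out)
    query_region(node.right, q_lo, q_hi, out)
```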

An ideal conversation would eschew the blanket terms and focus more on the specifics of the current argument and POV ("GPUs should focus less on accelerating triangle intersects and more on accelerating maths used in more non-ML activities") but human language is always going to gravitate towards short-form terms. Which in itself might be a cause of wider conflicts where so many debates boil down to an A/B disagreement, and them versus us.
 
Here's a hint: if you buy 2 socks for 5 bucks and 4 socks for 10 bucks, the 5-buck option is not better value. BTW, you just said you would buy the 4070Ti, but HWU said he is leaning towards the 7900XT because it's better value, so you don't agree with him either. I think we are done here.
Sorry, I misspoke lol. I meant the 7900XT. I would never buy a 12GB card in 2024 because I play at 4k, and frankly the 4070ti was way too expensive when it released to be a 12GB card.
 
The 4070ti was a bad product so I’m not sure what you mean in your final paragraph. I wouldn’t recommend either of those GPUs but if I had to pick I would also go with the 4070ti. It had far too little VRAM for the price.

I'm probably biased because I have one, and I do acknowledge it was massively overpriced for what you are getting vs previous Nvidia offerings. But against the 7900XT? Nope, sorry, the 4070Ti is clearly the better choice at a similar price.

So many games have RT these days, and it will be faster in all of them (massively so in some cases). So unless you're going to spend $800 on a GPU but be happy to forgo the most advanced rendering features in most of today's biggest visual showpieces, of course you're going to want the GPU that will be faster with those rendering features enabled.

And that's before you even consider DLSS, which alone pretty much puts the 4070Ti into a whole different performance tier vs the 7900XT in any game that supports it (which these days is most of them). I know there was a period where we all pretended we should be comparing DLSS/FSR at matched presets for a fair performance comparison, but thanks in part to the many PSSR showcases (and PSSR itself so far seems to lag behind DLSS) demonstrating the enormous quality advantage of AI-based upscaling over FSR at the same input resolution, I hope by now people accept that comparing the two upscaling methods at matched presets is not at all a fair way of assessing the end-user experience.

Essentially, we should be looking at how the 4070Ti performs with DLSS Performance vs the 7900XT with FSR Quality in any game that supports both, for a more realistic view of which gives the better end-user experience.
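For concreteness, here is a minimal sketch of the internal render resolutions behind those two presets at a 4K output, using the standard published per-axis scale factors for DLSS and FSR (individual games can override them):

```python
# Internal render resolutions behind the Quality and Performance upscaler
# presets at a 4K output. The per-axis scale factors (2/3 and 1/2) are the
# standard published ratios for DLSS and FSR; games may override them.

OUTPUT_W, OUTPUT_H = 3840, 2160
PRESETS = {
    "Quality":     2 / 3,   # ~0.667x per axis -> 2560x1440 internal
    "Performance": 1 / 2,   # 0.5x per axis    -> 1920x1080 internal
}

for name, scale in PRESETS.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    print(f"{name:11s}: {w}x{h} internal")

# The comparison argued for above: DLSS Performance renders 1080p internally
# while FSR Quality renders 1440p, so matching presets understates the
# practical gap once image quality is held roughly constant.
```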

As to the 12GB of VRAM: yes, it's too low, and 16GB would have been much nicer. But the reality is it doesn't impact that many games - certainly nowhere near as many as the DLSS and RT advantages do.
 
There are more games today that shit the bed due to 12GB than games that absolutely require RT. DLSS is nice but using software features to paper over a glaring hardware deficiency makes this a bad product.

Like I said I’d buy neither, but the 4070ti was so bad they had to refresh it with 16GB. We’re talking about an $800 card here.
 
The "average PC gamer" has a 1080P60Hz monitor, can't tell the difference between DLSS Quality at 1080P and native 1080P, doesn't use an FPS counter, and has never watched a video from HUB, LTT, or DF in their life. They play their game, arbitrarily turn down the settings if it feels choppy, complain on forums or refund the game if it still feels choppy after turning the settings down, and turn them up if it's running smoothly and they're curious about what those settings do. If said "average gamer" has a 4060 or 3070, they can already get performance acceptable to them with ray tracing. DLSS may be necessary, but the average gamer doesn't complain about "fake pixels", "fake frames", or how "upscaling is a crutch" either. They just see DLSS as a switch that gives them free frames. Now the average PC gamer also has a card that is worse at ray tracing than a 4060 or 3070, but average PC gamers rarely buy single-player graphical showcases on release anyways.
 
The 4070ti was a bad product so I’m not sure what you mean in your final paragraph. I wouldn’t recommend either of those GPUs but if I had to pick I would also go with the 4070ti. It had far too little VRAM for the price.

There was nothing wrong with the 4070ti, and at the resolution it's meant for (1440p) the VRAM amount is fine and I never hit the limits in any game.
 
For me it's not that I think they don't like Nvidia; it's that they tell people to buy AMD GPUs, then use Nvidia in their own personal systems, then try to make excuses down the track when the hardware they recommended didn't pan out. Steve even put Nvidia hardware in his daughter's PC, predominantly for Fortnite, and when asked in a Q&A why, said something like "I just had the GPU sitting there" (it was a 3080Ti, I think). So he didn't have an RDNA GPU sitting around?

"RT sucks, upscaling sucks, less than 16GB of VRAM sucks, but I'm gonna use Nvidia while you should totally use AMD." That's how it comes across; money where your mouth is and all that.
So y'all are gonna keep making up lies about HUB being anti-ray tracing or anti-reconstruction? You guys genuinely go out of your way to ignore the more nuanced talking points HUB presents in favor of some dishonest, simple narrative that they're just haters or something. It's embarrassing how transparently purposeful the agenda is here.

And they use Nvidia for their own systems because Nvidia have the best GPUs, and they aren't bound by the more value-based considerations that most consumers find themselves with. That's really not complicated. They are in a more privileged position than most (and also have higher needs than most, doing both gaming and content production) and of course will take advantage of that. Most people watching are just ordinary consumers looking for information to get a decent idea of what is a good buy or not. HUB do a generally good job of this. You do not need to watch HUB or nearly anybody else if you're simply out for whatever performs the best, regardless of cost.

As for picking a GPU for his daughter, again, he was going by what was already around. Not what he'd have to go and buy himself. Also a very common sense explanation.
 
There was nothing wrong with the 4070ti, and at the resolution it's meant for (1440p) the VRAM amount is fine and I never hit the limits in any game.
An $800 card should not be constrained to 1440p, and I frequently see VRAM-related stutters when using DLSS Performance at 4K, aka 1080p.

If there was nothing wrong with it then why did Nvidia refresh it with more VRAM?
 
Yeah, once FSR and XeSS came out. Before then it was "screw RT and upscaling, you shouldn't need it, 4K native or go home". Unfortunately for them, eventually you do have to come to terms with reality.
This is just straight-up shameless lying. Insanely ironic coming from somebody asking anyone else to come to terms with reality. Absolute gaslighting here.
 
An $800 card should not be constrained to 1440p.

Costing $800 doesn't make a 1440p GPU suddenly become a good 4k GPU.

and I frequently see VRAM-related stutters when using DLSS Performance at 4K, aka 1080p.

Without actually studying the frame it's impossible to say it's the VRAM.

DLSS itself can cause issues.

I had a 4k monitor towards the end of my PC life and never had issues with my 4070ti when targeting a locked 60fps.

If there was nothing wrong with it then why did Nvidia refresh it with more VRAM?

The same reason why Nvidia released the GTX 260 core 216 or GTX 465 back in the day.
 
Costing $800 doesn't make a 1440p GPU suddenly become a good 4k GPU.

Without actually studying the frame it's impossible to say it's the VRAM.

DLSS itself can cause issues.

I had a 4k monitor towards the end of my PC life and never had issues with my 4070ti when targeting a locked 60fps.

Because AMD kept dropping prices, it's not rocket science.

It was the same reason why Nvidia released the GTX 260 core 216.
No, it’s a horrible 4k GPU, that’s my point. If you want to sell a 12GB 1440p card that’s fine but if you price it at $800 I’m going to bitch and moan about it.

AMD is not all that relevant for PC gaming, I highly doubt the Super lineup was a response to anything in RDNA3. Besides, if they just wanted to compete on price they’d just drop prices, but they refreshed it with VRAM because it was a common complaint that this GPU is held back by its VRAM.
 
No, it’s a horrible 4k GPU, that’s my point. If you want to sell a 12GB 1440p card that’s fine but if you price it at $800 I’m going to bitch and moan about it.
In which games does 7900XT perform better than 4070Ti in 4K with RT?

Something can only be "horrible" if there is a better alternative.
 
In which games does 7900XT perform better than 4070Ti in 4K with RT?

Something can only be "horrible" if there is a better alternative.
None as far as I know, but most games aren’t RT.

For raster I can think of a few titles where 12GB isn’t enough, particularly at 4k output (even with DLSS).
 
No, it’s a horrible 4k GPU, that’s my point.

So was the GTX 1060.

If you want to sell a 12GB 1440p card that’s fine but if you price it at $800 I’m going to bitch and moan about it.

You're not bitching about the price of a 1440p tier GPU.

You're bitching about how a 1440p tier GPU performs at 4k.

AMD is not all that relevant for PC gaming, I highly doubt the Super lineup was a response to anything in RDNA3.

ATI/AMD and Nvidia have a long history of releasing new SKUs to combat each other.

It's nothing new.

Besides, if they just wanted to compete on price they’d just drop prices,

It's not that simple.

But they refreshed it with VRAM because it was a common complaint that this GPU is held back by its VRAM.

No, they wanted more performance for the 4070Ti Super, so they used the 4080 die, which has a larger bus and more cores; with that larger bus they had to release it with either 8GB or 16GB of VRAM.
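For reference, a minimal sketch of the arithmetic behind that point. The 192-bit and 256-bit bus widths are the commonly reported specs for the AD104-based 4070Ti and the AD103-based 4070Ti Super, assumed here for illustration rather than taken from this thread.

```python
# Why bus width dictates the VRAM options: each 32-bit memory channel takes
# one GDDR6X chip, currently 1GB or 2GB per chip. Bus widths below are the
# commonly reported specs, assumed for illustration.

def vram_options(bus_width_bits: int, chip_sizes_gb=(1, 2)) -> dict:
    channels = bus_width_bits // 32
    return {f"{size}GB chips": channels * size for size in chip_sizes_gb}

print("4070Ti (192-bit):      ", vram_options(192))  # {'1GB chips': 6, '2GB chips': 12}
print("4070Ti Super (256-bit):", vram_options(256))  # {'1GB chips': 8, '2GB chips': 16}
```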

What would you have them do? Release it with 8GB VRAM?

It's also worth remembering that the 4070ti was discontinued with the release of the Super version.
 
There are more games today that shit the bed due to 12GB than games that absolutely require RT.

Why are you drawing equivalence between games that require RT to run, and games where you can turn settings up high enough to cause problems for a 12GB card? There's no equivalence there at all.

The number of games that "absolutely require 12GB VRAM" is precisely zero. Less than the number of games that "absolutely require RT".

There are probably about 5 games out there that will run into serious issues on the 4070Ti due to its VRAM at otherwise playable settings, where the 7900XT would not run into the same issues due to its larger frame buffer. And pretty much all of those can be resolved by turning textures from Ultra to High.

There are dozens of games which would run into serious issues on a 7900XT at settings which the 4070Ti would have no issue with due to its superior RT performance.

DLSS is nice but using software features to paper over a glaring hardware deficiency makes this a bad product.

DLSS is a combination of hardware and software but that's entirely irrelevant to the end user experience. With a 4070Ti, you have it. With a 7900XT you don't. That absolutely needs to be factored into the value proposition.

Like I said I’d buy neither, but the 4070ti was so bad they had to refresh it with 16GB. We’re talking about an $800 card here.

I already agreed the price was too high for what you were getting compared to previous generations. The argument is that the 7900XT offered even less value when accounting for its lack of AI upscaling and performant RT.
 
For raster I can think of a few titles where 12GB isn’t enough, particularly at 4k output (even with DLSS).
Could you list them, please? From my day-to-day tracking of game benchmarks I can think of maybe a couple of such titles, and those got patched later to solve this particular issue.
 
So was the GTX 1060.

You're not bitching about the price of a 1440p tier GPU.

You're bitching about how a 1440p tier GPU performs at 4k.

ATI/AMD and Nvidia have a long history of releasing new SKUs to combat each other.

It's nothing new.

It's not that simple.

No, they wanted more performance for the 4070Ti Super, so they used the 4080 die, which has a larger bus and more cores; with that larger bus they had to release it with either 8GB or 16GB of VRAM.

What would you have them do? Release it with 8GB VRAM?

It's also worth remembering that the 4070ti was discontinued with the release of the Super version.
How much did the GTX 1060 cost?

I am bitching that a 1440p card costs $800. Either make it appropriate for 4K or charge less; until then I will bitch lol.

I would love to hear why it’s ’not that simple’ for Nvidia to drop prices when they did literally just that for the 4070, and AMD does it with almost all of their releases lol.
Could you list them, please? From my day-to-day tracking of game benchmarks I can think of maybe a couple of such titles, and those got patched later to solve this particular issue.
I play a lot of Call of Duty and I've run into many scenarios where it stutters until I turn down the texture resolution. I'm using DLSS Performance, so 1080p -> 4K.

I’ve had issues with Spider-Man as well, particularly the more RT I turn on.

Friend of mine has had tons of issues with FH5.

There are also some niche flight sims that use more than 12GB even at 1440p (even 1080p!), but I won't count that against them because they're so niche.
 