Value of Hardware Unboxed benchmarking *spawn

Has anyone here been put off by frame gen's added latency? I can't recall anyone saying they've been personally bothered by it. I understand the numbers, but my subjective experience in The Witcher 3 is that I couldn't notice any difference in input lag with FG, while I could definitely notice the increased fluidity. This was with a base framerate typically between 40 and 60 fps.

I also never noticed any FG-related artifacting, and I am an artifact connoisseur. I can see it in YouTube videos when they zoom in and slow it down, so I know roughly what to look for. But I'll be damned if I could ever spot it while actually playing the game, even when keeping an eye out for it.
 
It's been my fairly long-term observation that most people, especially PC gamers, overestimate their sensitivity to input lag. I also think most gamers don't understand how much inherent input latency there is in most games to begin with, simply from the rendering pipeline, before we even start talking about hardware. Outside of competitive titles and the like, it's pretty typical for AAA games to be running with 70, 80, even 100ms+ of inherent latency at something like 60fps.

As for frame gen, results do seem to vary. In some games Reflex seems to claw back a good chunk of the added latency, and in others it doesn't, so that can affect things. But yes, in general I think most people can easily stomach, or likely not even notice, a 15-20ms difference, especially in the more demanding, typically single-player games where frame gen will see the most use. It's something I expect most people will just turn on and subconsciously adjust to fairly quickly.
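To put rough numbers on that, here's a toy latency model. The stage counts are my own illustrative guesses for a GPU-bound AAA game, not measurements of any real title, but they show why a 15-20ms frame gen penalty tends to disappear into the baseline:

```cpp
// Toy end-to-end latency model at 60 fps. Stage counts are illustrative
// guesses for a GPU-bound AAA game, not measurements of any real title.
#include <cstdio>

int main() {
    const double frame_ms = 1000.0 / 60.0;  // ~16.7 ms per frame

    // Rough pipeline stages, expressed in frames of delay:
    const double input_sampling = 0.5;  // input lands mid-frame on average
    const double simulation     = 1.0;  // game logic runs one frame ahead
    const double render_queue   = 2.0;  // CPU-to-GPU buffering when GPU-bound
    const double gpu_render     = 1.0;  // GPU spends a frame drawing it
    const double scanout        = 1.0;  // display scans the image out

    const double base = (input_sampling + simulation + render_queue +
                         gpu_render + scanout) * frame_ms;
    printf("Baseline:           ~%.0f ms\n", base);       // ~92 ms

    // Frame gen holds back one real frame to interpolate between two.
    const double with_fg = base + 1.0 * frame_ms;
    printf("With frame gen:     ~%.0f ms\n", with_fg);    // ~108 ms

    // Reflex mostly drains the render queue, clawing much of that back.
    const double fg_reflex = with_fg - render_queue * frame_ms;
    printf("Frame gen + Reflex: ~%.0f ms\n", fg_reflex);  // ~75 ms
    return 0;
}
```

The exact stage counts vary per game, but the shape holds: a ~17ms FG penalty lands on top of an 80-100ms baseline, and in games that queue deeply, Reflex can more than pay for it.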
 
I expected to be able to notice some kind of difference, even in a third-person game like The Witcher 3. Laggy mouse input is extremely annoying to me. Bethesda games used to (maybe still do, I dunno) have terrible mouse lag when vsync was turned on. I found that if you set an FPS cap in D3DOverrider at 1 fps below the vsync cap, the mouse lag would go away for some reason. Point is, I've gone to great lengths to minimize mouse lag in many games. And yet with DLSS frame gen I can't even tell it's on, aside from the massively higher frame rate. This was surprising to me and I wonder if I'm an outlier. Or possibly it's a sign of me getting old :LOL:
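For anyone wondering why that cap trick works (this is my understanding, not anything D3DOverrider documents): with vsync on and no cap, the CPU races ahead and fills the render queue while the GPU waits on vblank, and every queued frame is another ~16ms of mouse lag. Capping just below the refresh rate keeps that queue empty. A minimal sketch of the idea:

```cpp
// Minimal frame-pacing sketch: cap slightly below the refresh rate so the
// render queue never fills while waiting on vsync. Real tools (D3DOverrider,
// RTSS) pace far more precisely than a sleep-based loop like this.
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;
    const double refresh_hz = 60.0;
    // The "1 fps below vsync" trick described above:
    const std::chrono::duration<double> target(1.0 / (refresh_hz - 1.0));

    auto next = clock::now();
    for (int frame = 0; frame < 600; ++frame) {  // ~10 seconds of frames
        // poll_input(); simulate(); render(); present();  // the game's work

        // Sleep until the next frame slot so we never outrun the display.
        next += std::chrono::duration_cast<clock::duration>(target);
        std::this_thread::sleep_until(next);
    }
    return 0;
}
```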
 
I alternate between a 144Hz VRR local setup and a 60Hz VRR streaming setup, and input lag has never been an issue on the streaming machine. Streaming over Ethernet probably adds less latency than frame gen, though.
 
Feb 9, 2024

00:00 - Intro
01:12 - AMD Ryzen 7 5700 Recap
13:57 - Survey Results
14:53 - GPU Market Share
19:07 - How Much Did Your GPU Cost?
21:48 - How Often Do You Upgrade?
28:23 - Rasterization vs Ray Tracing Importance
33:57 - How Much is Too Much for 8GB of VRAM?
43:32 - Buying Same Brand Again vs Switching Brands
54:55 - What GPU Features are Important
1:08:43 - Updates From Our Boring Lives

Survey results based on 2000 respondents.
 
I can't believe it.


Instead of benchmarking recent, interesting games like Alan Wake 2, Avatar, Horizon and Ratchet, they just showed video of how the 5700XT did in those titles without actually benchmarking them, and benchmarked the old cross-gen games again, which we know don't use the modern features RDNA1 lacks.

By using that method, the 5700XT can be shown in a good light, because it doesn't allow a comparison against GPUs with modern feature support like Turing, Ampere, RDNA2 and RDNA3. We know how badly the 5700XT performs relative to those architectures in modern games like Alan Wake 2 and Avatar.

This guy really gets on my nerves sometimes. I really wish someone would call out their modus operandi.
 
I really wish someone would call out their modus operandi.
I really wish people would realize not every review has to cater to your specific taste. If people didn't like their stuff, they wouldn't get views.
 
Instead of benchmarking recent, interesting games like Alan Wake 2, Avatar, Horizon and Ratchet, they just showed video of how the 5700XT did in those titles without actually benchmarking them.
A video showing RDNA1 lacking mesh shader support isn't interesting; we already know how poorly those games perform. It seems more realistic and relevant to test games that are actually appropriate for the card instead of what are essentially tech demos in game format that specifically exploit features RDNA1 lacks (there's a quick support-check sketch at the end of this post).

Plus, as you say, he does include those titles, just not on the chart.

HUB makes videos for their audience, most of whom do not care much about ray tracing, at least according to some of the polling data I've seen them gather from subscribers. Other channels focus almost exclusively on bleeding-edge graphical technology, i.e. DF. Neither approach is wrong, it's just different, and not every reviewer will be to your liking.
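As an aside on the mesh shader point: whether a GPU supports them at all is a single feature-tier query in D3D12, which is how games and tools detect it. A minimal sketch, assuming a recent Windows SDK and with error handling omitted; RDNA1 reports TIER_NOT_SUPPORTED, while Turing, Ampere and RDNA2/3 report TIER_1:

```cpp
// Query D3D12 mesh shader support on the default adapter.
// Error handling omitted for brevity; requires a Windows SDK that
// defines D3D12_FEATURE_D3D12_OPTIONS7 (20H1 / 19041 or newer).
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device* device = nullptr;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                      IID_PPV_ARGS(&device));  // nullptr = default adapter

    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                &opts7, sizeof(opts7));

    printf("Mesh shaders: %s\n",
           opts7.MeshShaderTier == D3D12_MESH_SHADER_TIER_NOT_SUPPORTED
               ? "not supported"
               : "supported");
    device->Release();
    return 0;
}
```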
 
Instead of benchmarking recent, interesting games like Alan Wake 2, Avatar, Horizon and Ratchet, they just showed video of how the 5700XT did in those titles without actually benchmarking them.

Maybe that is because they don't have comparable-settings data for other cards in those games.

We know how badly the 5700XT performs relative to those architectures in modern games like Alan Wake 2 and Avatar.

Do we? They spoke of recent patches improving Alan Wake 2 on that card.
 
I really wish someone would call out their modus operandi.
There is none. The only people with an agenda are people like you, who try to tarnish them with nonsense allegations because they aren't covering things in the more biased way you'd prefer.

They covered those games you mention to demonstrate them in a plausible, usable way for such an old part. That is entirely reasonable.

This is becoming straight up shameful at this point with some of y'all. The sort of stuff I expect to see from clowns in the Youtube comment sections, not on an enthusiast tech forum. smh
 
HUB makes videos for their audience, most of whom do not care much about ray tracing, at least according to some of the polling data I've seen them gather from subscribers.
That's not even it here. In the video they straight up say this card is rubbish if you care about ray tracing. Which should go without saying. Spending a bunch of time going over how a card without ray tracing acceleration is bad at ray tracing is a waste of everybody's time.
 
There is none. The only people with an agenda are people like you, who try to tarnish them with nonsense allegations because they aren't covering things in the more biased way you'd prefer.

I am the one with the agenda? Then tell me: why didn't the channel benchmark the most recent current-gen-only titles in a video about how the 5700XT performs today? Why does Steve feel the need to constantly tell people how much lower-end cards suck at ray tracing, despite there being a ton of examples where even a 2060 can hit 1080p at 60 FPS with ray tracing on? Is it so far-fetched to believe that they reached a significant audience in the days of the 5700XT and don't want to admit to them that they made a short-sighted purchase recommendation?

You clearly haven't been following them long enough. You have a very naive idea of this channel.

Let me just tell you that I've tried my very best to have civil discussions on Twitter with Steve about different topics. Yes, I did speak to him directly, just as we do here in this very forum. Instead of being civil, that guy was incredibly unprofessional, called me names and straight up insulted me, without me resorting to personal attacks even once.

There's a valid reason why I'm so wary of them. So I would appreciate it if you could refrain from twisting my words in a way that makes it look like I'm running a YouTube clown show.
 
Yep that's pretty much it.

Even if that's true, nobody cares today, so it's weird that they would feel the need to validate such an old recommendation.

Steve has been open about his disdain for image quality and advanced graphics, since apparently he's a hardcore competitive multiplayer aficionado who prefers exorbitant fps. HUB's content isn't exactly tailored to that crowd though, so it's really unclear who this audience is that people keep referring to.
 
Even if that's true, nobody cares today, so it's weird that they would feel the need to validate such an old recommendation.
While true, their critics and haters have successfully weaponized this against them, as the 5700XT lacks ray tracing, mesh shaders, DLSS, DLAA, DLDSR, ray reconstruction, Reflex and a myriad of other new features like RTX HDR, RTX Remix, RTX Video and Chat with RTX.

Pretty much any time NVIDIA comes out with a new feature that works on Turing, HUB gets reminded in harsh words that his 5700XT pick was and still is a dead duck.

HUB's content isn't exactly tailored to that crowd though, so it's really unclear who this audience is that people keep referring to.
It's just an excuse to cover the real reason: HUB has to be biased in favor of AMD to satisfy his real audience, the AMD crowd.

Daniel Owen, one of the more balanced techtubers, recently admitted this in a personal video: videos that are positive about AMD and negative about NVIDIA get more clicks and revenue than anything else. So many techtubers factor that in when they make videos to increase their revenue; HUB is just one of the biggest doing this, if not the biggest.

 
Let me just tell you that I've tried my very best to have civil discussions on Twitter with Steve about different topics. Yes, I did speak to him directly, just as we do here in this very forum. Instead of being civil, that guy was incredibly unprofessional, called me names and straight up insulted me, without me resorting to personal attacks even once.
Link to those tweets? It certainly seems like you just view everything through an IHV war lens.
 
It's just an excuse to cover the real reason: HUB has to be biased in favor of AMD to satisfy his real audience, the AMD crowd.

Just because an incentive exists does not mean people act on it. Claiming so without proof exposes you to similar accusations.
 