Value of Hardware Unboxed benchmarking *spawn

But they (Remedy) didn't make the switch; AW2 works without mesh shaders too, despite the initial wrong information.

I mean, you get a warning when the game starts. The legacy pipeline is still in the engine. They didn't remove it. The initial information from a dev was that the legacy pipeline was abandoned because it performed really poorly, which seems to be the case. The dev wasn't sure if the pipeline was still in the game. So the dev was basically correct.

But I think devs are basically waiting to fully switch over to mesh shaders rather than supporting two pipelines. I am by no means an expert, but they have very different characteristics and it's probably not viewed as being worth the effort to support both. Otherwise we'd have more games supporting mesh shaders already.
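For anyone curious what that "supporting two pipelines" decision looks like on the D3D12 side, here's a minimal sketch of the capability check an engine might run at startup. Purely illustrative, not Remedy's code: the function name and the warning text are made up, but the feature query itself is the standard D3D12 one.

```cpp
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

// Illustrative only: query whether the device supports mesh shaders
// (D3D12 feature options 7) and pick a geometry path accordingly.
// The function name and printed warning are hypothetical, not engine code.
bool UseMeshShaderPath(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    const bool queried = SUCCEEDED(device->CheckFeatureSupport(
        D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7)));

    if (queried && opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1)
        return true;  // Turing/RDNA2 and newer: amplification + mesh shader pipeline

    // Older hardware (e.g. RDNA1) lands here: warn and fall back to the
    // legacy vertex/primitive pipeline, or refuse to run if it was removed.
    std::printf("Mesh shaders not supported; using legacy geometry pipeline\n");
    return false;
}
```

That's the whole fork: everything downstream (culling, geometry submission, PSO setup) then has to exist twice, which is presumably why most studios would rather wait and ship only the mesh shader path.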
 

Your argument falls apart when discussing alternatives priced way down due to market forces. The Arc A770 is priced way lower than it should be, so should I now compare it to the RTX 3050 because they are similar in price?

Of course you should, what else? You don't pay what you think card X should cost based on reason Y; you pay what the manufacturer (and by extension all the steps to retail) puts on the sticker. That's also the reason why some cards can offer notably better value than others.
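Just to spell out what "value" means in that framing: fps divided by the sticker price, nothing more. A toy example of the arithmetic, with placeholder numbers rather than real benchmark or pricing data:

```cpp
#include <cstdio>

// Toy cost-per-frame comparison. The fps and price numbers are placeholders
// purely to show the arithmetic, not real benchmark or pricing data.
struct Card { const char* name; double avgFps; double streetPrice; };

int main()
{
    const Card cards[] = {
        { "Card A", 60.0, 300.0 },  // hypothetical numbers
        { "Card B", 50.0, 220.0 },  // hypothetical numbers
    };
    for (const Card& c : cards)
        std::printf("%s: %.3f fps per dollar\n", c.name, c.avgFps / c.streetPrice);
    // The ranking follows the actual sticker price, not what the card
    // "should" cost based on die size, segment, or anything else.
    return 0;
}
```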
 
It's one game now, but who knows how many games a year from now. It's also a heavy game where this difference isn't very important, but what if the next one is a lighter game where the 2060S would be capable of outputting >60 fps?
So we are talking five years after the GPUs launched and there are now two games? Yeah, still next to zero significance. The further we get from launch, the less relevant they become.

I also fail to see how the performance of the products right now suddenly doesn't matter when judging whether someone was right to recommend one product over the other. HUB's typical recommendation, which they repeat in every GPU benchmark, is to get a GPU with more VRAM because this will be beneficial in the future. How is that different from getting a GPU with mesh shaders because that will be beneficial in the future?
If someone made the bet four years ago that extra VRAM would turn out to have a larger impact in games than mesh shaders, I'd say they were right. I'm sure we disagree here, though.

Yes, it is funny how people can't see that the same logic should lead to similar conclusions when they are blinded by brand allegiances.
Not sure whose brand allegiance you are talking about here.
 
So we are talking five years after the GPUs launched and there are now two games? Yeah, still next to zero significance.
We are three years since 8GB Ampere being "bad" and there are like half a dozen games out of several hundred with VRAM-related issues. Is that also next to zero significance then?

If someone made the bet four years ago that extra VRAM would turn out to have a larger impact in games than mesh shaders, I'd say they were right. I'm sure we disagree here, though.
But it didn't. You can run any game on any GPU from 4 years ago without VRAM issues. Just dial down settings. No such option with mesh shaders, so the impact here is a lot more severe.
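That's the crux of it: VRAM pressure is something an engine can measure and scale settings against at runtime, while a missing mesh shader tier is a hard capability gap with no slider. A minimal sketch of the standard DXGI budget query, with a hypothetical "drop texture quality" reaction on the engine side:

```cpp
#include <windows.h>
#include <dxgi1_4.h>
#include <cstdio>

// Sketch: read the local (on-board) VRAM budget and current usage for an
// adapter. An engine can react by shrinking texture pools or mip quality;
// there is no equivalent knob for a GPU that simply lacks mesh shader support.
void ReportVramBudget(IDXGIAdapter3* adapter)
{
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (SUCCEEDED(adapter->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
    {
        std::printf("VRAM budget: %llu MB, in use: %llu MB\n",
                    static_cast<unsigned long long>(info.Budget >> 20),
                    static_cast<unsigned long long>(info.CurrentUsage >> 20));
        if (info.CurrentUsage > info.Budget)
        {
            // Hypothetical reaction: stream in lower texture mips, shrink pools.
        }
    }
}
```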
 
They literally stuck themselves into a corner in the conclusion of the video, resorting to mind-bending justifications as lame as "the 2060 Super can't play the game either!", or "DLSS doesn't matter at 1080p", or "DLSS in general wasn't a good advantage for the 2060 Super over its lifespan!"

DLSS alone should have been more than enough for them to change their recommendation, but they never did. And now their RDNA1 pick has no DLSS, no RT, and can't play modern mesh-shader-based titles. How many lame excuses will they give before they admit they were wrong?!

Again, no. The latest example of them downplaying the DLSS advantage is loud and clear here. They still refuse to say they were wrong to claim the 5700XT was better than the 2070 Super despite its lack of DLSS, and they still refuse to do it in Alan Wake 2, claiming that DLSS at 1080p is useless even though it's vastly better than FSR.

They couldn't have implicated themselves more, with all of their contradicting logic and mental gymnastics.


Disagree, NVIDIA cards command a premium over AMD GPUs, especially since the introduction of RTX, on the basis of offering more features and vastly more RT performance.
They didn't 'stick themselves into a corner'. It's late 2023. The battles of the 2018/2019 GPUs are hardly that relevant today, and don't take away from the contextually sensible takes of the time. They have no need to 'admit they were wrong' about what was a perfectly reasonable take back then.

And pointing out that a 2060S is still not performing well in this game is entirely reasonable. Because it's true - it isn't. Hardly just this game, either. Nvidia has a tendency to neglect anything more than two generations old when it comes to driver support in newer games. Also, the idea that reconstruction, even DLSS, doesn't work as well at low resolution targets like 1080p is not new. But of course you're going to rewrite history on this well-agreed-upon notion, because it's convenient to your agenda here.

Your obsession with trying to bash on HUB is, as I've said several times by now, completely embarrassing. I've got no bone in this fight, either. I've called out HUB very specifically in direct conversation with them on Reddit before on things I disagree with. But I'm not 'out to get them'. People like you are genuinely locked in on trying to crap on them no matter how desperate your arguments have to get to do it. Get a grip.
 
They didn't 'stick themselves into a corner'. It's late 2023. The battles of the 2018/2019 GPUs are hardly that relevant today, and don't take away from the contextually sensible takes of the time. They have no need to 'admit they were wrong' about what was a perfectly reasonable take back then.

And pointing out that a 2060S is still not performing well in this game is entirely reasonable. Because it's true - it isn't. Also, the idea that reconstruction, even DLSS, doesn't work as well at low resolution targets like 1080p is not new. But of course you're going to rewrite history on this well-agreed-upon notion, because it's convenient to your agenda here.

Your obsession with trying to bash on HUB is, as I've said several times by now, completely embarrassing. I've got no bone in this fight, either. I've called out HUB very specifically in direct conversation with them on Reddit before. People like you are genuinely locked in on trying to crap on them no matter how desperate your arguments have to get to do it. Get a grip.

I'd be interested in seeing post processing low. They basically ruled it out subjectively, saying it looks too bad to recommend, but it would have been nice to see the numbers so people could make their own judgement. Budget gamers are used to making big compromises. Post processing low seems to offer very big performance improvements, so I think with DLSS Quality at 1080p it might hit 60 fps, and DLSS will handle the shader aliasing much better than FSR2 does.
 
I'd be interested in seeing post processing low. They basically ruled it out subjectively, saying it looks too bad to recommend, but it would have been nice to see the numbers so people could make their own judgement. Budget gamers are used to making big compromises. Post processing low seems to offer very big performance improvements, so I think with DLSS Quality at 1080p it might hit 60 fps, and DLSS will handle the shader aliasing much better than FSR2 does.

This is a regular 2060, not 2060S, but you can still do a little mental adjusting to realize it's not going to hit 60fps.
 
We are three years since 8GB Ampere being "bad" and there are like half a dozen games out of several hundred with VRAM-related issues. Is that also next to zero significance then?
Regarding VRAM capacity, the GPUs in question were affected by the limitation sooner after launch (about two years) and more games had issues. Mesh shaders: one game, more than four years after launch. Not equal significance in my book.

To me these are two different issues that aren't directly, objectively 1:1 comparable. Do we even agree on this point?
 

This is a regular 2060, not 2060S, but you can still do a little mental adjusting to realize it's not going to hit 60fps.

I don't like this guy, but I'll make an exception and check his video out.

Edit:
Okay, so he's getting maybe 45 fps at the recommended settings with DLSS, but post processing is HIGH because he's using the Low preset. That means all post processing is done at 1080p instead of the internal resolution. I think it's supposed to be one of the heaviest settings in the game. That's why if you keep turning the DLSS/FSR quality down you don't necessarily get the gains you'd expect: with post processing on high, the post processing is still done at the native output resolution. Now maybe the internal resolution of 720p would just be too ugly with post processing low, but as a budget gamer what else are you going to do?
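Rough numbers on why that happens, if it helps. The per-axis scale factors below are the commonly published DLSS ones, not measurements from this game; the point is just that the internal pixel count shrinks with the mode while any post processing pinned to the 1080p output stays the same cost:

```cpp
#include <cstdio>

// Illustration: at a 1080p output, lowering the DLSS/FSR mode shrinks the
// internal render resolution, but post processing that runs at the output
// resolution does not get any cheaper. Scale factors are the commonly
// published per-axis ratios; they are not measurements from this game.
int main()
{
    const int outW = 1920, outH = 1080;
    const struct Mode { const char* name; double scale; } modes[] = {
        { "Quality",     0.667 },
        { "Balanced",    0.58  },
        { "Performance", 0.50  },
    };
    for (const Mode& m : modes)
    {
        const int w = static_cast<int>(outW * m.scale + 0.5);
        const int h = static_cast<int>(outH * m.scale + 0.5);
        const double share = 100.0 * (double(w) * h) / (double(outW) * outH);
        // The main render shrinks roughly with scale^2; the output-resolution
        // post pass does not shrink at all.
        std::printf("%-11s internal %4dx%-4d (%5.1f%% of output pixels), post still 1920x1080\n",
                    m.name, w, h, share);
    }
    return 0;
}
```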
 
I don't like this guy, but I'll make an exception and check his video out.

Edit:
Okay, so he's getting maybe 45 fps at the recommended settings with DLSS, but post processing is HIGH because he's using the Low preset. That means all post processing is done at 1080p instead of the internal resolution. I think it's supposed to be one of the heaviest settings in the game. That's why if you keep turning the DLSS/FSR quality down you don't necessarily get the gains you'd expect: with post processing on high, the post processing is still done at the native output resolution. Now maybe the internal resolution of 720p would just be too ugly with post processing low, but as a budget gamer what else are you going to do?
I think it demonstrates well enough that a 2060/2060S ain't gonna do it. It's always possible to find some discrepancy and propose some plausible situation in which everything could be different, but the most obvious and realistic take here is that the game is too demanding for such 'low-level' GPUs at this point. I'll agree 100% that it's playable, but I also know that PC gamers are generally loath to consider anything less than a reasonably solid 60 fps 'playable', so I think we need to keep that kind of standard in mind.

I think if it weren't useful to bash the 5700XT and HUB here, the people with a clear agenda would be very harsh on the 2060/2060S performance, rather than trying to act like it's perfectly fine.
 
I think it demonstrates well enough that a 2060/2060S ain't gonna do it. It's always possible to find some discrepancy and propose some plausible situation in which everything could be different, but the most obvious and realistic take here is that the game is too demanding for such 'low-level' GPUs at this point. I'll agree 100% that it's playable, but I also know that PC gamers are generally loath to consider anything less than a reasonably solid 60 fps 'playable', so I think we need to keep that kind of standard in mind.

I think if it weren't useful to bash the 5700XT and HUB here, the people with a clear agenda would be very harsh on the 2060/2060S performance, rather than trying to act like it's perfectly fine.

I have the game so I'll do some fps comparisons with the low preset and everything actually turned as low as possible. Mind you I have a 3080, but we'll see how big the gain is.
 
I think it demonstrates well enough that a 2060/2060S ain't gonna do it.
2060/S should be able to run the game at PS5 level of performance. 5700/XT won't.

Regarding VRAM capacity, the GPUs in question were affected by the limitation sooner after launch (about two years) and more games had issues. Mesh shaders: one game, more than four years after launch. Not equal significance in my book.
"Issues" with VRAM were and are solvable by adjusting settings. Good luck doing the same with mesh shaders.
 
"Issues" with VRAM were and are solvable by adjusting settings.
Will they still be solvable five years after launch (2025)? And how many games will be affected, and how severe will the compromises in graphical fidelity be then?

PS. Five years is a bit of a long timeframe imo; I expect poor performance and support at that point. But it's the timeframe you used earlier in your hypothetical, so I think the question is valid.
 