Value of Hardware Unboxed benchmarking *spawn

In the most recent titles, console-level RT is sadly no longer possible on the 2060. I have a 2060 and it lacks the VRAM and horsepower to compete. Ratchet and Clank, for example, does not run with high textures and ray-traced reflections even at DLSS Performance (720p upscaled to 1440p) and low resolutions; I can't even get a stable 30 FPS on Torren IV. Meanwhile the PS5 runs 1440p upscaled to 4K at 60 FPS with ray tracing enabled, and also uses higher settings in general, while I'm running settings below the PS5's approximate performance equivalents. Even without RT, the PS5 performs much better: I can't even get 60 FPS at DLSS Performance at 1080p and medium settings with RT off... I've also seen how the 2060 performs in Alan Wake 2; it can't even dream of reaching 60 FPS at lower settings, even though the consoles have a performance mode. This is the reality @pjbliverpool @DavidGraham.

I was always satisfied with my 2060, but Ratchet completely destroys it regardless of what I try. And according to @Dictator, Nixxes' ports are very well optimized, so it's not the game.

I certainly won't cheap out on VRAM next gen. I will wait for Nvidia's unified-memory SoC next year and get at least 64 GB of unified memory. I also won't fall into Jensen's trap of 8 GB 5070 laptops; that is planned obsolescence, nothing more. I would rather keep running my current laptop for years than buy that. Next-gen consoles will have at least 32 GB of unified memory, so even cards with sufficient VRAM now, like the 5080, might struggle with just 16 GB when the PS6 releases in 2026.

I value Hardware Unboxed a lot more now because they tell it like it is with cards that have low amounts of VRAM: they are simply not good products and should never be bought.
 

Well, I did say a "roughly console level experience" ;) and that would depend on the level of RT implemented in the game. If we're looking at very light RT that's been tailored to the capabilities of the consoles, then the consoles' fairly significant raster advantage could give them the lead. That would be especially true in a Sony console exclusive like Ratchet that is later ported to PC, which will generally perform relatively better on the console on account of it being the lead/target platform. Granted, an exact match for the console on all levels isn't always to be expected thanks to the much smaller VRAM pool on the 2060, but setting textures to medium, or even low, doesn't make a game unviable on that GPU.

In terms of Alan Wake 2, that doesn't use RT on the consoles, so they would certainly be faster than the 2060 at equivalent settings.

Also your laptop 2060 is quite a bit slower than the desktop variant that I was discussing in my initial post.
 
This is medium settings with low textures at 1080p with DLSS Performance. No chance of getting 60 FPS. The desktop 2060 is around 15-20% faster, and even then it's far away from 60 FPS. Medium textures look horrible by the way; some textures look straight out of a GameCube game. I would consider that unplayable.

Screenshot 2024-11-21 143151.png


Also yes, while the PS5 has much better raster performance, remember I'm using a much lower render resolution here. The game performs catastrophically on that planet relative to the console.

As for Alan Wake 2, remember I'm always taking the raw performance difference into account by running a much lower rendering resolution. I haven't tested it myself yet, but in videos 60 FPS was not even possible with DLSS Performance at 1080p (which is an internal resolution of just 540p).
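For reference, DLSS Performance mode renders at roughly half the output resolution per axis, which is where the 720p and 540p figures above come from. A quick sketch; the scale factors are just the commonly quoted per-axis ratios, not anything measured here:

```python
# Approximate internal render resolution per DLSS preset.
# Scale factors are the commonly cited per-axis ratios; individual games may differ.
DLSS_SCALE = {
    "Quality": 2 / 3,        # e.g. 4K output -> ~1440p internal
    "Balanced": 0.58,
    "Performance": 0.5,      # e.g. 1080p output -> 540p, 1440p -> 720p
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the approximate internal render resolution for a DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)
print(internal_resolution(2560, 1440, "Performance"))  # (1280, 720)
```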
 
This is medium settings with low textures at 1080p with DLSS Performance. No chance of getting 60 FPS. The desktop 2060 is around 15-20% faster, and even then it's far away from 60 FPS.

Again, 60fps is not the minimum criterion for a "viable gaming experience". The PS5's 30fps quality mode in this game should be ample proof of that, along with almost every console game of the past two generations.

Medium textures look horrible by the way; some textures look straight out of a GameCube game. I would consider that unplayable.

I'm sorry, but I don't agree with that at all. This is the same scene with medium textures. Claiming this to be unplayable on the grounds of poor graphics is at best a highly niche opinion. Remember, this is a 6-year-old lower-midrange GPU with 6 GB of VRAM. Its users should not be expecting a high-end-like experience with everything maxed out.

Ratchet.jpg

Also yes, while the PS5 has much better raster performance, remember I'm using a much lower render resolution here. The game performs catastrophically on that planet relative to the console.

But what relevance does that have to whether the game is viable on a 2060 with RT enabled or not? And again, this one game (a Sony exclusive with very highly optimised console RT) is hardly representative of the wider population of RT-enabled games, which would perform relatively much better on the 2060. You're basically using a best-case scenario on the console side as evidence of a general performance differential.

As for Alan Wake 2, remember I'm always taking the raw performance difference into account by running a much lower rendering resolution. I haven't tested it myself yet, but in videos 60 FPS was not even possible with DLSS Performance at 1080p (which is an internal resolution of just 540p).

Alan Wake 2 isn't relevant to the comparison since it features no RT on the base consoles. Those consoles should therefore be expected to be much faster than the 2060. That said, the base resolution of the PS5's 60fps mode in this game (which can drop to the low 50s) is 872p, so it's pretty low in itself.
 
This isn't really accurate. Ignoring mobile parts, around 37% of PC GPUs in the Steam Hardware Survey are capable of delivering a broadly equivalent experience. That's taking the slowest parts to be the 3060 and 2070Ti, which may sometimes lag behind in raster at matched input resolutions but can often be ahead with RT and/or when matching output image quality thanks to DLSS.
My initial claim was not about broad equivalence but about performance. Using DLSS to match image quality does not make the GPU equivalent in performance. However, that's not the point of this discussion.
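For what it's worth, a figure like the quoted "around 37%" is just an aggregation over Steam Hardware Survey share percentages for models judged to be at or above some cutoff. A minimal sketch; the shares below are made-up placeholders, not real survey data:

```python
# Illustrative only: placeholder survey shares, not real Steam Hardware Survey numbers.
# The point is just how a "% of GPUs at or above a given tier" figure is aggregated.
survey_share = {            # GPU model -> % of surveyed systems (hypothetical)
    "RTX 3060": 5.0,
    "RTX 4060": 4.0,
    "RTX 2060": 3.5,
    "GTX 1650": 4.5,
}
at_or_above_cutoff = {"RTX 3060", "RTX 4060"}  # models judged "console class" or better

share = sum(pct for gpu, pct in survey_share.items() if gpu in at_or_above_cutoff)
print(f"{share:.1f}% of surveyed GPUs at or above the cutoff")
```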
So 25% choose to play at 30fps. Meaning that option is perfectly viable? Not only that but the overwhelming majority of games from the previous 2 generations of consoles were 30fps. I'm pretty sure no-one would claim the PS3 and PS4 generation of consoles were not viable gaming machines.
Well it’s about options isn’t it? When you don’t have an option and you want to play the game, you just deal with it. When players have options, they’re rejecting the bad option.
This isn't the comparison that was made. A 2060 would generally not need to run DLSS Performance mode at 1080p to match a console's performance at native 1080p in an RT-enabled game. The real-world scenario here is all those console games that use internal resolutions in the 700-800p range and upscale with FSR2. DLSS upscaling from 540p should produce an image comparable to or better than that.
How many console games have RT modes? Of those, which are using 700-800p as an input resolution in RT mode, and what percentage of console games using RT does that represent? Of the console games using RT, how many are using FSR2 versus an alternative upscaling method? You surely can't expect to throw that statement out there with absolutely no data to back it up and expect us to just accept it as true?
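For context on the quoted claim, the raw input pixel counts involved look like this; this is simple arithmetic only and says nothing about how well either upscaler reconstructs:

```python
# DLSS Performance at 1080p output renders internally at 960x540, while a console
# title upscaling with FSR2 from ~720p renders at 1280x720 (illustrative arithmetic).
dlss_input_px = 960 * 540        # 518,400 pixels
fsr2_720p_input_px = 1280 * 720  # 921,600 pixels

print(f"{dlss_input_px / fsr2_720p_input_px:.2f}")  # 0.56 -> ~44% fewer input pixels for DLSS
```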
No-one claimed it would be a flawless output. In fact I said the exact opposite in the quote that you were responding to: "and given the age and performance tier of this GPU, expecting some image quality compromises if you want to use RT should be a given". The question here is whether those compromises are considered "viable" which would be a matter of personal preference, and thus by definition the GPU is viable for this scenario to anyone that considers those compromises acceptable. And the guidepost for whether people would consider this viable is the console market where as you say, around 25% of people would game with this or even worse image quality thanks to FSR.
Well that only works if viability is defined as being used by at least 1 person. In the context of this discussion, that’s certainly not how I’d define viability.
This is pure personal opinion which does not tally with the reality that many console games ship with much poorer image quality than this and are still enjoyed by millions.
Console gamers are enjoying it so much that numerous complaints have been made about games with poor image quality this gen? So much enjoyment that the general sentiment regarding UE5 is poor across PC and console gamers, if various discussion forums (from here to even YouTube comments) are to be believed? Like I said above, the options posed to console gamers are to either deal with the issues or not play the game. Do not equate the lack of an option with enjoyment.
As a counter personal opinion, I routinely game at 3840x800 using DLSS performance with an input resolution of 1920x800... on a 38" monitor which is likely much larger than what your average 2060 gamer is using. And I find it more than acceptable even from the expectations of 4070Ti tier gaming.
3840x800? Firstly, is that a typo or do you really game at 24:5? Secondly, finding it acceptable does not negate my earlier statement. I said DLSS flaws are far too visible when the input resolution is lower than 1920x1080. If you're happy ignoring the flaws, that is fine. It doesn't change the fact that the flaws are visible.
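Quick sanity check on that ratio, since it does look unusual; pure arithmetic, no claim about what the monitor actually is:

```python
# Reduce 3840x800 to its simplest aspect ratio.
from math import gcd

w, h = 3840, 800
g = gcd(w, h)                # 160
print(f"{w // g}:{h // g}")  # 24:5, i.e. roughly 4.8:1
```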
Ah yes, you've got me there. If only it were possible to play PC games with a control pad.
Yes, anything is possible for sure. You can game on console with a keyboard and mouse. It doesn’t make it the predominant preferred input choice for the platform….
You might want to tell that to the tens of millions of PC gamers still playing on Pascal or lower levels of hardware who are very likely not playing at 60fps on a regular basis. And if they are, they are very likely not doing so with greater-than-4K-DLSS-Performance levels of image quality, which you also claim to be the minimum bar for viability.
The most popular games on pc do not require high end hardware. That is why most pc gamers have worse systems than consoles. Sometimes, I don’t think people on here realize how irrelevant they are in terms of market sentiment. Basically the discussions that exist on this forum are almost not ever reflective of general market sentiment. Like you use a 4070ti so only ~3% of pc gamers have a better GPU than you. I think you should keep this in mind when making arguments.
 
consoles are more powerful than ~85% of pcs
According to who? PS5 and Xbox Series X combined sales are ~80 million (after excluding the weak Series S). There are hundreds of millions of PCs out there, and most are older than the PS5/Series X because, well, they were built before the consoles, so that's not something to brag about. What we want is to focus on recently built PCs, specifically those built right before or after the PS5 launch.

As for accurate figures, NVIDIA alone sold ~120 million RTX GPUs, most of which are more powerful than the PS5, and we are not even counting AMD GPUs. So right off the bat, that 85% figure is not accurate at all; the real figure is most probably under 40%.
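Back-of-the-envelope, the disagreement mostly comes down to the assumed denominator. A rough sketch using the ~120 million RTX figure above plus purely assumed values for everything else:

```python
# All inputs here are either the post's claim (~120M RTX cards) or explicit guesses;
# the point is only how sensitive the "% of PCs weaker than a console" figure is
# to what gets counted as the PC install base.
rtx_cards = 120e6
stronger_than_ps5 = 0.6 * rtx_cards  # assumed share of RTX cards faster than a PS5

for total_gaming_pcs in (120e6, 200e6, 400e6):  # assumed install-base sizes
    weaker = 1 - stronger_than_ps5 / total_gaming_pcs
    print(f"install base {total_gaming_pcs / 1e6:.0f}M -> {weaker:.0%} weaker than a console")
# 120M -> 40%, 200M -> 64%, 400M -> 82%
```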

DLSS from 540p might be better than FSR from 540p but it's not even better than native 1080p
It's certainly leaps and bounds better than playing games at low graphics settings, which is what Hardware Unboxed did. Any user with a brain will turn DLSS to performance mode first and see what he gains from there instead of turning everything down to low.

Imagine how biased one would have to be to suggest playing at 30 fps on pc with a mouse?
I played many single player games at max settings and 30fps in the past (when I had my GTX 1070), for many people image quality trumps fps.

In the most recent titles, sadly console level RT is not possible anymore on the 2060
Yeah, VRAM problems. It's a separate issue and doesn't make the GPU itself incapable of RT; a 2060 Super with 8 GB is capable just fine.
 
I've not watched the video so can someone please tell me. Does the 2060 have enough memory to run modern games with RT on? I am certain it does not but I've been wrong before.
 
I'd assume it depends on the game?

But I think we might need some perspective here in that it is a mid range (and entry RT) GPU from 6 years ago.

How well did 2016 era GPUs of this class run 2022 games?

2014 era GPUs run 2020 games?

2012 era GPUs run 2018 games?

and etc.

The other issue is that 2018 was the GPU generation right before the "next gen" consoles. Hardware jumps aren't linear and tend to accelerate initially with every console cycle. That generation of GPUs was bound to have longevity issues just due to that.
 
5800X3D vs 5900X with an RTX 4090

Alan Wake 2 at 4K - 63.5 vs 63.7
Alan Wake 2 at 4K with RT - 40.2 vs 39.8
Alan Wake 2 is not a CPU-limited title, far from it; it needs faster GPUs for better CPUs to be able to show a difference. We need a CPU-limited title, preferably a strategy game.

Even in GPU-limited titles, you still need the faster CPU because it will get you through the unoptimized sections of the game, the single-threaded sections where the GPU is underutilized, which are widespread these days in most recent titles. Ray tracing complicates things further: more powerful CPUs provide significantly better frame pacing with ray tracing even if average fps are the same.
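For what it's worth, the deltas in the numbers quoted above are tiny, which is what you'd expect from a GPU-bound scene. A quick check of the relative difference:

```python
# Relative difference between the 5800X3D and 5900X results quoted above;
# fractions of a percent to ~1% are within normal run-to-run variance.
results = {
    "4K raster": (63.5, 63.7),  # 5800X3D fps, 5900X fps
    "4K + RT":   (40.2, 39.8),
}
for scene, (x3d, x) in results.items():
    delta = (x3d - x) / x * 100
    print(f"{scene}: {delta:+.1f}%")  # -0.3% and +1.0%
```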
 
Alan Wake 2 is not a CPU-limited title, far from it; it needs faster GPUs for better CPUs to be able to show a difference. We need a CPU-limited title, preferably a strategy game.

Even in GPU-limited titles, you still need the faster CPU because it will get you through the unoptimized sections of the game, the single-threaded sections where the GPU is underutilized, which are widespread these days in most recent titles. Ray tracing complicates things further: more powerful CPUs provide significantly better frame pacing with ray tracing even if average fps are the same.

I just use Alan Wake 2 as one example to avoid clutter. But if you click the TPU link you'll see that, again, basically none of the games that are GPU/graphics driven show significant separation at 4K.

I think an issue I might not be conveying with this discussion is what the original point of actual contention was.

I am not saying there aren't CPU-limited games. I've actually stated repeatedly that they exist and there are plenty to choose from that showcase CPU differences in real-world tests. They just aren't used, and most tests seem to just pull from GPU test suites with contrived, non-real-world settings.

The original point was the issue of testing at 720p and how reflective that is of real-world behaviour for graphics/GPU-driven games. The premise was that it is done partly because the difference will show itself in real-world usage at higher resolutions in future games as they become more CPU demanding. I am simply questioning whether or not that is the case, as while the CPU requirements do go up, the GPU requirements go up even more.

Again, CPU-driven games exist currently and have always existed. You can just test those directly. Stellaris, the example that was brought up, was released in 2016. If you want to examine CPU performance in real gaming there are plenty of titles like that to choose from. You can also benchmark them at typically played resolutions and settings, even up to 4K, and they will still show the difference. You do not need to contrive a test by pulling games from your GPU test suite and just running them at 720p, and then assume that test is somehow relevant for forecasting future games at 4K.
 