Value of Hardware Unboxed benchmarking

In the most recent titles, sadly console level RT is not possible anymore on the 2060. I have a 2060 and it lacks the VRAM and horsepower to compete. Ratchet and Clank, for example, does not run at high textures with ray-traced reflections even with DLSS Performance (720p to 1440p) and low resolutions; I can't even get a stable 30 FPS on Torren IV. Meanwhile the PS5 does 1440p to 4K60 with ray tracing enabled and uses higher settings in general, so I'm running settings worse than the approximate PS5 performance equivalents. Even without RT the PS5 performs much better: I can't even get 60 FPS at DLSS Performance@1080p and medium settings with RT off... I've also seen how the 2060 performs in Alan Wake 2; it cannot even dream of reaching 60 FPS at lower settings, even though the consoles have a performance mode. This is the reality @pjbliverpool @DavidGraham.

I was always satisfied with my 2060, but Ratchet completely destroys it regardless of what I try. And according to @Dictator, Nixxes ports have great optimization, so it's not the game.

I certainly won't cheap out on VRAM next gen. I will wait for Nvidia's unified memory SoC next year and get at least 64 GB of unified memory. I also won't fall into Jensen's trap of 8 GB 5070 laptops; that is planned obsolescence, nothing more. I would rather wait years running my current laptop than buy that. Next gen consoles will have at least 32 GB of unified memory, so even cards that have sufficient VRAM now, like the 5080, might struggle with just 16 GB when the PS6 releases in 2026.

I value Hardware Unboxed a lot more now because they really say how it is with cards that have low amounts of VRAM: they are just not great products and should never be bought.
 
Well I did say a "roughly console level experience" ;) and that would be dependent on the level of RT implemented in the game. If we're looking at very light RT that's been tailored to the capabilities of the consoles, then the consoles' fairly significant raster advantage could give them the lead. That would be especially true in a Sony console exclusive like Ratchet that is later ported to PC, which will generally perform relatively better on the console on account of it being the lead/target platform. Granted, an exact match for the console on all levels isn't always to be expected thanks to the much smaller VRAM pool on the 2060, but setting textures to medium, or even low, doesn't make a game unviable on that GPU.

In terms of Alan Wake, that doesn't use RT on the consoles so they would certainly be faster than the 2060 at equivalent settings.

Also your laptop 2060 is quite a bit slower than the desktop variant that I was discussing in my initial post.
 
This is medium at low textures and 1080p@DLSS Performance. No chance of getting 60 FPS. The desktop 2060 is around 15-20% faster; even then it's far from 60 FPS. Medium textures look horrible by the way; some textures look straight out of a GameCube game. I would consider that unplayable.

Screenshot 2024-11-21 143151.png


Also yes, while the PS5 has much better raster performance, remember I'm using a much lower render resolution here. The game performs catastrophically on that planet relative to the console.

As for Alan Wake 2, remember I'm always taking the raw performance difference into account by running a much lower rendering resolution. I've not tested it yet, but in videos 60 FPS was not even possible with DLSS Performance at 1080p (which is just 540p internally).
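For reference, DLSS modes render internally at a fixed fraction of the output resolution on each axis (Performance mode is 50% per axis), so the internal resolutions quoted in this thread can be computed directly. A minimal sketch, using Nvidia's published per-axis scale factors:

```python
# Per-axis internal render scale for each DLSS mode (Nvidia's standard factors).
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution that DLSS upscales from for a given output."""
    scale = DLSS_SCALES[mode]
    return round(out_w * scale), round(out_h * scale)

# The two cases discussed in this thread:
print(internal_resolution(2560, 1440, "Performance"))  # (1280, 720): 720p -> 1440p
print(internal_resolution(1920, 1080, "Performance"))  # (960, 540): 540p -> 1080p
```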
 
This is medium at low textures and 1080p@DLSS Performance. No chance of getting 60 FPS. The desktop 2060 is around 15-20% faster; even then it's far from 60 FPS.

Again, 60fps is not the minimum criterion for a "viable gaming experience". The PS5's 30fps quality mode in this game should be ample proof of that, along with almost every console game of the past 2 generations.

Medium textures look horrible by the way; some textures look straight out of a GameCube game. I would consider that unplayable.

I'm sorry but I don't agree with that at all. This is the same scene with Medium textures. Claiming this to be unplayable on the grounds of poor graphics is, at best, a highly niche opinion. Remember this is a 6 year old lower midrange GPU with 6GB of VRAM. Its users should not be expecting a high-end experience with everything maxed out.

Ratchet.jpg

Also yes, while the PS5 has much better raster performance, remember I'm using a much lower render resolution here. The game performs catastrophically on that planet relative to the console.

But what relevance does that have to whether the game is viable on a 2060 with RT enabled or not? And again, this one game - a Sony exclusive with very highly optimised console RT - is hardly representative of the wider population of RT enabled games, which would perform relatively much better on the 2060. You're basically using a best case scenario on the console side as evidence of a general performance differential.

As for Alan Wake 2, remember I'm always taking the raw performance difference into account by running a much lower rendering resolution. I've not tested it yet, but in videos 60 FPS was not even possible with DLSS Performance at 1080p (which is just 540p internally).

Alan Wake 2 isn't relevant to the comparison since it features no RT on the base consoles. Those consoles should therefore be expected to be much faster than the 2060. That said, the base resolution of the PS5 60fps mode in this game (which can drop to the low 50s) is 872p, so it's pretty low in itself.
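To put those internal resolutions side by side, here is a quick pixel-count sketch (assuming 16:9 frames; the 540p, 720p and 872p figures are the ones quoted in this exchange):

```python
def pixels(height: int, aspect: float = 16 / 9) -> int:
    """Pixel count of a frame with the given height and aspect ratio."""
    return round(height * aspect) * height

# Internal resolutions mentioned in this exchange (16:9 assumed):
for label, h in [
    ("2060, 1080p DLSS Performance", 540),
    ("2060, 1440p DLSS Performance", 720),
    ("PS5 Alan Wake 2 60fps mode", 872),
]:
    print(f"{label}: {pixels(h):,} px")
# ~518k px at 540p, ~922k px at 720p, ~1.35M px at 872p: the PS5's 872p
# base is roughly 2.6x the pixels of a 540p internal render.
```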
 
This isn't really accurate. Ignoring mobile parts, around 37% of PC GPUs in the Steam Hardware Survey are capable of delivering a broadly equivalent experience. That's taking the slowest parts to be the 3060 and 2070S, which may sometimes lag behind in raster at matched input resolutions but can often be ahead with RT and/or when matching output image quality thanks to DLSS.
My initial claim was not about broad equivalence but about performance. Using DLSS to match image quality does not make the GPU equivalent in performance. However, that's not the point of this discussion.
So 25% choose to play at 30fps. Meaning that option is perfectly viable? Not only that but the overwhelming majority of games from the previous 2 generations of consoles were 30fps. I'm pretty sure no-one would claim the PS3 and PS4 generation of consoles were not viable gaming machines.
Well it’s about options isn’t it? When you don’t have an option and you want to play the game, you just deal with it. When players have options, they’re rejecting the bad option.
This isn't the comparison that was made. A 2060 would generally not need to be running with DLSS Performance mode at 1080p to match a console's performance at native 1080p in an RT enabled game. The real world scenario here is all those console games that use internal resolutions in the 700-800p range and upscale with FSR2. DLSS upscaling from 540p should produce a comparable or better image than that.
How many console games have RT modes? Of those, which are using 700-800p as an input resolution in RT mode? How many of them fall in that category? What percentage of console games using RT does that represent? Of the console games using RT, how many are using FSR2 vs an alternative upscaling method? You surely can't expect to throw that statement out there with absolutely no data to back it up and expect us to just accept it as true?
No-one claimed it would be a flawless output. In fact I said the exact opposite in the quote that you were responding to: "and given the age and performance tier of this GPU, expecting some image quality compromises if you want to use RT should be a given". The question here is whether those compromises are considered "viable", which is a matter of personal preference, and thus by definition the GPU is viable for this scenario to anyone who considers those compromises acceptable. And the guidepost for whether people would consider this viable is the console market where, as you say, around 25% of people game with this or even worse image quality thanks to FSR.
Well that only works if viability is defined as being used by at least 1 person. In the context of this discussion, that’s certainly not how I’d define viability.
This is pure personal opinion which does not tally with the reality that many console games ship with much poorer image quality than this and are still enjoyed by millions.
Console gamers are enjoying it so much that numerous complaints have been made about games with poor image quality this gen? So much enjoyment that the general sentiment regarding UE5 is poor across PC and console gamers, if various discussion forums (from here to even YouTube comments) are to be believed? Like I said above, the options posed to console gamers are to either deal with the issues or not play the game. Do not equate the lack of an option with enjoyment.
As a counter personal opinion, I routinely game at 3840x800 using DLSS performance with an input resolution of 1920x800... on a 38" monitor which is likely much larger than what your average 2060 gamer is using. And I find it more than acceptable even from the expectations of 4070Ti tier gaming.
3840x800? Firstly, is that a typo or do you really game at 24:5? Secondly, finding it acceptable does not negate my earlier statement. I said DLSS flaws are far too visible when the input resolution is lower than 1920x1080. If you're happy ignoring the flaws, that is fine. It doesn't change the fact that the flaws are visible.
Ah yes you've got me there. If only it were possible to play PC games with a control pad.
Yes, anything is possible for sure. You can game on console with a keyboard and mouse. It doesn’t make it the predominant preferred input choice for the platform….
You might want to tell that to the tens of millions of PC gamers still playing on Pascal or lower levels of hardware who are very likely not playing at 60fps on a regular basis. And if they are, they are very likely not doing so with greater-than-4K-DLSS-Performance levels of image quality, which you also claim to be the minimum bar for viability.
The most popular games on PC do not require high end hardware. That is why most PC gamers have worse systems than consoles. Sometimes I don't think people on here realize how irrelevant they are in terms of market sentiment; the discussions that exist on this forum are almost never reflective of general market sentiment. Like, you use a 4070 Ti, so only ~3% of PC gamers have a better GPU than you. I think you should keep this in mind when making arguments.
 
consoles are more powerful than ~85% of pcs
According to who? PS5 and Xbox Series X combined sales are ~80 million (after excluding the weak Series S). There are hundreds of millions of PCs out there, most of them older than the PS5/Series X because, well, they were built before the consoles, so that's not something to brag about. What we want is to focus on recently built PCs, specifically those built right before or after the PS5 launch.

As for accurate figures, NVIDIA alone sold ~120 million RTX GPUs, most of which are more powerful than the PS5, and we are not even counting AMD GPUs. So right off the bat, that 85% figure is not accurate at all; the most probable figure would be under 40% at the most.
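A back-of-envelope sketch of that argument, to make the disagreement concrete. Only the ~120 million RTX figure comes from the post above; the fraction of those cards faster than a PS5 and the number of active gaming PCs are loudly hypothetical assumptions, and the result swings heavily with them:

```python
# Back-of-envelope sketch. Only the ~120M RTX sales figure is from the
# post above; every other number is an ASSUMPTION for illustration.
rtx_sold = 120e6

for frac_faster in (0.4, 0.6, 0.8):       # assumed share of RTX cards faster than a PS5
    for active_pcs in (200e6, 300e6):     # assumed number of active gaming PCs
        share = rtx_sold * frac_faster / active_pcs
        print(f"{frac_faster:.0%} of RTX faster, {active_pcs / 1e6:.0f}M PCs "
              f"-> {share:.0%} of PCs outpace a PS5 (NVIDIA cards alone)")
```

Depending on those assumptions, NVIDIA cards alone put anywhere from roughly 16% to 48% of PCs above a PS5 before counting AMD, which is why the 85% and 40% figures in this thread can both be argued.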

DLSS from 540p might be better than FSR from 540p but it's not even better than native 1080p
It's certainly leaps and bounds better than playing games at low graphics settings, which is what Hardware Unboxed did. Any user with a brain will turn DLSS to Performance mode first and see what they gain from there, instead of turning everything down to low.

Imagine how biased one would have to be to suggest playing at 30 fps on pc with a mouse?
I played many single player games at max settings and 30fps in the past (when I had my GTX 1070), for many people image quality trumps fps.

In the most recent titles, sadly console level RT is not possible anymore on the 2060
Yeah, VRAM problems. That's a separate issue and doesn't make the GPU itself incapable of RT; a 2060 Super with 8 GB is capable just fine.
 
I've not watched the video, so can someone please tell me: does the 2060 have enough memory to run modern games with RT on? I am certain it does not, but I've been wrong before.
 
I'd assume it depends on the game?

But I think we might need some perspective here, in that it is a mid-range (and entry-level RT) GPU from 6 years ago.

How well did 2016 era GPUs of this class run 2022 games?

2014 era GPUs run 2020 games?

2012 era GPUs run 2018 games?

And so on.

The other issue is that 2018 was the pre-"next gen" console GPU generation. Hardware jumps aren't linear and tend to accelerate initially with every console cycle. That generation of GPUs was bound to have longevity issues just due to that.
 
5800X3D vs 5900X with an RTX 4090

Alan Wake 2 at 4K - 63.5 vs 63.7
Alan Wake 2 at 4K with RT - 40.2 vs 39.8
Alan Wake 2 is not a CPU limited title, far from it; it needs faster GPUs for better CPUs to be able to show a difference. We need a CPU limited title, preferably a strategy game.

Even in GPU limited titles, you also need the faster CPU, because it will get you through the unoptimized sections of the game, the single threaded sections where the GPU is underutilized, which are widespread these days in most recent titles. Ray tracing complicates things further: more powerful CPUs provide significantly better frame pacing with ray tracing, even if the average fps are the same.
 
I just used Alan Wake 2 as one example to avoid clutter. But if you click the TPU link, you'll see that basically none of the games that are GPU/graphics driven show significant separation at 4K.

I think an issue I might not be conveying in this discussion is what the original point of contention actually was.

I am not saying there aren't CPU limited games. I've actually stated repeatedly that they exist and that there are plenty to choose from that showcase CPU differences in real world tests. They just aren't used; most tests seem to pull from GPU test suites with contrived, non real world settings.

The original point was the issue of testing at 720p and how reflective that is of real world behaviour in graphics/GPU driven games. The premise is that it is done partly because the differences will show themselves in real world usage at higher resolutions in future games as they become more CPU demanding. I am simply questioning whether that is the case, as while CPU requirements do go up, GPU requirements go up even more.

Again, CPU driven games exist currently and have always existed. You can just test those directly. Stellaris, since that example was brought up, was released in 2016. If you want to examine CPU performance in real gaming, there are plenty of titles like that to choose from. You can also benchmark them at typically played resolutions and settings, even up to 4K, and they will still show the difference. You do not need to contrive a test by pulling games from your GPU test suite and running them at 720p, and then assume that test is somehow relevant for forecasting future games at 4K.
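The logic behind the 720p test can be reduced to a simple scaling check: drop the resolution and see whether the frame rate follows. A minimal sketch with made-up fps numbers (not real benchmark data):

```python
def bottleneck(fps_by_height: dict[int, float]) -> str:
    """Classify a title as CPU- or GPU-limited from average fps measured
    at several vertical resolutions on the same system. A CPU-limited
    game shows a flat fps curve as resolution drops; a GPU-limited one
    speeds up roughly in line with the pixel reduction."""
    heights = sorted(fps_by_height)
    fps_low_res = fps_by_height[heights[0]]    # e.g. 720p
    fps_high_res = fps_by_height[heights[-1]]  # e.g. 2160p
    return "CPU-limited" if fps_low_res / fps_high_res < 1.15 else "GPU-limited"

# Made-up numbers for illustration:
print(bottleneck({720: 190, 1440: 120, 2160: 63}))  # GPU-limited
print(bottleneck({720: 95, 1440: 93, 2160: 90}))    # CPU-limited
```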
 
According to who? PS5 and Xbox Series X combined sales are ~80 million (after excluding the weak Series S). There are hundreds of millions of PCs out there, most of them older than the PS5/Series X because, well, they were built before the consoles, so that's not something to brag about. What we want is to focus on recently built PCs, specifically those built right before or after the PS5 launch.
All of that to say that you didn't read my message. It's according to the Steam Hardware Survey: console spec is better than ~85% of PCs. By consoles I obviously mean the Series X and PS5/Pro, not the Series S or Switch.
As for accurate figures, NVIDIA alone sold ~120 million RTX GPUs, most of which are more powerful than the PS5, and we are not even counting AMD GPUs. So right off the bat, that 85% figure is not accurate at all; the most probable figure would be under 40% at the most.
I said console spec is better than a certain percentage of PCs. The amount sold is really irrelevant, as we're referring to the PC user base as a whole.
It's certainly leaps and bounds better than playing games at low graphics settings, which is what Hardware Unboxed did. Any user with a brain will turn DLSS to Performance mode first and see what they gain from there, instead of turning everything down to low.
Any user with eyes will not turn on DLSS Performance at 1080p. It's much better to lower graphics settings than to play at badly upscaled PS Vita resolutions.
I played many single player games at max settings and 30fps in the past (when I had my GTX 1070), for many people image quality trumps fps.
Ok, well it’s good to know that you’re not representative of the majority.
Yeah, VRAM problems. That's a separate issue and doesn't make the GPU itself incapable of RT; a 2060 Super with 8 GB is capable just fine.
Firstly, the video is not about the 2060 Super but the OG 2060 with 6 GB of VRAM. Secondly, the 2060 Super is a garbage product. The smart people skipped the 2000 series from Nvidia. I was a sucker and bought both a 2060 and a 2070 Super. Both were terrible.
 
My initial claim was not about broad equivalence but about performance. Using DLSS to match image quality does not make the GPU equivalent in performance. However, that's not the point of this discussion.

You claimed specifically that "consoles are more powerful than ~85% of pcs". And you specifically referenced the Steam Hardware Survey in that claim, despite clearly not checking the Steam Hardware Survey before you made it.

As already noted, both the 3060 and 2070S generally offer more performance than the PS5 in RT enabled games. So the claim that the consoles are more powerful than those GPUs is already on shaky ground, especially in a debate that is focused around RT performance.

However, even removing those 2 GPUs from the equation and bringing the lowest performers down to the 6700XT/RTX 2080, we are still looking at about 30% of PC GPUs being faster than the consoles (again excluding mobile parts completely), so either way, your claim was wrong.
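For what it's worth, the arithmetic behind these percentages is just a filtered sum over the survey table. A sketch with placeholder shares (not real Steam Hardware Survey numbers) showing the method: drop mobile parts, renormalize, and sum everything at or above the chosen cutoff:

```python
# Placeholder shares, NOT real Steam Hardware Survey data; only the
# method matters (exclude mobile parts, renormalize, sum above a cutoff).
survey_share = {
    "RTX 4090": 0.010, "RTX 3060": 0.050, "RTX 2080": 0.008,
    "RX 6700 XT": 0.010, "GTX 1650": 0.040, "GTX 1060": 0.035,
    "RTX 3060 Laptop GPU": 0.045,  # mobile part, excluded below
}
at_or_above_cutoff = {"RTX 4090", "RTX 3060", "RTX 2080", "RX 6700 XT"}

desktop = {gpu: s for gpu, s in survey_share.items() if "Laptop" not in gpu}
total = sum(desktop.values())
faster = sum(s for gpu, s in desktop.items() if gpu in at_or_above_cutoff)
print(f"{faster / total:.0%} of desktop GPUs at or above the cutoff")
```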

And to add to that, the original argument I put forward was that the 2060 "should be able to offer a roughly console level RT experience". So you're attempting to create a strawman by making this about raw raster performance rather than the end result seen on screen in RT enabled games - where DLSS absolutely plays a huge factor vs the consoles.

Well it’s about options isn’t it? When you don’t have an option and you want to play the game, you just deal with it. When players have options, they’re rejecting the bad option.

Which again, is a totally different argument to the question of whether gaming is viable at 30fps. It clearly is since entire console generations were based around that frame rate and even today virtually every console game ships with a viable 30fps mode which many gamers choose to use.

The question being debated here is whether the 2060 is a viable gaming card for RT use. And the answer is that in the vast majority of RT titles it can achieve a solid 30fps with some level of RT enabled.

Arguing that 30fps isn't a viable gaming frame rate when there are literally hundreds of millions of purchases of 30fps-only games out there makes no sense. And arguing that the 2060, a lower mid range part at release that is now 6 years old, MUST play all games at 60fps also makes no sense. If people want 60fps gaming, higher tier options are available in the PC space. If people are content with 30fps gaming and middling image quality coupled with some graphical compromises, then the 2060 has been a viable option for the last half decade+. Obviously expecting it to run every single latest and greatest RT enabled game with RT enabled 6 years after its launch is unrealistic, but that is very different to saying it wasn't a viable GPU for RT when launched, and for many years after.

How many console games have RT modes? Of those, which are using 700-800p as an input resolution in RT mode? How many of them fall in that category? What percentage of console games using RT does that represent? Of the console games using RT, how many are using FSR2 vs an alternative upscaling method? You surely can't expect to throw that statement out there with absolutely no data to back it up and expect us to just accept it as true?

Ah, the old 'I'll ask a question that is literally impossible for anyone to answer so that I look good when the person I'm arguing against isn't able to answer it'... nope, we're not doing that. Of course I don't know the above statistics, nor do I need to in order to defend the point I made, which is: if console RT modes are considered viable (which they must be, or they would not exist and would not be used by anyone), then the 2060 should also be considered a viable GPU for RT, because it will generally be able to offer a roughly equivalent experience to those consoles in those RT modes thanks to its higher RT performance and ability to run at lower internal resolutions for an overall similar experience thanks to DLSS.

Well that only works if viability is defined as being used by at least 1 person. In the context of this discussion, that’s certainly not how I’d define viability.

Except it's not "just 1 person" defining 30fps as viable; it's hundreds of millions of gamers who were perfectly happy gaming at that framerate for the last several console generations, along with the tens of millions who continue to do so today on the current gen consoles despite having 60fps options in many cases.

And from a resolution perspective, HWU's own results show that at 1080p DLSS Quality most modern RT enabled games can hit a minimum of 30fps with some level of RT enabled on the 2060. Are you claiming that 1080p DLSS Quality is an unviable level of image quality for a 6 year old lower mid range GPU?

Console gamers are enjoying it so much that numerous complaints have been made about games with poor image quality this gen? So much enjoyment that the general sentiment regarding UE5 is poor across PC and console gamers, if various discussion forums (from here to even YouTube comments) are to be believed? Like I said above, the options posed to console gamers are to either deal with the issues or not play the game. Do not equate the lack of an option with enjoyment.

Wishing for better image quality is entirely different to the game being unviable at said image quality. Sure, console gamers don't have a choice and have to accept whatever image quality they get. But on PC, you tailor your GPU purchase to the image quality and framerates that you desire. The 2060 is a 6 year old lower mid range part, so anyone choosing it should expect compromises in settings, resolution and frame rate. But if they can play those games with at least 30fps and passable image quality, then they are viable.

3840x800? Firstly, is that a typo or do you really game at 24:5? Secondly, finding it acceptable does not negate my earlier statement. I said DLSS flaws are far too visible when the input resolution is lower than 1920x1080. If you're happy ignoring the flaws, that is fine. It doesn't change the fact that the flaws are visible.

It's quite obviously a typo, considering the very next sentence states my input resolution is 1920x800 at DLSS Performance. And as I noted, I generally find image quality to be excellent at these settings. I totally accept that you may prefer better image quality, and that's entirely your prerogative. However, that's not the argument you're making here. You're trying to claim that this resolution is actually unviable to play games at. That it's literally so bad the game cannot be played. This is clearly absurd. This level of image quality is better than the vast majority of console games' 60fps modes, which you've previously argued are the only way games should be played on consoles.

Yes, anything is possible for sure. You can game on console with a keyboard and mouse. It doesn’t make it the predominant preferred input choice for the platform….

You're basing your argument that PC gaming is unviable at 30fps on the claim that you must use a mouse. Yet you do not need to use a mouse for modern PC gaming, which invalidates that argument. PC gaming is viable at 30fps using a control pad, which is an entirely viable method of playing all modern RT enabled PC games.

The most popular games on PC do not require high end hardware. That is why most PC gamers have worse systems than consoles. Sometimes I don't think people on here realize how irrelevant they are in terms of market sentiment; the discussions that exist on this forum are almost never reflective of general market sentiment. Like, you use a 4070 Ti, so only ~3% of PC gamers have a better GPU than you. I think you should keep this in mind when making arguments.

I don't see how me gaming on a high end GPU has anything to do with whether significant numbers of PC gamers are willing to accept a lower than 60fps target. Do you have any kind of evidence suggesting that all (or at least the overwhelming majority of) PC gamers are gaming at 60fps or above all the time?

We certainly do have ample evidence that hundreds of millions of console gamers are content to play at 30fps where no other options exist, so is it your suggestion that PC gamers simply have higher standards as a rule? Even when gaming on much weaker hardware than consoles? Seems a bit of a stretch...
 
It's according to the Steam Hardware Survey: console spec is better than ~85% of PCs. By consoles I obviously mean the Series X and PS5/Pro, not the Series S or Switch.
Then this is a redundant and irrelevant statement; it holds the same relevance as me saying that the 4090 is more powerful than 99% of gaming hardware! I mean, of course it is! It was released only recently using the most advanced technology. Likewise, the consoles were released 4 years ago; it's only natural they are more powerful than most other PCs built before them.

So, to make this comparison relevant again, we need to compare over a logical and relevant period. For the relevant period (say between 2018 and 2024), consoles are only more powerful than about 40% of PCs.

Any user with eyes will not turn on DLSS Performance at 1080p. It's much better to lower graphics settings than to play at badly upscaled PS Vita resolutions.
That's completely illogical. You mean turning off shadows and turning down textures, lighting and reflections so that the game looks like a PS3 title is better than upscaling from 540p? I don't think so. This has never been the case.

Ok, well it’s good to know that you’re not representative of the majority.
Actually, the majority of people play at 30fps on consoles, mobile and old PCs just fine; only people with decent hardware (current console level or higher) prefer higher fps, because their hardware is capable enough.

Firstly, the video is not about the 2060 Super but the OG 2060 with 6 GB of VRAM.
The 2060 6GB is the lowest capable RTX GPU; it isn't supposed to be capable of doing serious ray tracing, just as the lowest DX12, DX11 or DX10 GPU was barely capable of doing anything with those APIs. The GT 430 was barely capable of running any DX11 effects, as was the GT 8400 before it with DX10 effects, yet nobody called these GPUs a hoax or misleading. That's how the industry has always run.

The RTX 2060 6GB is no different in that regard; it's the least capable DXR GPU in the NVIDIA lineup. There are other, less capable DXR GPUs than the 2060 6GB: the consoles, the mobile GPUs, and the lower half of the AMD RX 6000 series, yet nobody calls any of these misleading.

Hardware Unboxed wants to treat DXR as some kind of unique feature detached from the history of GPUs, which is absurd; DXR is not and shouldn't be any different than DX11, DX10, DX9 or any DX before it.
 
The 2060 6GB is the lowest capable RTX GPU; it isn't supposed to be capable of doing serious ray tracing
I haven’t watched the video but isn’t this kind of the point Tim is making?

My issue with the 20 series was that it came out before mainstream RT titles, so there wasn't really anything 'period appropriate' for it to run, and the lower end cards in particular aren't all that useful for modern RT.

Like yes, I expect a 6 year old budget card to perform rather poorly these days, but almost all the relevant RT titles came out way after this card was considered ‘modern’, so that’s my gripe.
 
Like yes, I expect a 6 year old budget card to perform rather poorly these days, but almost all the relevant RT titles came out way after this card was considered ‘modern’, so that’s my gripe.
Again, that's how the industry has always run: the lowest capable card is barely able to run the newest DirectX. The 2060 6GB is no different in that regard; DXR was the newest version of DirectX, the 2060 6GB was the lowest DXR capable GPU, do the math.

Despite this, in 2019 it was able to run Battlefield V, Metro Exodus, and Shadow of the Tomb Raider with semi-decent fps, but expecting this card with its meager 6GB to handle the newest games 6 years after its release is the kind of absurd standard I have come to expect from Hardware Unboxed.
 
The prevalence of FSR in console games combined with the prevalence of DLSS on PC has really eroded the benefit of consoles IMO. Consoles are punching below their weight as any performance advantage gets lost in the sea of FSR pollution.

Sony exclusives are the only titles that still maintain that old status quo.
 
The prevalence of FSR in console games combined with the prevalence of DLSS on PC has really eroded the benefit of consoles IMO. Consoles are punching below their weight as any performance advantage gets lost in the sea of FSR pollution.
It's not like DLSS is very good at reconstructing from 640p to 4K either, so this issue isn't really just about FSR2 being used on consoles.
 