I don't understand the claim that the hardware isn't fast enough either. If you're not setting everything to ultra and running at native resolution, even a 2060 will give you great ray tracing performance, as demonstrated by Dictator's videos many times.
I've been running nearly every RT-supported game there is on my RTX 2060 laptop (which draws only around 80 watts), and I struggle to come across a game that doesn't run at a locked 60 FPS with my personal optimized settings (usually around high base settings and medium ray tracing). I've been getting great-looking experiences using DLSS Performance at a stable 1440p60. In games where it's implemented correctly, RT adds a ton to visual fidelity.
Cyberpunk is an exception though; that one is definitely not possible at 60 FPS with RT. Maybe it is at 1080p with DLSS Performance, but my CPU is too weak for that.
Keep in mind that's with DLSS enabled, which for me isn't an option for most titles that support DLSS.
Here are my requirements for any game I'd run with a modern HDMI 2.1 graphics card. NOTE - this only applies to me and there are no implications beyond these being my requirements. Other people may or may not share similar requirements; this particular list is just for me.
- 120 Hz. This includes no more than about 1% of frames (frame times would be more accurate) falling below 120 Hz.
- Very rare exceptions for games that I REALLY like which can't hit that requirement. For example, Elden Ring.
- Resolution of 3200x1800
- If that isn't possible, then 2560x1440 as a compromise resolution for most titles.
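As a rough sketch of how that 1% cutoff could be checked against a per-frame-time log (purely illustrative; the function name and the exact 1% tolerance are my own framing, not from any benchmarking tool):

```python
# Illustrative sketch only: checking the "no more than ~1% of frames
# below 120 Hz" requirement against a capture of per-frame times (ms).
def passes_120hz_target(frame_times_ms, budget_ms=1000 / 120, tolerance=0.01):
    """True if at most 1% of frames exceed the ~8.33 ms budget of 120 Hz."""
    slow = sum(1 for t in frame_times_ms if t > budget_ms)
    return slow / len(frame_times_ms) <= tolerance

# e.g. 1 slow frame in 100 sits right at the 1% limit:
print(passes_120hz_target([8.0] * 99 + [12.0]))      # True
print(passes_120hz_target([8.0] * 98 + [12.0] * 2))  # False
```

This is why frame times are the more accurate lens: a run can average well above 120 FPS and still fail on a handful of long frames.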
I will adjust any and all other settings in order to achieve those requirements (obviously this includes RT). It's the same with my current GTX 1070 card, except it can only do 60 Hz rendering over HDMI at my display's native 4k resolution (games run in a window). So, I'm quite definitely not running anything at Ultra because I don't care enough about Ultra graphics to degrade my playing or viewing experience.
Since I use a 55" 4k display as my monitor, at the distance I sit from it the display has a similar pixel density to a 30" 2560x1600 PC monitor. This means that detail loss is more visible than if I were using a 27" - 32" 4k monitor.
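To put a back-of-the-envelope number on that (the real equivalence also depends on viewing distance, but raw PPI shows the gap):

```python
import math

# Back-of-the-envelope pixel density (PPI) comparison. Viewing distance
# matters too, but raw PPI shows why a 55" 4K panel resolves detail more
# like a 30" 2560x1600 monitor than like a 27"-32" 4K monitor.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 55), 1))  # ~80 PPI for the 55" 4K TV
print(round(ppi(2560, 1600, 30), 1))  # ~101 PPI for a 30" 2560x1600
print(round(ppi(3840, 2160, 27), 1))  # ~163 PPI for a 27" 4K monitor
```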
Thus DLSS Quality is only usable in a few titles, due to the rendering anomalies it introduces versus running without it. So there are very few titles where I could use DLSS Quality in hopes of having RT at 1440p or 1800p run at a locked or near-locked 120 Hz. Right off the bat, a lot of titles just aren't going to be worth even trying to run with RT on. For example, Dying Light 2 had a pretty bad DLSS implementation at launch, which I'm not sure they've fixed yet. That's also a title that, without DLSS, can't even hit my framerate requirements at 1080p with RT enabled on an RTX 3090, much less at 1440p or 1800p. Granted, I have no idea what settings TechRadar used other than RT enabled.
Dying Light 2 PC performance: a new benchmark in ray tracing | TechRadar
Even they acknowledge that many people might not find the quality of RT in that game to be worth the performance hit for enabling RT.
Basically, not all people will choose to disable or enable the same graphics/IQ settings in order to achieve a playable experience. Some people might sacrifice more quality options in order to have RT. Others might sacrifice RT in favor of other quality settings.
If I wanted to run at 1080p or lower, then RT certainly becomes more feasible. Alternatively, if I were willing to accept DLSS image quality compromises to hit the resolutions I require, especially at the lower DLSS quality levels, then it also becomes more feasible.
Again, NOTE that while I don't find DLSS image quality adequate compared to DLSS off in most games, that doesn't mean I think DLSS is bad (I don't), and it certainly has no implications as to whether other people find DLSS image quality good enough to be worth using all the time.
Regards,
SB