Nvidia giving free GPU samples to reviewers that follow procedure

Developers like RT because it simplifies their work. That does not mean it is actually practical for end users at this point in time on current hardware.

Gamers who think RT is a practical feature at this point in time are either content with 30 fps, are simply blind nVidia followers, or are uninformed and falling for marketing. To quote one person with an actual RTX card:

"I've got a 3080, I tried the whole DLSS and RT, play with it enabled in Cyberpunk BUT at 1400p I still can't always maintain 60fps with DLSS set to quality, I would not use any other setting because you can absolutely tell the difference so for me RT is a bust atm. I paid £875 for my GPU and with the Quality setting in DLSS which basically renders the game at 1100p? I don't always get 60fps!! So if anyway here is buying a GPU that is not a 3080 don't bother with the RT because even DLSS wont save you"
https://www.techspot.com/community/...-41-game-benchmark.266962/page-3#post-1862014

So let us compare the performance to The Witcher 3 benchmarks five years ago:
https://www.computerbase.de/2015-06...dia-titan/5/#diagramm-the-witcher-3-2560-1440

The GTX 980 Ti at $649 (roughly $700 in today's money) was able to hit 43 FPS at 1440p. The 3080 with ray tracing hits 38 FPS at 1440p in Cyberpunk: https://www.computerbase.de/2020-12...agramm-raytracing-in-cyberpunk-2077-2560-1440
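
If anyone wants to sanity-check that comparison, here is a rough back-of-the-envelope sketch in Python. The FPS figures are the 1440p results from the linked benchmarks; the ~8% inflation factor and the $699 3080 MSRP are my assumptions, not measured data:

```python
# Back-of-the-envelope value comparison for the two data points cited above.
# FPS figures are the quoted 1440p results; the ~8% inflation factor for
# 2015 -> 2020 and the $699 RTX 3080 MSRP are assumptions, not measured data.

INFLATION_2015_TO_2020 = 1.08  # assumed cumulative inflation, matching the "~$700 today" claim

cards = {
    # name: (launch price in USD, launch year, avg FPS at 1440p in the cited benchmark)
    "GTX 980 Ti (The Witcher 3, 2015)":     (649, 2015, 43),
    "RTX 3080 (Cyberpunk 2077 + RT, 2020)": (699, 2020, 38),
}

for name, (price, year, fps) in cards.items():
    # express both prices in 2020 dollars before comparing
    price_2020 = price * INFLATION_2015_TO_2020 if year == 2015 else price
    print(f"{name}: ~${price_2020:.0f} in 2020 dollars, {fps} fps, "
          f"{fps / price_2020 * 100:.2f} fps per $100")
```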
 
Anyone downplaying RT in 2020 or 2021 is not keeping up and is thus irrelevant.

TL;DR
Doing benchmarks without the latest features in the latest games = you have declared yourself obsolete and irrelevant.

Right, everyone who has different priorities from you is obsolete and irrelevant, got it. :rolleyes:

No one is saying RT won't be a big thing, but the fact is that its impact on graphics is, for many, far too small to justify the performance hit on current hardware. It shouldn't be too hard a concept to grasp.

So let us compare the performance to The Witcher 3 benchmarks five years ago:
https://www.computerbase.de/2015-06...dia-titan/5/#diagramm-the-witcher-3-2560-1440

The GTX 980 Ti at $649 (roughly $700 in today's money) was able to hit 43 FPS at 1440p. The 3080 with ray tracing hits 38 FPS at 1440p in Cyberpunk: https://www.computerbase.de/2020-12...agramm-raytracing-in-cyberpunk-2077-2560-1440
One might take into account that 1440p five years ago is more like 4K today; resolutions do tend to grow over time, and the 3080 is strongly marketed as a 4K card (and actually capable of it too, at least in rasterization).
 
So let us compare the performance to The Witcher 3 benchmarks five years ago:
https://www.computerbase.de/2015-06...dia-titan/5/#diagramm-the-witcher-3-2560-1440

The GTX 980 Ti at $649 (roughly $700 in today's money) was able to hit 43 FPS at 1440p. The 3080 with ray tracing hits 38 FPS at 1440p in Cyberpunk: https://www.computerbase.de/2020-12...agramm-raytracing-in-cyberpunk-2077-2560-1440

Constructing an argument by cherry picking always fails.
If you cranked The Witcher 3 up to "11" back in the day:
[Image: witcher-bench-4k-u.jpg — The Witcher 3 benchmarked at 4K with everything maxed]


It broke EVERYTHING on the market.
Do the SAME with Cyberpunk 2077 and the SAME thing happens (most benchmarks don't even test the "Psycho" settings)...you can even get Cyberpunk 2077 running slower than The Witcher 3 did back in the day.
(And yes, people whined about how HairWorks "broke their e-peen"; same with ray tracing today.)

The difference today is DLSS, so you are getting MORE than back then...but somehow I suspect you don't care about those facts.
It sounds like a PC game scaled to "11" makes you sad...perhaps consoles would be better suited for you?
 
No one is saying RT won't be a big thing, but the fact is that its impact on graphics is, for many, far too small to justify the performance hit on current hardware. It shouldn't be too hard a concept to grasp.

That's true, which is why it's important to provide buyers with enough information to make an educated purchase. For many others it is worth it.
 
Constructing an argument by cherry picking always fails.
If you cranked The Witcher 3 up to "11" back in the day:
[Image: witcher-bench-4k-u.jpg — The Witcher 3 benchmarked at 4K with everything maxed]


It broke EVERYTHING on the market.
Do the SAME with Cyberpunk 2077 and the SAME thing happens (most benchmarks don't even test the "Psycho" settings)...you can even get Cyberpunk 2077 running slower than The Witcher 3 did back in the day.
(And yes, people whined about how HairWorks "broke their e-peen"; same with ray tracing today.)

The difference today is DLSS, so you are getting MORE than back then...but somehow I suspect you don't care about those facts.
It sounds like a PC game scaled to "11" makes you sad...perhaps consoles would be better suited for you?

That would be equivalent to benching Cyberpunk 2077 at Psycho settings at 8K. Almost nobody had a 4K display back when The Witcher 3 launched, just like almost nobody has an 8K display now. 4K displays are likely almost as common now as 1440p/1600p displays were back then.
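
For rough numbers behind that analogy (standard 16:9 resolutions assumed):

```python
# Pixel counts for the resolutions being compared; 16:9 aspect ratio assumed.
resolutions = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

print(f"4K vs 1440p: {pixels['4K'] / pixels['1440p']:.2f}x the pixels")  # ~2.25x
print(f"8K vs 4K:    {pixels['8K'] / pixels['4K']:.2f}x the pixels")     # 4.00x
```

So the 1440p-then versus 4K-now jump is roughly 2.25x the pixels, and 8K is another 4x on top of 4K, which is part of why nobody benches it seriously yet.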

Regards,
SB
 
That's true, which is why it's important to provide buyers with enough information to make an educated purchase. For many others it is worth it.
And many reviewers offer more than enough RT content, including in their own other videos. Their general audience supports the view that rasterization is still king, which is why there's room for reviews focusing on just that.
 
And many reviewers offer more than enough RT content, including in their own other videos. Their general audience supports the view that rasterization is still king, which is why there's room for reviews focusing on just that.

Rasterization is still king, of course. It had a 30-year head start. I agree that it's not the job of any one reviewer to cover all the bases. It's up to people to view content that they find valuable. If HWUB isn't covering tech the way you want, then don't watch their content. I find their videos to be very good, but I'm certainly not looking to them for insights into RT.
 
Where users have to accept RT basically even if they're against it.

Nobody is against RT.

But not everyone is ready or willing to sacrifice as much performance as RT requires in some games.

Or to accept some of the trade-offs of using DLSS in some games in order to be able to use RT at acceptable performance. While Quality mode for DLSS is mostly not noticeable, settings below that generally make the in-game image too soft for my liking, as well as destroying detail in the distance. Of course this is all subjective, so some people find the IQ compromises of DLSS either unnoticeable or acceptable, while others can't ignore them and it drives them crazy.
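
For context, here is a quick sketch of the internal render resolutions DLSS 2 targets at 1440p and 4K output. The per-axis scale factors below are the commonly cited ones, not vendor-confirmed numbers, so treat them as approximate:

```python
# Approximate DLSS 2 internal render resolutions per quality mode.
# Per-axis scale factors are the commonly cited values (approximations).
SCALE = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}
OUTPUTS = {"1440p": (2560, 1440), "4K": (3840, 2160)}

for out_name, (w, h) in OUTPUTS.items():
    for mode, s in SCALE.items():
        print(f"{out_name} {mode:17s}: ~{round(w * s)}x{round(h * s)} internal")
```

At 1440p output that puts Quality around a 960p internal render and Performance down at 720p, which lines up with the softness below Quality mode described above.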

Not everyone values things the same. I still remember back when NV's AA was pretty crap compared to ATi, but it didn't matter to many gamers because they didn't want to take the performance hit to enable AA. OTOH - some gamers preferred a better and more stable image and the performance hit was worth it to them.

Different strokes for different folks. I have no problems with a reviewer catering to their audience while at the same time doing deep dives (in separate articles) into tech their audience doesn't find particularly compelling ... yet. Just like AA wasn't for everyone back in the day, RT isn't quite there yet for some gamers.

Regards,
SB
 
Nobody is against RT.

But not everyone is ready or willing to sacrifice as much performance as RT requires in some games.

Or to accept some of the trade-offs of using DLSS in some games in order to be able to use RT at acceptable performance. While Quality mode for DLSS is mostly not noticeable, settings below that generally make the in-game image too soft for my liking, as well as destroying detail in the distance. Of course this is all subjective, so some people find the IQ compromises of DLSS either unnoticeable or acceptable, while others can't ignore them and it drives them crazy.

Not everyone values things the same. I still remember back when NV's AA was pretty crap compared to ATi, but it didn't matter to many gamers because they didn't want to take the performance hit to enable AA. OTOH - some gamers preferred a better and more stable image and the performance hit was worth it to them.

Different strokes for different folks. I have no problems with a reviewer catering to their audience while at the same time doing deep dives (in separate articles) into tech their audience doesn't find particularly compelling ... yet. Just like AA wasn't for everyone back in the day, RT isn't quite there yet for some gamers.

Regards,
SB

Anyone not using AA today is a muppet, though...and RT is getting close to being the same thing now.
 
Anyone not using AA today is a muppet, though...and RT is getting close to being the same thing now.
Although this is true, it would not necessarily have justified buying a graphics card for its superior AA back when AA had just rolled around. In 10 years we will likely all be buying cards with RT, and comparing RT performance at that point is warranted.

Right now, basing your graphics card buying decision primarily on RT performance is like basing your current car purchase primarily on how good its self-driving / autonomous driving capability is.
 
Although this is true, it would not necessarily have justified buying a graphics card for its superior AA back when AA had just rolled around. In 10 years we will likely all be buying cards with RT, and comparing RT performance at that point is warranted.

Right now, basing your graphics card buying decision primarily on RT performance is like basing your current car purchase primarily on how good its self-driving / autonomous driving capability is.

How about anyone looking for a new GPU or console, then? Whatever they buy (new) will have hardware ray tracing in it, be it an AMD GPU, an NV GPU or any of the new consoles. Isn't it interesting, at the least, to know what you're paying for? It can't be totally ignored if it's there; people pay in some way for hardware ray tracing. It's impossible not to get hardware-RT-equipped devices aside from buying last-gen consoles or GPUs. Even if one is planning never to use it, you certainly bought into it. For consoles, you almost certainly can't disable ray tracing going forward in all titles.
 
How about anyone looking for a new GPU or console, then? Whatever they buy (new) will have hardware ray tracing in it, be it an AMD GPU, an NV GPU or any of the new consoles. Isn't it interesting, at the least, to know what you're paying for?
Not if you're not going to use it. And the ones buying consoles, I doubt most of them know or even care what RT is.

It can't be totally ignored if it's there; people pay in some way for hardware ray tracing.
Sure it can. You can safely ignore parental controls if you don't have any children, for example. Or ignore video encoding capabilities if all you do is watch videos. It can be ignored, just as everyone pretty much ignored async compute being in pretty much everything at the time, with no one finding it 'interesting to know what you're paying for'.

It's impossible not to get hardware-RT-equipped devices aside from buying last-gen consoles or GPUs.
It's also pretty much impossible to buy the newer consoles and the newer GPUs at all. This argument is baseless.

Even if one is planning never to use it, you certainly bought into it. For consoles, you almost certainly can't disable ray tracing going forward in all titles.
No. Just because something packs a feature does not mean you 'bought into it'. That's quite the wordplay you've got there... Just so we're clear:

buy into something
phrasal verb, informal: to accept that an idea is right and allow it to influence you
https://www.ldoceonline.com/dictionary/buy-into

You're conflating buying a product with buying an idea as if they are the same thing to support your bias.
If you don't know something exists, you can't accept it as an idea. And that applies to pretty much the majority of console players.
And more importantly, you can even know it exists and not want to buy into it or use it, while still having it. You know, like air bags in your car.
 
Not if you're not going to use it. And the ones buying consoles, I doubt most of them know or even care what RT is.

Same on the PC side of things: anyone going for a new GPU/system can't avoid RT hardware by now; all Ampere cards have it, all RDNA2 cards have it, and soon Intel will too. Same story for the consoles, all the way from the XSS to the PS5 and finally the XSX: you're paying for it somewhere. You have the option to enable it on consoles, and studios are going to advertise it.

Sure it can. You can safely ignore parental controls if you don't have any children, for example. Or ignore video encoding capabilities if all you do is watch videos. It can be ignored, just as everyone pretty much ignored async compute being in pretty much everything at the time, with no one finding it 'interesting to know what you're paying for'.

Then one has to wonder why Sony and MS even bothered with it. Ray tracing was the graphical feature most talked about and hyped pre-launch.

It's also pretty much impossible to buy the newer consoles and the newer GPUs at all. This argument is baseless.

I fail to detect whether you're being serious and not ironic there.

You're conflating buying a product with buying an idea as if they are the same thing to support your bias.

What kind of bias would that be, do you think? All new hardware, and yes, even the PlayStation, is equipped with ray tracing, and games are currently using it.

Like air bags in your car.

Well, now I see you're not serious.
 
I understand what you are saying. RT is being adopted. No one is arguing against that. And unless you can see that no one is arguing against that, you will not understand the other side of the argument.
Or maybe it's not in your interest to understand. But whatever.

Now I remember why I left this conversation.
 