The only reasonable argument against hardware RT was that, back in 2018, those transistors could/should have been spent elsewhere for a higher ROI. That theory has completely collapsed now that we've seen the best of what's feasible without RT. Maybe there are games where the inclusion of RT has made the game objectively worse, but I can't think of any examples. There are, however, plenty of examples where the lack of RT has clear consequences.
For the majority of games and the majority of people, turning on RT is simply not worth it. Either the RT is light enough that the visual difference is negligible, or the visual difference is striking but performance tanks. There is no in-between, at least not yet.
It's also funny to me that the 3090 was first considered the holy grail of RT, and that even the RTX 2060 was argued to be viable when it came out. Yet when the 7800XT and 7900XTX arrived, offering pretty much similar RT performance to a 3090, suddenly only the RT performance of a 4090 was somehow viable. Everyone is always arguing in favor of RT and its viability across nVidia's whole stack, but when it comes to AMD, they have to be better than the absolute top. It suddenly doesn't matter that they have cards that perform better at RT than an RTX 2060, 3060, 4060, or 4070.
That alone makes it clear that the whole discussion around RT is not an honest conversation. It exists purely to inflate the egos of those who favor nVidia and to impose their views on everyone else.
Also, I'd love to hear where the lack of RT has consequences. There are zero games where RT actually influences gameplay. And in the majority of games, the improved visuals still don't justify the performance hit. The main exceptions seem to be old games like Quake and Tomb Raider, i.e. late-90s to pre-2010 titles.
So the only thing left to complain about is either "RT = Nvidia and we don't like Nvidia," or, consequently, "AMD has struggled with RT and we love AMD, therefore RT is bad."
Not at all. It is still a fact that RT is expensive to render. nVidia has struggled (and arguably still struggles) with RT too. Why do you think they invented DLSS? It was created specifically to push RT. Things turned out a bit differently: people love to use DLSS on its own rather than with RT, because, you know, it's actually useful. We see it clearly in Starfield, for example. The game doesn't have RT, but everyone is crying out for DLSS in it. Did you hear people complaining about the lack of RT? If there are any, they aren't many, especially with how heavy the game already is without it.
Just because nVidia is currently faster at RT doesn't mean they are good at it. No card is good enough at it today, unless you want to pay four figures for a smooth 1080p RT experience, which I don't think many would. That price is normally reserved for 4K performance.
Is that why the criticism of DLSS and frame generation evaporated overnight after AMD started doing the same thing?
It's easy to argue things when you put them in a vacuum. DLSS literally sucked when it came out: simply rendering at around 80% of native resolution on any graphics card, with no other processing or effects, gave better image quality than the original DLSS. And once again, it was clear that DLSS was only there to push the primary tech, RT. Remember too that this was the 2000 series, where for example the RTX 2080 was ~20% faster than the GTX 1080 while costing 40% more (i.e. terrible value over the previous gen), and RT was used as the justification. DLSS had to be used in conjunction with RT to get any sort of respectable framerates, but it degraded image quality too much. It was bashed, and for good reason.
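For what it's worth, here's the back-of-the-envelope arithmetic behind that value claim, taking the rough figures above (~20% faster, ~40% more expensive) at face value; a sketch, not measured benchmark data:

```python
# Rough perf-per-dollar comparison, RTX 2080 vs GTX 1080, using the
# approximate figures quoted above (illustrative, not benchmark data).
perf_ratio = 1.20   # ~20% faster than the GTX 1080 (as claimed above)
price_ratio = 1.40  # ~40% more expensive (as claimed above)

value_ratio = perf_ratio / price_ratio
print(f"Relative performance per dollar: {value_ratio:.2f}")
# -> 0.86, i.e. roughly 14% LESS performance per dollar than the previous gen
```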
Then came DLSS 2, which was actually usable. The complaints about DLSS didn't stop because AMD came out with FSR. They stopped because DLSS actually became useful, after the improvements that community backlash made necessary. So once again, a false narrative is being painted in support of green, and once again it's not an honest conversation.
As for frame generation, there's still plenty of criticism of it, and justifiably so. It increases latency (the newest real frame has to be held back while the intermediate frame is generated), which means you need a high base framerate for it to be usable in the first place. Not to mention that it's used deceitfully in marketing material. But the most important criticism is that it's limited to the 4000 series. If AMD can allegedly make it work on everything, including cards that aren't their own, why can't nVidia make their version work on at least the 3000 series? Even if it were an inferior version, there's no reason they couldn't do it. It was likely just a tactic to once again push their own customers to upgrade, keeping them on the hamster wheel. I admit that's speculation, but considering nVidia's track record, it's probably true.
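A minimal sketch of that latency argument, assuming an interpolation-style frame generator that holds back one real frame (the exact overhead varies by implementation, so treat the numbers as illustrative):

```python
# Why frame generation needs a high base framerate: interpolation must hold
# the newest real frame back roughly one native frame interval, so the added
# input latency is at least one native frame time. Illustrative sketch only.
def added_latency_ms(native_fps: float) -> float:
    """Approximate extra input latency from holding back one real frame."""
    return 1000.0 / native_fps

for fps in (30, 60, 120):
    print(f"{fps:>3} fps base -> ~{added_latency_ms(fps):.1f} ms added latency")
# 30 fps base feels sluggish (~33 ms extra); 120 fps base barely registers (~8 ms).
```

Which is exactly why it works best on cards that already run the game fast, and why quoting frame-generated numbers as raw performance is misleading.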
And if you notice, they have now deliberately made the DLSS naming confusing. I guess they wouldn't want their own users to feel left out, so now every RTX card "supports" DLSS 3.5, but not really, since individual features (like frame generation) are still locked to specific generations.
Of course they downplay RT, because it's not seen as an AMD strength, and I think you know that. There's plenty of reason to celebrate alternative methods that deliver similar results to RT. There's no reason to blame RT for our problems.
I refer back to the above, about RT being too expensive to render for the average gamer. That is a fact. It might theoretically be smart to go for the card that has better RT performance right now, but that is only useful if the RT performance is already viable. We can all agree that the more heavily RT is used, the harder it will be for these cards to run it. So you have to start from where we are today, with RT performance only dropping from here. And most (if not all) RT cards launched on their last legs, if they had legs in the first place. That is why basing your choice on the RT speeds of current cards is not the smartest decision.
You may argue that I'm downplaying RT with this argument, and that is your prerogative. In my view, saying that something is too expensive is not downplaying it. Am I downplaying a Ferrari when I say that a Toyota is enough for me? Imagine how ridiculous it would be if everyone who had a Ferrari went around telling everyone else that they should all buy a Ferrari and that Toyota sucks. Rasterization is still king, and this makes AMD a great value-for-money option in the current market. In a way, nVidia is helping AMD here, but gamers refuse to see it and keep flocking to nVidia, mostly. Some are starting to wake up.
What's the opportunity cost of including RT hardware in chips and RT rendering in games? Any cost argument against RT should be accompanied by an alternative proposal. Flappy said it above: if HWRT is expensive and a poor use of resources, then it should be trivial for IHVs and ISVs to produce competitive results using alternative methods. Otherwise it's just hand-waving.
I wouldn't necessarily say it's a poor use of resources. We have to start somewhere, so I am not against including features in hardware. And it's probably harder to design two completely different chips than to include RT in all of them.
But that doesn't mean that the end user has to buy it.
In practical terms, it doesn't matter whether an RTX 2060 has RT or not, because it can't run it properly anyway. Yet when the 5700XT came out, people argued constantly against buying it because it didn't have RT. They recommended the slower RTX 2060 instead, because on paper it had RT; even though it was unusable in practice, it was somehow "future-proofing". Now THAT is a waste of resources. And that is what we're talking about. It's not about downplaying RT. It's about doing right by the consumer and not deceiving them into buying things they don't need. But of course, back then pointing that out was "downplaying RT" too.
nVidia is good at getting people to buy things they don't need, and apparently their blind followers want to drag the rest down with them. Some of us see through it and want to help people make more conscious choices with their money. It's about RT being overhyped, and anyone seeing through it being labeled a hater or an AMD fanboy, or accused of trying to downplay it.
And oh, one more thing: if you want to use your card for a long time, having an adequate amount of VRAM is much more likely to be an important "feature" than having RT. That's another gripe. Anyone would have been much better off buying a 6800XT 16GB rather than an RTX 3080 10GB in the long term, and prices on the secondary market definitely reflect that (feel free to take a look on eBay). But once again, too many people keep shouting "RT RT RT RT RTX", people get burnt unnecessarily, and they end up upgrading much sooner than they otherwise would, which messes up the whole gaming market for the rest of us.
The adoption of hardware RT has nothing to do with whether hardware is affordable, especially AMD hardware, where the investment in RT transistors has been modest. But yes, enthusiasts, especially on PC, are frustrated in general at the lack of progress in rendering tech and the poor utilization of their expensive hardware. The recent spate of poor showings hasn't helped the situation. I don't think we have a problem with objective interpretation of results. Most people seem to agree with what their eyes are telling them when a AAA game lands with mediocre IQ and performance.
The problem is that actual results are constantly being dismissed in favor of PowerPoint promises.
I have no idea what "lack of progress in rendering tech" you're talking about. Before RT, plenty of techniques were being developed to improve visual quality. We have UE5 with Lumen, which doesn't require hardware RT (it has a software path). We have other new engines being created as well... The progress was always there.
If you're used to buying AMD, poor utilization of the hardware is just another Tuesday, considering that the majority of developers focus only on nVidia. So if you truly are for maximizing the hardware we have, to get the best results possible, supporting AMD would seem to be the most logical thing to do. They have the consoles, they have untapped hardware, and, most importantly, they push for implementations that work across the whole industry.
And it's also kind of a hard pill to swallow, hearing that RT has nothing to do with whether hardware is affordable, when nVidia deliberately used the initial RTX line to bump up prices significantly. Granted, nVidia had been doing that before as well, so... there is that. RT is definitely not the core of the problem, but it is a contributor.