IHV business strategies and consumer choice

Replace Nvidia with AMD and Lumen with RT in that same sentence and you get the other side of the same coin.

None of that matters, of course, to AMD's bottom line. Pretending that RT is just some Nvidia marketing thing that AMD can choose to ignore and still be successful is pure fantasy at this point. The cat's out of the bag and it's not going back in. There have been suggestions from some users here that AMD should actively sabotage the adoption of ray tracing in games. On this forum, of all places. That's simply embarrassing, and it makes any pro-RT fanboyism from Nvidia fans pale in comparison. Clearly those people don't actually care about graphics.

This is a symptom of the whole vibe around AMD’s graphics division of late. Instead of being associated with pushing tech forward their biggest supporters are more concerned with stifling progress so they don’t look bad. And that’s all you need to know to explain their current market position.

Baffling how the so-called holy grail of rendering has been relegated to having a “side”. Imagine if we had sides for 3D vs 2D rendering here based on IHV preference. Utterly stupid.
If you think that locking out previous buyers of your cards from new tech is "progress", I don't know what to tell you.

As for the other side of the same coin, it's never a same-sized coin. nVidia "supporters" (to remain polite) are everywhere and infinitely worse. Saying AMD's are the same would be like comparing the army of the US to the army of Bermuda and pretending that they are equally powerful. Everybody knows that's not the case.

Everyone has been complaining about graphics card prices for the last couple of years, but the real problem is the blind consumers buying overpriced nVidia products. And before you go and blame it on AMD again, saying that they do it too in order to once again excuse nVidia for their nonsense: I think we all agree that nVidia is the one dominating the market, which naturally means that they are the ones who determine market pricing. The only way they can't is if consumers stop being gullible, which this thread has proven is an uphill battle.

At least people are starting to wake up, even though it's too late. The graphics card market is already in ruins, and it's the fault of the blind nVidia buyers, because they like the latest shiny new feature, rather than actually caring about value for money.
 
The idea of AMD "stifling progress" is a narrative that's only shared by their most adamant opponents, who refuse to acknowledge other paths. The industry isn't some ideal vacuum where any one hardware vendor can operate with complete impunity. The industry operates with mutual cooperation between developers and vendors in mind, and if some here feel threatened by AMD in that regard, then it falls on their rival to do a better job of proliferating their so-called "holy grail" to other ISVs besides their closest tech demo partners ...

AMD was able to court Epic Games, one of the biggest ISVs, to help standardize work graphs with Microsoft and better optimize Nanite, despite not having invented the technology themselves. It's easier for the industry to work in unison on a technology to reach its goals, as we see with Nanite, than it is to unilaterally force the industry in a single direction without regard for its requirements, as is the case with AMD's competitor ...
 
AMD was able to court Epic Games, one of the biggest ISVs, to help standardize work graphs with Microsoft and better optimize Nanite, despite not having invented the technology themselves. It's easier for the industry to work in unison on a technology to reach its goals, as we see with Nanite, than it is to unilaterally force the industry in a single direction without regard for its requirements, as is the case with AMD's competitor ...

When will we see the results of the partnership and optimization?
 
I'd describe it as a game with last gen geometry and current/next gen lighting (RT path). Which means it can sometimes look amazing (when the low geometry doesn't get in the way) or it can sometimes look awful (when the RT lighting exposes the low geometry).

Overall I'd put it somewhere between last and current gen, or maybe even current gen if the geometry doesn't bother the viewer with RT on. But then, I've been a proponent of the need for vastly increased geometric complexity in games for over a decade now. I had hopes for tessellation, but that never panned out due to the difficulty of implementation combined with less-than-stellar results when it was actually used in games (Crysis 2, for example).

Nanite might finally give us the geometry that RT-quality lighting desperately (IMO) needs. But then RT hardware needs to get better and significantly faster in order to deal with more complex geometry ... /sigh.

Regards,
SB
But there is hardly any game with current-gen geometry. For a generational jump, I expect a multiple of the polygons per object. Walls would always have to be complex 3D geometry. Which game meets these conditions, with 3D walls? Fortnite?
 
But there is hardly any game with current-gen geometry. For a generational jump, I expect a multiple of the polygons per object. Walls would always have to be complex 3D geometry. Which game meets these conditions, with 3D walls? Fortnite?
Fortnite is probably the only game that pushes geometry to a far higher level. Titles like Forbidden West and R&C do seem to have a notable increase in geometry over Cyberpunk though, which honestly seems to be fairly low poly.
 
If you think that locking out previous buyers of your cards from new tech is "progress", I don't know what to tell you.

As for the other side of the same coin, it's never a same-sized coin. nVidia "supporters" (to remain polite) are everywhere and infinitely worse. Saying AMD's are the same would be like comparing the army of the US to the army of Bermuda and pretending that they are equally powerful. Everybody knows that's not the case.

Everyone has been complaining about graphics card prices for the last couple of years, but the real problem is the blind consumers buying overpriced nVidia products. And before you go and blame it on AMD again, saying that they do it too in order to once again excuse nVidia for their nonsense: I think we all agree that nVidia is the one dominating the market, which naturally means that they are the ones who determine market pricing. The only way they can't is if consumers stop being gullible, which this thread has proven is an uphill battle.

At least people are starting to wake up, even though it's too late. The graphics card market is already in ruins, and it's the fault of the blind nVidia buyers, because they like the latest shiny new feature, rather than actually caring about value for money.

A little heavy handed on the parody even for r/amd, but not bad. Solid 7/10
 
It's still about fanboyism. Whatever Nvidia does is the way, therefore RT is the only way. That's why the same users argue so much against software Lumen in the UE thread. It's part of the agenda to promote Nvidia constantly.
Yes, but due to the latter, I'm not sure 'fanboyism' properly describes what is going on at all. : )
However, what I meant was that the arguments are often valid, and we would discuss them even if there were no relation to IHVs.
Or maybe they argue against s/w Lumen because it is both slower and results in worse image quality than what you get by using h/w RT on GPUs which are capable of running it well. And it's in fact an agenda to constantly downplay h/w RT because its adoption hurts your favorite brand.

See? This game of yours can be played both ways.
No - we can't play it both ways, because there is only one IHV pushing bleeding-edge technology. Other vendors don't make such proposals on software development; they just make hardware. Thus we can't play the same game by claiming AMD's lighting solution is cheaper and delivers higher quality than NV's.
And talking down Epic's software solutions, when Epic actually already offers PT + ReSTIR as well anyway (afaik), does not make any sense at all. If you want to make your UE5 game with PT, you can. So no reason to complain.
I also doubt AMD fans downplay RT just because theirs is slower.

No, I am not excited about RT, because its cost is too high. Years ago I believed RT was the future and every GPU, including entry level, would have it.
But the current reality is different: entry-level GPUs no longer exist at all (looking at their prices). RT adoption is high, but aside from rare exceptions it's just some bolt-on gfx option like shadows / reflections. And GPUs powerful enough to play those PT exceptions at an acceptable compromise are not affordable.
I no longer think it will get better. RT is everywhere, but PT is an enthusiast high-end niche, and it will stay there for a while. Suggesting ALL games should be path traced instead of using X is just bullshit.
(I'll take everything back if the Switch 2 comes with path-traced games.)

That's what it is, and because it creates a two-class society within the formerly united gaming community, we now have a bit of a problem dealing with this new situation.
But the problem goes away once we accept that this two-class society exists,
or if we can develop software able to gradually blend the two extremes, which I personally think is not yet possible due to missing RT API flexibility.

Baffling how the so-called holy grail of rendering has been relegated to having a “side”. Imagine if we had sides for 3D vs 2D rendering here based on IHV preference. Utterly stupid.
Define the 'holy grail of rendering'. It has been many things over time: hidden surface removal, LOD, lighting. It can be anything.
But there are indeed people who think 2D Super Mario is a better game than 3D CP2077. I'm one of them. And I neither want nor need a path-traced Super Mario, so if I have to pay 1000 bucks for a huge Steam Machine just to be able to play the next Super Mario, I simply won't play it.

The times when gfx always got better for free are over. Thus people start to draw their line: 'It's good enough at this point - I won't pay to go further for diminishing returns.' What's so hard about accepting their decision?
If you want more, you can still request it, and you'll get it. But because you enter niche enthusiast territory, you'll pay even more. Simple economics.

Look at it this way: for the first time, gfx enthusiasts get their own high-end versions of games. Your awesome GPU not only gives you more frames, but an entirely different and better image.
What's bad about that? It is what enthusiasts want.
I think I know what's bad about it:
Enthusiasts are afraid that the affordable mainstream will block the creation of high-end content. (But that's not the case, since part of the money you pay for your GPU at least indirectly goes into pushing the creation of that content.)
Low-cost mainstream gamers are afraid they might no longer be able to afford their hobby. (But that's not the case either - the games industry won't ignore their budgets and interests.)

So what should we do? Divide forums into enthusiast and mainstream sections? Or just keep going, accepting some friction, until our fears turn out to be unfounded?
Doesn't matter, I guess. Let's vote with our wallets; that's all we can do. The games industry is large enough that it can and wants to support niches. We get what we demand.

Yup, if we actually focused on results and not philosophy these discussions would be a lot more productive.
Haha, obviously our inability to deal with varying philosophical / ideological mindsets is exactly what prevents us from interpreting results objectively. Only time will cure that.
 
Yes, but due to the latter, I'm not sure 'fanboyism' properly describes what is going on at all. : )
However, what I meant was that the arguments are often valid, and we would discuss them even if there were no relation to IHVs.

The only reasonable argument against hardware RT was that transistors could/should have been better spent elsewhere back in 2018 with higher ROI. That theory has completely collapsed now that we have seen the best of what's feasible without RT. Maybe there are games where inclusion of RT has made the game objectively worse but I can't think of any examples. However, there are tons of examples where lack of RT has clear consequences.

So the only thing left to complain about is "RT = Nvidia and we don't like Nvidia", or consequently "AMD has struggled with RT and we love AMD, so therefore RT is bad".

I also doubt AMD fans downplay RT just because theirs is slower.

Is that why the criticism of DLSS and frame generation evaporated overnight after AMD started doing the same thing? Of course they downplay RT because it's not seen as an AMD strength and I think you know that. There's plenty of reason to celebrate alternative methods that deliver similar results to RT. There is no reason to blame RT for our problems.

No, I am not excited about RT, because its cost is too high.

What's the opportunity cost of including RT hardware in chips and RT rendering in games? Any cost argument against RT should be accompanied by an alternative proposal. Flappy said it above. If HWRT is expensive and a poor use of resources then it should be trivial for IHVs and ISVs to produce competitive results using alternative methods. Otherwise it's just hand waving.

Enthusiasts are afraid that the affordable mainstream will block the creation of high-end content.

Haha, obviously our inability to deal with varying philosophical / ideological mindsets is exactly what prevents us from interpreting results objectively. Only time will cure that.

The adoption of hardware RT has nothing to do with whether hardware is affordable, especially AMD hardware where the investment in RT transistors has been modest. But yes, enthusiasts especially on PC are frustrated in general at the lack of progress in rendering tech and poor utilization of their expensive hardware. The recent spate of poor showings hasn't helped the situation. I don't think we have a problem with objective interpretation of results. Most people seem to agree with what their eyes are telling them when a AAA game lands with mediocre IQ and performance.

The problem is that actual results are constantly being dismissed in favor of PowerPoint promises.
 
Moved the comparison out of the AMD thread into its own discussion. A discussion that's likely going to go to shit, as people won't keep their emotions in check and will get all personal over arrangements of silicon atoms. But here it is: a chance to compare business strategies and market performance across IHVs without bringing that into the IHV news and tech threads.

Point of interest: AFAIK there's currently only one active moderator on this forum. If you can't self-moderate, I'm happy to walk and let this place go to hell. Play nice. ;)
 
The only reasonable argument against hardware RT was that transistors could/should have been better spent elsewhere back in 2018 with higher ROI. That theory has completely collapsed now that we have seen the best of what's feasible without RT.
No. Because of the economic crisis we are still in the same state as in 2018. GPUs are now crazy expensive, but all that's changed is the new RT feature. That's a very bad start, creating the impression that RT is the reason for the higher costs.
On the other hand, the best of what's feasible runs well only on high-end GPUs. Better said, it barely runs, relying on a lot of crutches like upscaling and interpolation, plus physical crutches to hold the weight of your ridiculously large and heavy HW.
So my impression is that the cost is too high, that it is and remains an optional feature, and that it simply isn't worth it just to burn some spare time playing silly video games.

Both our perspectives make sense, and neither can be debunked by the other.
So the only thing left to complain about is "RT = Nvidia and we don't like Nvidia", or consequently "AMD has struggled with RT and we love AMD, so therefore RT is bad".
I can't tell how many think that way. I don't. But I feel that's rather an assumption NV fanboys make up to avoid an objective focus on real arguments. A defense which isn't needed or helpful.

Is that why the criticism of DLSS and frame generation evaporated overnight after AMD started doing the same thing? Of course they downplay RT because it's not seen as an AMD strength and I think you know that.
A similar topic. So if you insist, I guess there are indeed AMD fanboys who act that way, and for the reasons you speculate. But what I said above likely applies just as much.
If you ask me about my favorite chipmaker, it's AMD, so maybe I'm a fanboy. But I can't say whether fanboyism is a potential cause of the ideological differences, nor can I tell if there's any sponsorship / shilling involved beyond that.
I do not care, because this does not really matter imo. The arguments - if valid - matter. A lot of water will flow down the river before we converge on anything.
What's the opportunity cost of including RT hardware in chips and RT rendering in games? Any cost argument against RT should be accompanied by an alternative proposal. Flappy said it above. If HWRT is expensive and a poor use of resources then it should be trivial for IHVs and ISVs to produce competitive results using alternative methods. Otherwise it's just hand waving.
It's not hand waving. We are talking about technologies which are under constant development. So we can't say X is, was, and always will be better than Y.
Even if we focus just on the present day, we don't have enough PT and Lumen games to draw solid conclusions. IMO, both are bleeding edge, and neither is sufficiently efficient or future-proof in its current state. If you ask me which is better, you're asking me to pick the lesser evil.
As far as I can predict the future, I assume PT practices will become widely used and standard (though still eventually optional), while not much of Lumen's current methods will stay.
But what we really want are methods which can scale down to lower-power HW. And we want to use the same content on any platform too. The GI solutions we have seen so far are not there yet, so there is no point in declaring a winner already. And ideally we keep moving and that day never comes.

Regarding NV RT vs. AMD RT, it is clear AMD has to improve theirs. But I'm fine with AMD's tempo in getting there. From my perspective, HW RT is currently useless because it can't do LOD, which is mandatory for efficiency. Thus if AMD spends more area on HW traversal, I'm both happy and sad about it.
But yes, enthusiasts especially on PC are frustrated in general at the lack of progress in rendering tech and poor utilization of their expensive hardware.
Yeah, and we have the same problem on the other end too. The range of HW devs want to support is quickly growing at both ends. That's a really hard problem, and I'm not sure if we can solve it with scalable software.
If not, what we need is a finer fragmentation of the market into more performance tiers. Which may not be practical for the way current AAA game production works. Though I would not mind AAA studios splitting up into smaller AA studios to target niches. It would only increase our chances of getting a game we really like.

Most people seem to agree with what their eyes are telling them when a AAA game lands with mediocre IQ and performance.
People's disappointment is to be expected currently, it seems. I'm mostly disappointed too, but rarely about performance. I'm just tired of the standard AAA recipes, and of them assuming I'm a dumb teen willing to sink 200h into a silly game and caring about cosmetics.
That's why I personally hope for fragmentation and targeting niches instead. I'm fine with smaller games, and indie actually serves me way better.

But such a discussion goes way beyond gfx, or how much of a game changer current RT really is. This feels like a small issue, mainly interesting to us tech nerds. But I propose the same solution: target smaller audiences, to give them what they really want.
 
No. Because of the economic crisis we are still in the same state as in 2018. GPUs are now crazy expensive, but all that's changed is the new RT feature. That's a very bad start, creating the impression that RT is the reason for the higher costs.
On the other hand, the best of what's feasible runs well only on high-end GPUs. Better said, it barely runs, relying on a lot of crutches like upscaling and interpolation, plus physical crutches to hold the weight of your ridiculously large and heavy HW.
So my impression is that the cost is too high, that it is and remains an optional feature, and that it simply isn't worth it just to burn some spare time playing silly video games.

If RT is too expensive and the alternatives are also too expensive, where does that leave us? I'm sure many people will be happy to get better games running with last-generation rendering tech, but that's not what this debate is about.

A similar topic. So if you insist, I guess there are indeed AMD fanboys who act that way, and for the reasons you speculate. But what I said above likely applies just as much.
If you ask me about my favorite chipmaker, it's AMD, so maybe I'm a fanboy. But I can't say whether fanboyism is a potential cause of the ideological differences, nor can I tell if there's any sponsorship / shilling involved beyond that.

If that's true the argument needs to be coherent and offer real, tangible alternatives. In the absence of that the default conclusion is IHV favoritism / hate.

It's not hand waving. We are talking about technologies which are under constant development. So we can't say X is, was, and always will be better than Y.

No, that doesn't fly any more. For 5 years we've been hearing that HWRT is too early and a waste of resources, and that something cheaper and better is right around the corner. How long do we need to wait for proof? 5 more years? 10?

But what we really want are methods which can scale down to lower-power HW. And we want to use the same content on any platform too. The GI solutions we have seen so far are not there yet, so there is no point in declaring a winner already. And ideally we keep moving and that day never comes.

Regarding NV RT vs. AMD RT, it is clear AMD has to improve theirs. But I'm fine with AMD's tempo in getting there. From my perspective, HW RT is currently useless because it can't do LOD, which is mandatory for efficiency. Thus if AMD spends more area on HW traversal, I'm both happy and sad about it.

Well that's where we fundamentally disagree. You think the incremental progress is useless (even though nobody thought it would be remotely possible). I think it's amazing. We already have GI solutions that scale down to lower power HW - baked light maps. Dynamic GI is a high end feature and there's nothing wrong with that. With time it will trickle down.

But such a discussion goes way beyond gfx, or how much of a game changer current RT really is. This feels like a small issue, mainly interesting to us tech nerds.

Right, RT has nothing to do with gameplay innovation for good or bad. That's a whole other topic. Good thing us tech nerds have a whole forum dedicated to discussing the small issue of 3D rendering :)
 
The idea that only nVidia is pushing tech forward is pure nonsense.

Who came up with the unified shader architecture?
Who came up with tessellation first?
Who had DX12 support first?
Who invented HBM?
Who invented Vulkan?
Who came up with async compute first?
Who had chiplets first?
Who actually has a user interface that doesn't appear to be from the 2000s?

AMD's contributions and innovations are many. Just because they aren't shiny and shoved down your throat doesn't mean AMD isn't innovating or driving tech forward. In a way, AMD is doing it more than nVidia, for the simple fact that they push the whole industry forward rather than closing off technology at the expense of everyone else. nVidia is the Apple of graphics, and that is not a compliment.
 
The only reasonable argument against hardware RT was that transistors could/should have been better spent elsewhere back in 2018 with higher ROI. That theory has completely collapsed now that we have seen the best of what's feasible without RT. Maybe there are games where inclusion of RT has made the game objectively worse but I can't think of any examples. However, there are tons of examples where lack of RT has clear consequences.
Literally for the majority of games and for the majority of people, turning on RT is not worth it. Either the RT is light enough that the visual difference is negligible, or the visual difference is striking, but performance tanks. There is no in-between, at least, not yet.
It's also funny to me that first the 3090 was considered the holy grail of RT, and that even the RTX 2060 was argued to be viable when it came out. When the 7800XT and 7900XTX came out, which offer pretty much similar RT to a 3090, suddenly only the RT performance of the 4090 is somehow viable. It seems funny to me, considering everyone is always arguing in favor of RT and its viability for the whole stack of nVidia cards. But when it comes to AMD, they have to be better than the absolute top. It suddenly doesn't matter if they have cards that perform better at RT than an RTX 2060, 3060, 4060, or 4070.
That alone makes it clear that the whole discussion around RT is not an honest conversation. It is purely to inflate the egos of the ones that like nVidia and to impose their views onto everyone else.

Also, I'd love to hear where the lack of RT has consequences. There are zero games where RT actually influences gameplay. And in the majority of games, even though the visuals are better, they often still don't justify the performance hit. The main exceptions seem to be old games like Quake and Tomb Raider, i.e. late 90s & pre-2010 games.

So the only thing left to complain about is "RT = Nvidia and we don't like Nvidia", or consequently "AMD has struggled with RT and we love AMD, so therefore RT is bad".
Not at all. It is still a fact that RT is expensive to render. nVidia has struggled (and arguably still is struggling) with RT too. Why do you think they invented DLSS? It was created specifically to push RT. Things turned out a bit differently, where people love to use DLSS on its own rather than with RT, because you know, it's actually useful. We see it clearly in Starfield for example. The game doesn't have RT, but everyone is crying to have DLSS in it. Did you hear people complaining about the lack of RT? If there are, they aren't many, especially with how heavy the game already is without it.
Just because nVidia is at the moment faster with RT, doesn't mean that they are good at it. No card is good enough at it today. Unless you want to pay 4 figures for a smooth 1080p RT experience, which I don't think many would. That price is normally reserved for 4K performance.

Is that why the criticism of DLSS and frame generation evaporated overnight after AMD started doing the same thing?
It's easy to argue things when you put them in a vacuum. DLSS literally sucked when it came out, because rendering at an 80% lower resolution on any graphics card, without any other type of processing or effect, gave better image quality than the original DLSS. And once again, it was clear that DLSS was only there to push the primary tech, RT. Remember too that this was the 2000 series, where for example the RTX 2080 was ~20% faster than the GTX 1080 while costing 40% more (i.e. terrible value over the previous gen), and RT was used as the reason for this. DLSS had to be used in conjunction with it to give any sort of respectable framerates, but it degraded image quality too much. It was bashed, and for good reason.

Then came DLSS2, which was actually usable. The complaints regarding DLSS didn't stop because AMD came out with FSR. They stopped because it actually became useful after necessary improvements due to community backlash. So once again, a wrong narrative is being painted in support of green, and once again it's not an honest conversation.

As for frame generation, there's still enough criticism of it, and justifiably so. It increases latency, i.e. you need a high framerate to use it in the first place. Not to mention that it's used deceitfully in marketing material. But the most important criticism is that it's limited to the 4000 series. If AMD can allegedly make it work on everything, including cards that are not their own, why can't nVidia make their version work on at least their 3000 series? Even if it's an inferior version, there's no reason they couldn't do it. It's likely that it was simply a tactic to once again try and push their own customers to upgrade, keeping them on the hamster wheel. I admit that it's speculation, but considering nVidia's track record, it's probably true.
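To put rough numbers on the latency point: interpolation-based frame generation has to hold back a rendered frame so it can blend between two of them, so the added delay is roughly one native frame-time plus whatever the generation step itself costs. A minimal back-of-the-envelope sketch, where the 3 ms overhead is an illustrative assumption rather than a measured value:

```python
# Back-of-the-envelope latency cost of interpolation-based frame generation.
# Assumption: the generated frame can only be shown once the *next* native frame
# exists, so roughly one native frame-time of delay is added, plus a small
# generation overhead (the 3 ms default below is illustrative, not measured).

def added_latency_ms(native_fps: float, gen_overhead_ms: float = 3.0) -> float:
    native_frame_time_ms = 1000.0 / native_fps
    return native_frame_time_ms + gen_overhead_ms

for fps in (30, 60, 120):
    print(f"{fps:>3} fps native -> ~{added_latency_ms(fps):.1f} ms extra latency")
# 30 fps -> ~36.3 ms, 60 fps -> ~19.7 ms, 120 fps -> ~11.3 ms:
# the lower the base framerate, the bigger the penalty, which is why
# a high framerate is wanted before enabling it.
```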
And if you notice, they have now deliberately made the DLSS naming confusing. I guess they wouldn't want their own users to feel left out, so now, every RTX card supports DLSS 3.5, but not really, since features may be missing.

Of course they downplay RT because it's not seen as an AMD strength and I think you know that. There's plenty of reason to celebrate alternative methods that deliver similar results to RT. There is no reason to blame RT for our problems.
I refer back to the above, about RT being too expensive to render for the average gamer. That is a fact. It might theoretically be smart to go for the card right now that has better RT performance, but that is only useful if the RT performance is already viable. We can all agree that the more RT is used, the harder it will be for these cards to run it. So you have to start from where we are now, with their RT performance only dropping from here. And most (if not all) RT cards launched on their last legs, if they had legs in the first place. That is why basing your choice on the RT speeds of current cards is not the smartest decision.

You may argue that I'm downplaying RT with this argument, and that is your prerogative. In my view, saying that something is too expensive is not downplaying it. Am I downplaying a Ferrari when I say that a Toyota is enough for me? Imagine how ridiculous it would be if everyone who had a Ferrari went around telling everyone that they should all buy a Ferrari and that Toyota sucks. Rasterization is still king, and this makes AMD a great value-for-money option in the current market. In a way, nVidia is helping AMD here, but gamers refuse to see it and keep flocking to nVidia, mostly. Some are starting to wake up.

What's the opportunity cost of including RT hardware in chips and RT rendering in games? Any cost argument against RT should be accompanied by an alternative proposal. Flappy said it above. If HWRT is expensive and a poor use of resources then it should be trivial for IHVs and ISVs to produce competitive results using alternative methods. Otherwise it's just hand waving.
I wouldn't necessarily say it's a poor use of resources. We have to start somewhere, so I am not against including features in hardware. And it's probably harder to design two completely different chips than to include RT in all of them. But that doesn't mean that the end user has to buy it.
In practical terms, it doesn't matter if an RTX 2060 has RT or not, because it can't run it properly anyway. Yet, when the 5700XT came out, people argued constantly against buying it, because it didn't have RT. They recommended buying the slower RTX 2060 instead, because on paper it had RT; even though it was unusable in practice, it was somehow "future-proofing". Now THAT is a waste of resources. And that is what we're talking about. It's not about downplaying RT. It's about doing right by the consumer and not deceiving them into buying things they don't need. But you know, saying so back then meant you were "downplaying RT" too.
nVidia is good at getting people to buy things they don't need, and apparently their blind followers want to bring down the rest with them. Some of us see through it and want to help people make more conscious choices with their money. It's more that RT is being overhyped, and anyone seeing through it is labeled a hater or an AMD fanboy, or accused of trying to downplay it.

And oh, one more thing. If you want to use your card for a long time, having an adequate amount of VRAM is likely going to be a much more important "feature" than having RT. That's another gripe. Anyone would have been much better off buying a 6800XT 16GB rather than an RTX 3080 10GB in the long term. And prices on the secondary market definitely reflect that (feel free to take a look on eBay). But once again, there are too many people shouting "RT RT RT RT RT RTX RT RT", and people get burnt unnecessarily, making them upgrade much sooner than they otherwise would and messing up the whole gaming market for the rest of us.

The adoption of hardware RT has nothing to do with whether hardware is affordable, especially AMD hardware where the investment in RT transistors has been modest. But yes, enthusiasts especially on PC are frustrated in general at the lack of progress in rendering tech and poor utilization of their expensive hardware. The recent spate of poor showings hasn't helped the situation. I don't think we have a problem with objective interpretation of results. Most people seem to agree with what their eyes are telling them when a AAA game lands with mediocre IQ and performance.

The problem is that actual results are constantly being dismissed in favor of PowerPoint promises.
I have no idea what "lack of progress in rendering tech" you're talking about. Before RT, a lot of things were being developed to improve visual quality. And we have UE5 with Lumen, which doesn't need hardware RT. We have other newer engines being created also... So the progress was always there.

If you're used to buying AMD, poor utilization of the hardware is just another Tuesday, considering that the majority of developers focus only on nVidia. So if you truly are for maximizing the hardware that we have, to get the best results possible, supporting AMD would seem to be the most logical thing to do. They have the consoles, they have untapped hardware, but most importantly, they push for implementations that work across the whole industry.

And it's also kind of a hard pill to swallow to say that RT has nothing to do with whether hardware is affordable, when nVidia deliberately used the initial RTX line to bump up prices significantly. But nVidia had been doing that before as well, so... there is that. RT is definitely not the core of the problem, but it is a contributor.
 
Fortnite is probably the only game that pushes geometry to a far higher level. Titles like Forbidden West and R&C do seem to have a notable increase in geometry over Cyberpunk though, which honestly seems to be fairly low poly.
Horizon, yes, but I don't see that much geometry in Ratchet & Clank, especially when you deviate from the most commonly shown level. I'd rather point to Star Wars Jedi: Survivor. That game could be a draw-call tech demo, because it often reminds me of a level of detail otherwise only seen with Nanite.

I always liked tessellation too. Games like Ghost Recon Breakpoint or Dark Souls Remastered benefited enormously from it.

What we also need are more games with detailed hair, like Lara Croft's PureHair from Tomb Raider. However, there seems to be a problem when many characters have such detailed hair.

Literally for the majority of games and for the majority of people, turning on RT is not worth it. Either the RT is light enough that the visual difference is negligible, or the visual difference is striking, but performance tanks. There is no in-between, at least, not yet.
It's also funny to me that first the 3090 was considered the holy grail of RT, and that even the RTX 2060 was argued to be viable when it came out. When the 7800XT and 7900XTX came out, which offer pretty much similar RT to a 3090, suddenly only the RT performance of the 4090 is somehow viable. It seems funny to me, considering everyone is always arguing in favor of RT and its viability for the whole stack of nVidia cards. But when it comes to AMD, they have to be better than the absolute top. It suddenly doesn't matter if they have cards that perform better at RT than an RTX 2060, 3060, 4060, or 4070.
That alone makes it clear that the whole discussion around RT is not an honest conversation. It is purely to inflate the egos of the ones that like nVidia and to impose their views onto everyone else.

Also, I'd love to hear where the lack of RT has consequences. There are zero games where RT actually influences gameplay. And in the majority of games, even though the visuals are better, they often still don't justify the performance hit. The main exceptions seem to be old games like Quake and Tomb Raider, i.e. late 90s & pre-2010 games.

1. RTX 3090 was released in 2020. The 7900XTX came over 24 months later. I would say that you get very good ray tracing starting with RTX 4070/RTX 3080 level of performance. Then you can play demanding games like Control at 60 fps with good image quality.

2. Whether they justify the performance hit is not for you to decide. For me, it is almost always worth it. It is immediately visible.

3. Most graphic effects do not affect the gameplay. So what? Ray tracing looks damn good when combined with modern materials/assets. I'd much rather watch a nice movie like Blade Runner 2049 or Dune than one that looks worse even if the story/content is identical.
 
1. RTX 3090 was released in 2020. The 7900XTX came over 24 months later.
Does this statement have an actual point, or is this the standard e-peen waving we have seen time and time again? I guess we simply dismiss that it was $500 cheaper with regard to MSRP, and the difference is much bigger considering that the 3090's price peaked at over $3k.

I would say that you get very good ray tracing starting with RTX 4070 level of performance. Then you can play demanding games like Control with it at 60 fps and good image quality.
I guess that means that the 7900XT and XTX cards are good enough at RT as per your own metric. But I never see anyone recommending these cards, in spite of their decent RT and, more importantly, their larger VRAM and superior raster performance for the price.

2. Whether they justify the performance hit is not for you to decide.
I guess that privilege is reserved exclusively for the ones who like nVidia and RT, right? They get to dictate to everyone else what they should like and that nobody can live without RT.🤷‍♂️

For me, it is almost always worth it. It is immediately visible.
As I mentioned before, the fact that everyone is crying for DLSS in Starfield and not RT says a lot. Or maybe it has something to do with the fact that AMD has RT too, but not DLSS, but that's another story.

3. Most graphic effects do not affect the gameplay. So what? Ray tracing looks damn good when combined with modern materials/assets. I'd much rather watch a nice movie like Blade Runner 2049 or Dune than one that looks worse even if the story/content is identical.
Everyone would. But would they do it if the ticket for the same movie was double, triple or quadruple the price depending on the movie...?
 
The raster performance of these two AMD cards is good enough to play RT-heavy games. However, because of Nvidia's DLSS advantage, AMD cards often need much more raw performance to reach the same visual output. DLSS Performance often looks better than FSR Quality, which gives the GeForce cards a huge advantage as long as the game provides DLSS. XeSS can also reach an image quality similar to DLSS, but I don't know how well it runs on AMD.
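For rough context on why that comparison matters: with the per-axis scale factors these upscalers are commonly said to use (about 0.67 for Quality and 0.5 for Performance, an assumption about typical presets rather than vendor-exact numbers), Performance mode only shades around 56% of the pixels that Quality mode does at the same output resolution. A quick sketch:

```python
# Internal render resolutions implied by the common upscaler presets at a 4K
# output. The per-axis scale factors below (Quality ~0.667, Balanced ~0.58,
# Performance 0.5) are the widely cited defaults, assumed here rather than
# taken from any vendor documentation.

OUTPUT_W, OUTPUT_H = 3840, 2160
PRESETS = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

for name, scale in PRESETS.items():
    w, h = int(OUTPUT_W * scale), int(OUTPUT_H * scale)
    print(f"{name:<11} -> {w}x{h} ({w * h / 1e6:.1f} MPix)")

# Quality     -> 2560x1440 (3.7 MPix)
# Balanced    -> 2227x1252 (2.8 MPix)
# Performance -> 1920x1080 (2.1 MPix)
# "DLSS Performance vs FSR Quality" therefore means reconstructing from ~44%
# fewer shaded pixels while matching or beating the image, hence the big win.
```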
 
Who came up with the unified shader architecture?
On PC? NVIDIA.
Who came up with tessellation first?
NVIDIA was the one with the most powerful Tessellation engine for generations; they pushed it hard while AMD was content to serve it in a superficial manner.

Who invented HBM?
Who had chiplets first?
Irrelevant tech to consumers; consumers don't get anything from HBM or chiplets. These technologies didn't add anything of value to PC gamers.

Who had DX12 support first?
Who invented Vulkan?
These are APIs, not inventions.

If we are talking about impact on PC graphics, then AMD's impact on PC gamers pales in comparison to NVIDIA's. This is the impact of NVIDIA on the PC market:

First to make the GPU and hardware T&L (GeForce 256)
First to make Programmable Shaders (Geforce 3)
First to make Unified Shaders on PC with superb performance (Geforce 8800Ultra)
Popularized GPU Physics.
First to do advanced Hair and Fur rendering (HairWorks)
Standardized GPU Compute (CUDA)
Standardized GPU AI acceleration (cuDNN)
Most powerful Tessellation engine (Fermi)
First to make Variable Refresh Rate (G-Sync)
First to make temporal AA (TXAA)
Advanced Ambient Occlusion (HBAO and VXAO)
First to implement an automatic photo mode for dozens of games (Ansel)
First to introduce recording during gameplay with minimal impact (ShadowPlay)
First to introduce Driver Downsampling (DSR)
Ray Tracing?
AI Upscaling?
AI Anti Aliasing (DLAA/DLDSR)?
AI Denoising (Ray Reconstruction)
Frame Generation?
Path Tracing?
Remixing games with Ray/Path Tracing (RTX Remix)
Latency Reduction (Reflex)
The highest VR performance and features, and the most consistent.
First to introduce Integer Scaling
The first and only to allow modifications of game visuals through driver side post processing (FreeStyle)
The first and only to offer advanced driver side screenshot capabilities such as Green Screen and AI Up Res.
First with AI Audio Noise Cancelation, and Video Background Replacement.

7900XTX came out, which offers pretty much similar RT to a 3090
The 7900XTX is only similar to a 3090 in moderate ray tracing; in heavy ray tracing and path tracing it's still far behind.
 
No - we can't play it both ways, because there is only one IHV pushing bleeding-edge technology.
This is factually incorrect, between Intel being the first with ray coherence sorting and AMD doing lots of stuff with D3D12 advancements like GPU work graphs.
So yes, we definitely can play it both ways and say that AMD "fans" are just as toxic to any discussion here as any other "fans".
I'd even say that they are way more toxic, because they tend to just dismiss anything which they don't like - RT is a good example of how it suddenly became unimportant and irrelevant because... reasons.
 