Yeah, I respect that if you want ultra-high frame rates at high resolutions above all else, then RT might be a non-starter on any card. But it seems to me a strange compromise to chase crazy-high framerates and very high native res (especially where upscaling is available) on a 50-60TF, $900+ GPU while the core graphics look worse than what's available on a $300, 4TF Xbox Series S - which would certainly be the case in some titles without RT.
I'm not sure what that means - could you give an example of a game where this could possibly be the case?
Like, the 7900 is not a non-RT GPU; it's just a GPU whose RT performance will likely only match a two-year-old Ampere 3080. That's definitely disappointing, but it's still far, far above the RT performance of the Series S, Series X and PS5. The main advantage of the PC is choice: you're not locked into what the developer personally felt were the right compromises for a particular framerate. Nothing released within this console generation is going to run well only on Ada, as that would be financial suicide for the developer.
The argument isn't that "RT is worthless and has no future" - the argument is that the sacrifices it demands in performance and resolution are too great for the small improvements it brings in many current titles, at least against the framerate standards that have risen considerably in the past 5 years. You can mitigate those, sure - but the actual cost in $ is far too high for the majority of PC gamers.
If the 4070-whatever comes out in 2023 in the same price ballpark as the 7900, with a 20% raster disadvantage but a 100% RT advantage, then you can definitely make the argument that it's a shortsighted decision to go with Radeon - like it would have been when Ampere was introduced and the 6800/6900 series were priced almost identically, which explains why Nvidia completely took over in market share (that, and in many games you could argue it was also much faster in raster due to DLSS). We'll see with actual benchmarks, I guess; I don't think the 4080 will necessarily put a cork in this argument though.
E-sports players, for whom high frame rates are essential, are a clear exception to this of course. But putting frame rate aside, if I care enough about graphics to insist on running at 4K native, yet a Series S is pushing better core graphics than my setup outside of resolution, I'm not sure that would sit right. If I'm going to spend that much on a GPU, I want better frame rate, better image quality and better graphics than a $300 or even a $500 console. To me, that's why RT performance is important. Without it, I can potentially only have 2 of those things, not all 3, which isn't great if I'm spending $900+ on a GPU alone.
You get that with a 7900 series card though, in spades. And you can get it without RT as well - ports like God of War, Days Gone etc. have significant raster improvements even outside of framerate/resolution. Again, I don't think anyone is arguing that there is no point to ray tracing at all, just that the cost of a product that can run full-RT games at the resolutions/framerates many gamers want is too high atm, so the sacrifice in that area of performance may be more acceptable if the other 2 can be met at a more reasonable price point.
I hope you're wrong, but after this launch I'm starting to suspect you're right. Unfortunately, I'm not sure PC gaming can survive this pricing structure if we're going to have to spend nearly double the price of a console, 2 years after its launch, on the GPU alone just to get something that's significantly (as in 2x or more) faster.
It is definitely a concern of mine. I can appreciate the technology in something like a 4090, and it's good there's an option out there for people who value that unique experience. But from the perspective of what grows the PC gaming base - which in turn puts more pressure on publishers to take better care of their ports - it's definitely worrying.