Should full scene RT be deprioritised until RT solutions are faster? *spawn

This is such a good point. Not only does it remove anxiety for the end user who fears they're missing out on some amazing IQ improvement going from ultra to medium, it also encourages developers to be more thoughtful about the settings they ship. Maybe you don't include an ultra setting for some feature where the IQ improvement is imperceptible, because you know your settings menu preview will make that obvious.

I am a huge fan of unobtainium settings in games because by the time I get around to playing the game, it's no longer unobtainium. But you've got to feed that ultra-settings ego trip.
The issue is very much multidimensional. It also has a lot to do with how GPU manufacturers have been promoting their RT-enabled cards at specific prices and specs, combined with how game companies were trying to showcase their most visually impressive games ever on those cards.

Once the RTX cards hit the market, for example, NVIDIA withdrew its high-end GTX cards, which were pretty powerful at rasterization and very competitive with the RTX cards with RT off. The 1080 Ti, for example, could beat some of these cards with RT off since it was fast and had more VRAM.

The RTX cards were sold at high-end prices and could do RT, unlike the previous cards, but they weren't clear wins in every spec, nor always powerful enough to hit the target specs devs had in mind for gamers to reach the desired performance.

For example, you bought your awesome new RTX card with a premium price tag to enjoy RT visuals, but VRAM size didn't see the same jump. Meanwhile, the VRAM requirements of many games jumped much higher.

So on one hand you had these cards that were fast at RT but didn't have enough VRAM to run games butter-smooth and without hiccups. A lot of gamers were hyped up by marketing material, from both GPU manufacturers and game companies, about the perfect next-gen visuals offered by RT graphics and the awesome performance of these expensive, super-powerful cards we had never seen before. But in reality they were caught in a strange situation: they got RT (the reason these cards hit the market in the first place), but the cards lacked somewhere else to offer that experience smoothly. So they started lowering settings with RT on, or turning RT off to have some other settings higher. It's a situation where you are convinced you can have your cake and fully eat it, but you actually can't.

Or it's like being marketed the fastest luxury supersport car ever developed, with Rocket Turbo tech perfect for 418 miles per hour on a highway, but once you buy it you discover the road it's driven on is bumpy, filled with debris, traffic lights, and roadblocks, and the experience is far from the marketing material. The tech is still awesome, but tough luck. And if you want to fully eat your cake and have that marketed experience, there is an even more expensive card that can traverse those roads too.
 
Once the RTX cards hit the market, for example, NVIDIA withdrew its high-end GTX cards, which were pretty powerful at rasterization and very competitive with the RTX cards with RT off. The 1080 Ti, for example, could beat some of these cards with RT off since it was fast and had more VRAM.

The 2080 Ti was faster than the 1080 Ti at almost everything (it had more GFLOPS, more memory bandwidth, and a higher texture rate, with only slightly less pixel fillrate, which hardly matters). Even the 2080 was on par with, or slightly faster than, the 1080 Ti.
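For a rough sense of the gap, here's a quick back-of-the-envelope comparison using approximate reference-card spec figures (ballpark numbers from public spec sheets, so treat them as illustrative rather than exact):

```python
# Back-of-the-envelope comparison: approximate reference-card specs only
# (FP32 TFLOPS, memory bandwidth GB/s, texture rate GT/s, pixel fillrate GP/s).
specs = {
    "GTX 1080 Ti": (11.3, 484, 354, 139),
    "RTX 2080 Ti": (13.4, 616, 420, 136),
}

labels = ("FP32 TFLOPS", "Bandwidth", "Texture rate", "Pixel fillrate")
for label, old, new in zip(labels, specs["GTX 1080 Ti"], specs["RTX 2080 Ti"]):
    print(f"{label:14s}: {new / old - 1:+.0%}")
# Prints roughly +19%, +27%, +19%, -2% -- everything up except pixel fillrate.
```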
 
I find it quite amusing, the defensive/intense responses and assumptions, followed by "essays", for any statement that may remotely "offend" a certain belief.
You didn't offend anyone. Or at least, you didn't offend me. As it turns out, I can have different opinions than you -- and so can others! This isn't about being "woke" or "offended" or having an "agenda"; it's a thoughtful response to your one-line retort.

I never stated that running current games with RT features is the equivalent of "can you run games".
Then explain your sentence that I quoted.
I am stating that Crysis is an extreme case and it wasn't the norm.
If you look at the posts upstream, you'll see that it wasn't an extreme case; it turns out it WAS the norm. Plenty of evidence has been posted that new tech has always punished even new hardware.

This makes it seem like your opinion, while you're entitled to it, does not reflect observable reality. And if that offends you, then that's a you problem.

False equivalency.
Nope.

Just because you bought the very most expensive video card and it can't go as fast as you think it should, doesn't mean the same doesn't hold for other things you can purchase. At some point, unlimited money doesn't guarantee every possible result.
 
I disagree with your interpretation of your evidence and with your insistence on interpreting my post the way you want to just because it doesn't align with what you want to hear, and I am leaving it at that. Move on.
 
The 2080 Ti was faster than the 1080 Ti at almost everything (it had more GFLOPS, more memory bandwidth, and a higher texture rate, with only slightly less pixel fillrate, which hardly matters). Even the 2080 was on par with, or slightly faster than, the 1080 Ti.
Only if you want to use the 2080 Ti as an example, which launched at $999.
 
Nope.

Just because you bought the very most expensive video card and it can't go as fast as you think it should, doesn't mean the same doesn't hold for other things you can purchase. At some point, unlimited money doesn't guarantee every possible result.
You said, "You don't buy a 3 million dollars car and expect it to break the sound barrier". It's a false equivalency because no one buys a 3-million-dollar car expecting it to break the sound barrier. No argument was made that unlimited money should guarantee every result; that's certainly a straw man if I ever saw one. My argument is built on the foundation that the increase in performance certainly does not justify the increase in price and certainly doesn't constitute a generational leap at all. A 25% increase in MSRP coinciding with a 33% increase in performance is laughably bad. What makes the matter even funnier is that the street price is a 2x increase in my region for that 33% more performance, which is perhaps the worst value proposition of any NVIDIA product, and perhaps of any GPU ever released.
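To make that arithmetic concrete, here's a minimal perf-per-dollar sketch. I'm assuming the widely reported $1,599 / $1,999 MSRPs for the 4090 and 5090; the 2x street price is just my regional example, not a universal figure:

```python
# Perf-per-dollar sketch using the figures above.
# Assumed MSRPs: $1,599 (4090) and $1,999 (5090); the 2x street price
# is a regional example, not a universal number.
perf_gain    = 1.33          # ~33% faster than the 4090
msrp_ratio   = 1999 / 1599   # ~25% more expensive at MSRP
street_ratio = 2.0           # ~2x the 4090's price on the street here

print(f"Perf per dollar at MSRP:         {perf_gain / msrp_ratio - 1:+.0%}")    # ~ +6%
print(f"Perf per dollar at street price: {perf_gain / street_ratio - 1:+.0%}")  # about a third less
```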

If the 5090 had the same MSRP as the 4090, it would still be a terrible generation. This is all to say, you can't go around charging luxury prices when you're delivering economy levels of performance.
 
Economy levels of performance? What products are delivering more performance that would place them in the premium category?

That’s how tiers work.
 
I mean, it's a "free" world. If you think the price is justifiable and reasonable, more power to you. Even in the absence of price, the 5090 is already a poor product: 33% more performance for 28% more power. In the context of price, it's charging luxury prices while delivering economy performance.
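And the same quick math for power, a minimal sketch using the 33% / 28% figures above:

```python
# Perf-per-watt sketch using the figures above.
perf_gain  = 1.33   # ~33% more performance than the 4090
power_gain = 1.28   # ~28% more power draw

print(f"Perf per watt change: {perf_gain / power_gain - 1:+.0%}")  # ~ +4%
# Most of the extra performance is bought with extra power rather than with
# efficiency gains, which is the point about it being a weak generational leap.
```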
 
I disagree with your interpretation of your evidence...
Of the old games that were listed, that ran terribly on the hardware available when they were released, how is that to be interpreted differently? Do you challenge the view that these games, maxed out (for those that even had settings!), did run terribly at launch on every flavour of hardware?
 
I mean, it's a "free" world. If you think the price is justifiable and reasonable, more power to you. Even in the absence of price, the 5090 is already a poor product: 33% more performance for 28% more power. In the context of price, it's charging luxury prices while delivering economy performance.

Again, if the fastest card out is 'economy performance,' what's considered premium performance?
 
Again, if the fastest card out is 'economy performance,' what's considered premium performance?
It doesn't matter if the card is the fastest. My comment is in the context of historical performance increases vs. price. Historically, it's a poor performance and power-efficiency improvement while charging an exorbitant price, thus the phrase "economy performance at luxury prices".
 
Of the old games that were listed, that ran terribly on the hardware available when they were released, how is that to be interpreted differently? Do you challenge the view that these games, maxed out (for those that even had settings!), did run terribly at launch on every flavour of hardware?
Some games fared far worse than others without even reaching max settings, and thankfully that wasn't the norm.

There is no limit to what kind of target a dev wants to reach. The important thing is keeping the available hardware in mind and optimizing for it as well, which a lot of devs did. Having a few exceptions that aim too high even for current hardware, without being mindful of what's available, is not much of an issue; they are outliers. Making that the norm starts becoming an issue.

Edit: for some games it isn't even a matter of aiming too high, but of shitty optimization too.
 
The 2080 was released at $699 with very similar or slightly better performance than the 1080 Ti.
Sure, it performed slightly better. But that's where things started becoming a bit of a mess. In the past we had powerful GPUs that more or less scaled linearly in power and feature sets, but with RT things changed, at least in the beginning. RT required sacrificing something else, because RT's requirements are quite huge.

Nvidia released many cards during that timeframe. While we got RT and more processing power, their prices scaled up, and game requirements scaled up, as did VRAM requirements, but VRAM in GPUs didn't scale unless we went for the super-expensive high-end ones. So we were getting RTX GPUs marketing themselves as next-gen and having the raw processing power, but RT implementations weren't always making much of a perceivable difference, while turning them on pushed us into the GPU's limits.

So as a user, similar to consoles, there was no clear winner between RT on and off, because RT alone was demanding more VRAM too. Obviously we wanted RTX cards for their next-gen RT feature. Should I get those accurate reflections and sacrifice resolution? Framerate? Both? Should I turn RT off and get a better framerate? Resolution? Both? Should I sacrifice textures and other visual presets to turn it on? Or should I do the opposite? And what RT settings should I use? Low-quality RT reflections are still "accurate", but the fidelity is not good enough to be worth the sacrifice in other areas. So should I go with medium? High? Surely that's better in isolation. But what else do I have to sacrifice?
 
So should I go with medium? High? Surely that's better in isolation. But what else do I have to sacrifice?
This is no different from any other DirectX-era feature: tessellation had the same problem, pixel shaders had the same problem, PhysX had the same problem; soft shadows, anti-aliasing, advanced volumetrics, Shader Model 3, HDR lighting, hardware T&L, etc. all shared the same problem.

The difference is that with ray tracing there was DLSS, which relieved you of most of this conundrum: you get 95% of native quality, or 100%, or you even exceed it. You no longer have to sacrifice resolution completely; you can get a huge boost in fps while retaining 80% or more of the native image quality, even when upscaling from 50% of the native resolution. You simply didn't get these options in the past.
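To put some rough numbers on what "50% upscaling" means in pixels, here's a small sketch assuming a 4K output and the commonly cited per-axis DLSS scale factors (roughly 0.67 for Quality and 0.5 for Performance; treat them as approximations):

```python
# Internal render resolution vs. a 4K output for common per-axis scale factors.
# The preset factors are commonly cited approximations, not official constants.
out_w, out_h = 3840, 2160
presets = {"Native": 1.0, "Quality (~0.67x)": 0.667, "Performance (0.5x)": 0.5}

for name, s in presets.items():
    w, h = int(out_w * s), int(out_h * s)
    shaded = (w * h) / (out_w * out_h)
    print(f"{name:18s}: {w}x{h}  ({shaded:.0%} of output pixels shaded)")
# Performance mode shades only ~25% of the pixels, which is where most of the
# fps headroom comes from; the upscaler reconstructs the rest.
```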
 
Sure, it performed slightly better. But that's where things started becoming a bit of a mess. In the past we had powerful GPUs that more or less scaled linearly in power and feature sets, but with RT things changed, at least in the beginning. RT required sacrificing something else, because RT's requirements are quite huge.

I think this is a weird take. You don't have to use RT. Even without considering RT, each generation of NVIDIA GPUs is still faster, albeit to different degrees. I mean, people complain that the 5090 is "only" 30% faster than the 4090, and that's not considering RT at all. The majority of games do not require RT to run, and in many games you can play without RT at reasonable quality.

There are always "sacrifices" when a significantly different technology is on the way. In a way, RT is a bit like the transition from 2D to 3D. When the earliest accelerated 3D games were released, there were also people complaining that 3D games were uglier than contemporary 2D games, performed worse, and some also played worse (back when game developers still didn't know how to do cameras well, for example). It took quite a few years for most people to accept that 3D games were the future and 2D games were basically a dead end. I think RT is similar in a way. Sometimes you need to step back a bit to go forward.

Take shadows, for example. They used to be very difficult to do, and we had some failed experiments such as Doom 3's stencil shadows, and it's obvious that correct shadows have a serious impact on image quality. Now shadow maps dominate, but shadow mapping is a very flawed technique: one needs to be very careful to avoid various artifacts, and it gets expensive very quickly as the number of lights increases. Another example is global illumination, also very important for immersion. Baked lighting only works with fixed light sources, and I don't think it's wise to keep doing that (imagine a world where the sun never moves). In order to do global illumination with moving lights, we need RT. There are also reflections and even refractions. It's possible to fake them to an extent, just like many games do, but those fakes are all severely limited and can't scale.
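As a toy illustration of the "expensive very quickly" part: with shadow maps, every shadow-casting light typically needs its own depth pass over the scene, whereas a stochastic ray-traced approach can cap the work at a fixed number of light samples per pixel per frame. The numbers below are purely hypothetical, chosen only to show the scaling, not measured costs:

```python
# Toy cost model (hypothetical units, illustrative only).
GEOMETRY_PASS_COST = 1.0    # assumed cost of rasterizing the scene depth once
RAY_COST_PER_PIXEL = 2e-7   # assumed cost of tracing one shadow ray
PIXELS = 3840 * 2160
SAMPLES_PER_PIXEL = 2       # lights sampled per pixel per frame on the RT path

def shadow_map_cost(num_lights):
    # One depth pass over the scene per shadow-casting light.
    return num_lights * GEOMETRY_PASS_COST

def rt_shadow_cost(num_lights):
    # Work is bounded by the per-pixel sample budget, not the light count.
    return PIXELS * min(num_lights, SAMPLES_PER_PIXEL) * RAY_COST_PER_PIXEL

for n in (1, 4, 16, 64):
    print(f"{n:3d} lights: shadow maps {shadow_map_cost(n):5.1f} "
          f"vs. RT {rt_shadow_cost(n):5.2f}")
# Shadow-map cost grows linearly with lights; the RT path costs more up front
# but stays flat, which is why many-light scenes push toward ray tracing.
```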

I really think it's a no-brainer to go forward with RT. Back in the 2080 Ti days it was probably not that obvious, but today it's very clear. People say games like KCD2 "perform very well and look good" only because they don't know where to look. Once you know, it's very hard to ignore. Some might ask: why should we do that? Why can't games just do it the old way? Of course, you can choose to do that, but then we could also have chosen to just keep making 2D games and never get into 3D. 3D games are very different, and some are not even possible in 2D (e.g. FPS games). RT will not be as profound, but on the other hand maybe in the future there will be games where correct lighting is essential to the gameplay, such as needing to look into a mirror to spot monsters or something like that, I don't know. But we can only find out if we can actually do correct lighting in games.
 
My argument is built on the foundation that the increase in performance certainly does not justify the increase in price and certainly doesn't constitute a generational leap at all.

I think you'll find lots of people agree with this opinion. However, it seemed earlier that you were suggesting the 5090 "should" run Cyberpunk maxed out at 4K/30, and you didn't give reasons for that expectation. The definition of "maxed out" varies in every game.
 
But is it possible to have some future iteration of DLSS where the upscaling is only applied to the RT/PT elements, while leaving the textures and assets to render at the set resolution?
A new patch for Spider-Man 2 has enabled RR to work with native resolution.
NVIDIA DLSS Ray Reconstruction can now be used at Native resolution
 