Should full scene RT be deprioritised until RT solutions are faster? *spawn

This is such a good point. Not only does it remove anxiety for the end user who fears they're missing out on some amazing IQ improvement going from ultra to medium, it also encourages developers to be more thoughtful about the settings they ship. Maybe you don't include an ultra setting for some feature where the IQ improvement is imperceptible, because you know your settings menu preview will make that obvious.

I am a huge fan of unobtainium settings in games, because by the time I get around to playing the game it's no longer unobtainium. But you've got to feed that ultra-settings ego trip.
The issue is very much multidimensional. A lot of it relates to how GPU manufacturers have promoted their RT-enabled cards at specific prices with specific specs, combined with how game companies tried to showcase the best-looking games ever on those cards.

Once the RTX cards hit the market, for example, NVIDIA withdrew its high-end GTX cards, which were pretty powerful at rasterization and very competitive with the RTX cards with RT off. The 1080 Ti, for instance, could beat some of those cards with RT off, since it was fast and had more VRAM.

The RTX cards were sold at high-end prices on the strength of RT, but compared to the previous generation they weren't clear wins across all specs, nor were they powerful enough to hit the target specs developers had in mind for gamers to reach the desired performance.

For example, you bought your awesome new RTX card with a premium price tag to enjoy RT visuals, but VRAM capacity didn't see the same jump. Meanwhile, the VRAM requirements of many games jumped much higher.

So on one hand you had these cards that were fast at RT, but without enough VRAM to run games butter-smooth and without hiccups. A lot of gamers were hyped up by marketing material, from both GPU manufacturers and game companies, about the awesome, perfect next-gen visuals offered by RT graphics, and the awesome performance of those expensive, super-powerful cards we had never seen before. But in reality they were caught in a strange situation: they got RT (the reason these cards hit the market in the first place), but the cards fell short elsewhere, so the experience wasn't smooth. They started lowering settings with RT on, or turned RT off to push some other settings higher.

It's a situation where you're convinced you can have your cake and fully eat it, but you actually can't. Or it's like being marketed the fastest luxury supersport car ever developed, with Rocket Turbo tech perfect for 418 miles per hour on a highway; once you buy it, you discover the road it's driven on is bumpy, filled with debris, traffic lights, and roadblocks, and the experience is far from the marketing material. The tech is still awesome. Tough luck. But if you want to fully eat your cake and have that marketed experience, there is that even more super-expensive card that can traverse those roads too.
 
Once the RTX cards hit the market, for example, NVIDIA withdrew its high-end GTX cards, which were pretty powerful at rasterization and very competitive with the RTX cards with RT off. The 1080 Ti, for instance, could beat some of those cards with RT off, since it was fast and had more VRAM.

The 2080 Ti was faster than the 1080 Ti at almost everything: more GFLOPS, more memory bandwidth, and a higher texture rate, with only slightly lower pixel fillrate, which hardly matters. Even the 2080 was on par with, or slightly faster than, the 1080 Ti.
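Here's a quick back-of-the-envelope comparison. The spec numbers below are from memory (commonly cited figures), so treat this as a sketch and double-check against a spec database before leaning on it:

# Rough headline specs, from memory -- verify before relying on them.
specs = {
    "GTX 1080 Ti": {"FP32 TFLOPS": 11.3, "Bandwidth GB/s": 484, "VRAM GB": 11},
    "RTX 2080 Ti": {"FP32 TFLOPS": 13.4, "Bandwidth GB/s": 616, "VRAM GB": 11},
}

old, new = specs["GTX 1080 Ti"], specs["RTX 2080 Ti"]
for metric in old:
    # Print each metric and the generational ratio.
    print(f"{metric}: {old[metric]} -> {new[metric]} ({new[metric] / old[metric]:.2f}x)")

# Output:
# FP32 TFLOPS: 11.3 -> 13.4 (1.19x)
# Bandwidth GB/s: 484 -> 616 (1.27x)
# VRAM GB: 11 -> 11 (1.00x)

Note the VRAM row: the 1080 Ti's capacity advantage only applied against the 2080 (8 GB), not the 2080 Ti.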
 
I find it quite amusing, the defensive/intense responses and assumptions, followed by "essays", in response to any statement that may remotely "offend" a certain belief.
You didn't offend anyone. Or at least, you didn't offend me. As it turns out, I can have different opinions than you -- and so can others! This isn't "woke" or "offense" or an "agenda"; it's a thoughtful response to your one-line retort.

I never stated that running current games with RT features is the equivalent of "can you run games".
Then explain your sentence that I quoted.
I am stating that Crysis is an extreme case and it wasn't the norm.
If you pay attention to the posts upstream, you'll see that it wasn't an extreme case; it turns out it WAS the norm. Plenty of evidence has been posted that new tech has always punished even new hardware.

This makes it seem like your opinion, while you're entitled to it, does not reflect observable reality. And if that offends you, then that's a you problem.

False equivalency.
Nope.

Just because you bought the very most expensive video card and it can't go as fast as you think it should, doesn't mean the same doesn't hold for other things you can purchase. At some point, unlimited money doesn't guarantee every possible result.
 
You didn't offend anyone. Or at least, you didn't offend me. As it turns out, I can have different opinions than you -- and so can others! This isn't "woke" or "offense" or an "agenda"; it's a thoughtful response to your one-line retort.


Then explain your sentence that I quoted.

If you pay attention to the posts upstream, you'll see that it wasn't an extreme case; it turns out it WAS the norm. Plenty of evidence has been posted that new tech has always punished even new hardware.

This makes it seem like your opinion, while you're entitled to it, does not reflect observable reality. And if that offends you, then that's a you problem.


Nope.

Just because you bought the very most expensive video card and it can't go as fast as you think it should, doesn't mean the same doesn't hold for other things you can purchase. At some point, unlimited money doesn't guarantee every possible result.
I disagree with your interpretation of your evidence, and with your insistence on interpreting my post the way you want just because it doesn't align with what you want to hear, and I am leaving it at that. Move on.
 
The 2080 Ti was faster than the 1080 Ti at almost everything: more GFLOPS, more memory bandwidth, and a higher texture rate, with only slightly lower pixel fillrate, which hardly matters. Even the 2080 was on par with, or slightly faster than, the 1080 Ti.
Only if you want to use the 2080 Ti as the example, which launched at $999.
 
Nope.

Just because you bought the very most expensive video card and it can't go as fast as you think it should, doesn't mean the same doesn't hold for other things you can purchase. At some point, unlimited money doesn't guarantee every possible result.
You said, "You don't buy a 3 million dollar car and expect it to break the sound barrier." It's a false equivalency because no one buys a 3 million dollar car expecting it to break the sound barrier. No argument was made that unlimited money should guarantee every result; that's certainly a straw man if I ever saw one.

My argument is built on the foundation that the increase in performance does not justify the increase in price, and certainly doesn't constitute a generational leap at all. A 25% increase in MSRP coinciding with a 33% increase in performance is laughably bad. What makes the matter even funnier is that the street price is a 2x increase in my region for 33% more performance, which perhaps constitutes the worst value proposition of any NVIDIA product, and perhaps of any GPU ever released.

If the 5090 had the same msrp as the 4090, it would still be a terrible generation. This is all to say, you can't go around charging luxury prices when you're delivering economy levels of performance.
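To put rough numbers on that (using only the figures claimed above -- +33% performance, +25% MSRP, ~2x street price in my region -- not independent benchmark data), here's the perf-per-dollar arithmetic:

# Performance-per-dollar, normalised so the 4090 = 1.0.
# Inputs are the round numbers claimed above, not measured data.
perf = 1.33    # claimed 5090 performance relative to the 4090
msrp = 1.25    # $1,999 vs $1,599 MSRP
street = 2.00  # claimed street-price multiple in my region

print(f"perf per dollar at MSRP:   {perf / msrp:.3f}")    # 1.064 -> ~6% better than a 4090
print(f"perf per dollar at street: {perf / street:.3f}")  # 0.665 -> ~34% worse than a 4090

At MSRP you're paying almost exactly proportionally for the extra performance; at street prices you're getting strictly less performance per dollar than the previous generation.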
 
Economy levels of performance? What products are delivering more performance, such that they would be the premium category?

That’s how tiers work.
 
Economy levels of performance? What products are delivering more performance, such that they would be the premium category?

That’s how tiers work.
I mean, it's a "free" world. If you think the price is justifiable and reasonable, more power to you. Even setting price aside, the 5090 is already a poor product: 33% more performance for 28% more power. Factor price back in and it's charging luxury prices while delivering economy performance.
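Same caveat as before: using the thread's round numbers rather than measurements, the efficiency side works out to almost nothing:

# Performance-per-watt, 5090 vs 4090, using the same round numbers.
perf = 1.33   # claimed performance increase
power = 1.28  # claimed power-draw increase
print(f"perf per watt vs the 4090: {perf / power:.3f}")  # 1.039 -> under 4% more efficient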
 
I disagree with your interpretation of your evidence...
Of the old games that were listed, that ran terribly on the hardware available when they were released, how is that to be interpreted differently? Do you challenge the view that these games, maxed out (for those that even had settings!), ran terribly at launch on every flavour of hardware?
 