Should full scene RT be deprioritised until RT solutions are faster? *spawn

OK, assuming that is correct, what do you propose the industry should do to course correct?
What is your alternative?
The argument would be, if those resources were allocated to rasterization, the game would look a lot closer...
Woulda coulda shoulda, lazy devs etc...
What's the point of asking a question if you already think you know the answer? Anyway, you can't course correct for the issues arising from smaller gains via node shrinks and generally bad architectural performance gains (see RTX 5000 series). The only thing for developers to do is cut their coat according to their cloth. Some are doing so, others are not. Until we get a breakthrough in semiconductor manufacturing, the gains going forward will be smaller and more expensive.

Eventually, whether we like it or not, everyone will have to acknowledge the reality that we're in a consolidation period.
 
As to feeling entitled to achieve some level of satisfactory performance after spending nearly $3k USD, and realistically $4.5k in my currency, you better believe I'll have a high level of warranted entitlement.

You should’ve probably run the math before dropping $4.5K.

You can’t just ignore that 4K 120Hz is a massive increase in workload on top of inherently more taxing modern graphics and claim everything sucks because you’re not getting some random made-up performance target.

I will agree with you that if games look like shit they shouldn’t also run like shit. However, if a game looks amazing then yeah, it should probably run at 4K 30fps because that’s a ton of pixels.
 
If you can spend that much money and be satisfied with 28fps on average in Cyberpunk at native 4K almost 5 years after the game released, then more power to you. It just means we value money differently and there's nothing wrong with that?
28 FPS in the completely optional, cutting-edge path tracing mode that released two years ago. Are you arguing that developers giving users more choices is bad, and they should never add options that can't run 4K60Hz native (or whatever you think the baseline should be) on current hardware?
 
You should’ve probably run the math before dropping $4.5K.

You can’t just ignore that 4K 120Hz is a massive increase in workload on top of inherently more taxing modern graphics and claim everything sucks because you’re not getting some random made-up performance target.

I will agree with you that if games look like shit they shouldn’t also run like shit. However, if a game looks amazing then yeah, it should probably run at 4K 30fps because that’s a ton of pixels.
If it was running at 4K60 native, I'd be more than satisfied with that. If the 5090 only yielded a 33% increase on average over the 4090 while costing 25% more at MSRP (much, much more in street price), then it's unrealistic to expect a 100%+ gain in performance from the 6090. Realistically, we're waiting at least 2-4 generations until we hit that performance level.

Nvidia should separate the gaming line from the data center products. Gaming should lag behind on older nodes, and Nvidia should seek performance gains via architecture and design while looking to drive costs down. Data center products should use the latest nodes. If Nvidia comes out in 2 years with a 6090 that brings a 50% performance increase, that's basically 4K 45fps in a nearly 7-year-old game at that point. And we know the MSRP will be north of $2000, to say nothing of the real price.
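Purely as a sanity check on that arithmetic, here's a tiny back-of-the-envelope sketch (using the ~28 fps native 4K PT figure discussed above and assumed per-generation uplifts; nothing official):

```python
# Back-of-the-envelope generational scaling (illustrative assumptions only).
import math

base_fps = 28.0    # the quoted native 4K path-tracing average on a 5090
target_fps = 60.0  # the 4K60-native goal being discussed

for per_gen_gain in (0.33, 0.50):  # assumed uplift per GPU generation
    gens = math.log(target_fps / base_fps) / math.log(1.0 + per_gen_gain)
    print(f"{per_gen_gain:.0%} per gen -> ~{gens:.1f} generations to reach 4K60 native")

# A single hypothetical +50% generation on top of 28 fps:
print(f"28 fps x 1.5 = {base_fps * 1.5:.0f} fps")  # roughly the '4K ~45 fps' ballpark above
```

At ~33% per generation that works out to roughly three more generations, and at 50% about two, which is where the 2-4 gen estimate comes from.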

All of this is to say, the math makes absolutely no sense. Keep in mind that if one buys these cards just to play games, there's no ROI; it's actually just a big expense, especially due to the high power requirements. Personally speaking, I can never allow my excitement for technology to overrule the insanity of the proposition. Spending that much money to play games is already laughable. It only becomes more laughable when you look at the performance you get for your money.
 
If it was running at 4K60 native, I'd be more than satisfied with that.

I think you need to define the “it” you’re referring to. There are lots of games that run 4K 60 if you go back far enough. Proclaiming that full PT should run at 4K 60 for example would need to be backed up by some kind of data.

All of this is to say, the math makes absolutely no sense. Keep in mind that if one buys these cards just to play games, there's no ROI; it's actually just a big expense, especially due to the high power requirements. Personally speaking, I can never allow my excitement for technology to overrule the insanity of the proposition. Spending that much money to play games is already laughable. It only becomes more laughable when you look at the performance you get for your money.

Well that’s a very personal choice. $2000 for you isn’t the same as $2000 for someone else. You should follow your conscience.
 
That's fair, however based on your link, we only had to wait 6 months to get a GPU which could run the highest texture settings? We're 2 GPU gens (~4+ years) removed from the release of Cyberpunk 2077 and there is no GPU that can run it at max settings at 4K native... At the rate things are progressing, we may need to wait 2-4 more GPU gens for that to exist. This would represent the slowest GPU hardware progression in history. Meanwhile, we could already run max settings at 1080p on the HD 5970 a mere 3 years after release. Tbf, I can't recall if Very High was the highest setting.

Source: Crysis Benchmarks

EDIT: Looking at TechPowerUp, it looks like there was an Extreme setting and the GTX 590 ran it fine.
TechPowerUp
Second link from TechPowerUp is Crysis 2.

The first source of benches is for Crysis 1, where yes, the 5970 technically might have scored an average above 60 fps in the bench run (completely ignoring the CPU, which definitely could not do Crysis at a flat 60 at 1080p yet). But given that it is an ATI CrossFire card, the frame times were beyond awful. You would need to wait until the next GPU gen after that one for 1080p 60 single-GPU-wise. So the GTX 670/680 and the 7950/7970. So March 22, 2012.

That would be more than 4 years after release.

For CP 2077 - its Full Ray Tracing mode came out in 2023. April 2023 for the Beta, September 2023 for the full release. So we are just 1.5 to 2 years into its existence.
 
What's the point of asking a question if you already think you know the answer?
I answered a hypothetical.
I don't presume that my answer is all-encompassing.
You might not like RT/upscaling, and that is fine, but since my answer does not apply, I'd like to know why these solutions bother you so much, especially since,

...you can't course correct for the issues arising from smaller gains via node shrinks and generally bad architectural performance gains (see RTX 5000 series).
that would imply that new ways to advance are required.

The only thing for developers to do is cut their coat according to their cloth. Some are doing so, others are not. Until we get a breakthrough in semiconductor manufacturing, the gains going forward will be smaller and more expensive.
What if we don't get a breakthrough, or it takes a very, very long time to get one?
Should everyone remain idle?
Since the course is set and the gains are minuscule, shouldn't everyone (panel manufacturers, game developers, hardware engineers, etc.) stop innovating and work with what they have as well?
After all, gaming on an 8K 1000Hz panel is impossible using "traditional" rendering techniques for the foreseeable future.
And supposing everyone did "cut their coat according to their cloth", what would we gain from it?
What are you trying to convey?

"upscaling=bad", is an equally respected subjective opinion as "upscaling=best thing since sliced bread".
It means absolutely nothing, unless there is an alternative on offer.
What is the alternative you propose?

Eventually, whether we like it or not, everyone will have to acknowledge the reality that we're in a consolidation period.
It depends.
If there is no advancement in a specific field, then I'd say no.
If we get to a point where there is no advancement whatsoever, then, sure.
 
@Dictator, hate to detract from the continued gems from Boss, but is it possible to have some future iteration of DLSS where the upscaling is only applied against the RT/PT elements while leaving the textures and assets to render at the set resolution?
 
Second link from TechPowerUp is Crysis 2.

The first source of benches is for Crysis 1, where yes, the 5970 technically might have scored an average above 60 fps in the bench run (completely ignoring the CPU, which definitely could not do Crysis at a flat 60 at 1080p yet). But given that it is an ATI CrossFire card, the frame times were beyond awful. You would need to wait until the next GPU gen after that one for 1080p 60 single-GPU-wise. So the GTX 670/680 and the 7950/7970. So March 22, 2012.
OK, but the option existed to achieve that level of performance, questionable frame times aside. SLI was extremely common during that time period.
That would be more than 4 years after release.

For CP 2077 - its Full Ray Tracing mode came out in 2023. April 2023 for the Beta, September 2023 for the full release. So we are just 1.5 to 2 years into its existence.
Yeah, I was wrong on this. If the 6090 cannot deliver the performance required, it'll be almost 6 years by the time the 7090 releases. I personally don't expect the 6090 to get anywhere close to delivering that level of performance for 4K60 PT. Hopefully I'm wrong.
 
@Dictator, hate to detract from the continued gems from Boss, but is it possible to have some future iteration of DLSS where the upscaling is only applied against the RT/PT elements while leaving the textures and assets to render at the set resolution?
I think that depends on how you define "upscaling" as well as "textures and assets". Proper offline path tracing needs dozens of samples per pixel to render an image with reasonably low levels of noise. Real-time ray tracing uses only 1-2 samples per pixel, and while the ReSTIR algorithm used in games is more efficient than offline path tracing, that's still not enough to get rid of the noise. So spatiotemporal denoisers like NRD or DLSS Ray Reconstruction have to be applied to further reduce noise. These allow each pixel to use sample data from neighboring pixels in the frame as well as from earlier frames, in a manner that is not dissimilar to how temporal upscaling works, and can be thought of as "upscaling" the number of samples per pixel. So technically DLAA + RR can be thought of as "upscaling for RT/PT only".
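To make the "reusing samples across space and time" idea concrete, here's a minimal toy sketch of temporal accumulation, the basic trick these denoisers build on (this is not NRD or Ray Reconstruction themselves, which add variance-guided filtering, disocclusion handling and, in RR's case, a neural network; the function name and parameters are just illustrative):

```python
import numpy as np

def temporal_accumulate(noisy_frame, history, motion_vectors, alpha=0.1):
    """Blend the current 1-2 spp noisy frame with the reprojected previous
    result, so each pixel effectively integrates samples from many frames."""
    h, w, _ = noisy_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Reproject: look up where each pixel was in the previous frame.
    px = np.clip((xs - motion_vectors[..., 0]).round().astype(int), 0, w - 1)
    py = np.clip((ys - motion_vectors[..., 1]).round().astype(int), 0, h - 1)
    reprojected = history[py, px]
    # Exponential moving average: lower alpha = more history = less noise,
    # but also more ghosting when the reprojection is wrong.
    return alpha * noisy_frame + (1.0 - alpha) * reprojected
```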

There are also various papers published on neural upscalers that take a full-resolution g-buffer (depth, normals, albedo textures, roughness maps, other material information), motion vectors, and a low-resolution rendered frame fully shaded with lighting as input, and output a high-quality full-resolution frame. These can still achieve good quality even with a 16x upscaling factor.
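For those g-buffer-guided upscalers, the input assembly might look roughly like this (a hypothetical channel layout just to illustrate what "full-resolution g-buffer plus low-resolution shaded frame" means; the network itself is out of scope here):

```python
import numpy as np

def assemble_upscaler_input(lowres_color, depth, normals, albedo, roughness, motion):
    """Stack full-resolution guide channels with a naively upsampled low-res
    shaded frame; a neural network would then predict the full-res image."""
    H, W = depth.shape
    sh, sw = lowres_color.shape[:2]
    # Nearest-neighbour upsample of the low-resolution shaded frame (e.g. 16x fewer pixels).
    color_up = lowres_color[(np.arange(H) * sh // H)[:, None],
                            (np.arange(W) * sw // W)[None, :]]
    # Channels per pixel: 3 (color) + 1 (depth) + 3 (normals) + 3 (albedo) + 1 (roughness) + 2 (motion) = 13
    return np.concatenate([color_up,
                           depth[..., None],
                           normals,
                           albedo,
                           roughness[..., None],
                           motion], axis=-1)
```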

 
Not sure why you're referring to Ray Reconstruction when I was talking about DLSS upscaling.
I specifically referenced Ray Reconstruction in my first post: "In 2025, the 5090 will give you 2160p90 in Cyberpunk Path Tracing using DLSS Performance with Ray Reconstruction, which is actually better than native!"

As to feeling entitled to achieve some level of satisfactory performance after spending nearly $3k USD, and realistically $4.5k in my currency, you better believe I'll have a high level of warranted entitlement
Spending exorbitant amounts of money and expecting to game at a certain performance level are two different things. You don't buy a $3 million car and expect it to break the sound barrier.
 
Thank God it was some exceptions like Crysis and not "can you run games"
Finish this thought you've started; tell us why it's somehow different then versus now. Why is raytracing "can you run games?" when Crysis was "some exception"?

The crux of the Crysis complaints was the people who set all graphics options to their highest and then cried about abysmal performance on the hardware of that time. Yet lower graphical settings always existed, and much like modern games, the difference between mid-tier settings and everything-at-ultra wasn't worth the performance impact. Sure, everyone tried the ultra settings "just to see", yet most gamers found a mix of low / med / high / stupid settings to tickle their particular fancy. Also remember, even back in 2007, there were a plethora of 1600x1200, 2048x1536, and even a few 2560x1920 CRT monitors in the world -- I was one of the gamers who owned a 21" ViewSonic which supported the 2048x1536 rez at 72Hz, which is roughly 50% more pixels than modern 1080p and about 15% shy of 1440p.

All of this to say, every game that absolutely REQUIRES raytracing today can still be played comfortably by essentially all raytracing-capable hardware so long as you aren't trying to push the envelope to the ragged edge. I suppose if you dig out the lowest-model Turing, an RTX 2060 released five years and one month ago, you might struggle. If you own anything in the 40 series, or probably anything 3060 or above, you'll be absolutely fine.

It all feels very Crysis-complainer to me. If your setup can't play an RT game with all the features turned up to 11, then turn some of them down. It doesn't have to be the fault of raytracing; it can simply be humans expecting more than what makes rational sense.
 
I remember back in the day of the Ti 4600, when AA could kill performance, so a lot of people chose not to use AA:
And if you "dared" to run AA + 16xAF you were very much pushing limits:

Today 16xAF is basically "free"
And AA has evolved into DLAA/DLSS TNN etc.

Technology has always taxed performance for the first couple of years, but that never meant that technology stopped progressing; it has kept pushing.

In 2008 you could have had an argument that RT was a reach too far:

But in 2025, nope :nope:
 
Technology has always taxed performance for the first couple of years, but that never meant that technology stopped progressing; it has kept pushing.
Basically every technology-driven game has cost so much performance that it was hard to run on the highest end of hardware; here are more examples (one more time, because apparently people forget).

Doom 3 at max settings on the highest GPUs: 42 fps!
Far Cry 1 at max settings: 50 fps!
Crysis Warhead at max settings: 22 fps!
F.E.A.R. at max settings: 28 fps!
Oblivion at max settings: 38 fps!
Far Cry 3 at max settings: 31 fps!
Metro 2033 at max settings: 38 fps!

(benchmark charts)
 
Finish this thought you've started; tell us why it's somehow different then versus now. Why is raytracing "can you run games?" when Crysis was "some exception"?

The crux of the Crysis complaints was the people who set all graphics options to their highest and then cried about abysmal performance on the hardware of that time. Yet lower graphical settings always existed, and much like modern games, the difference between mid-tier settings and everything-at-ultra wasn't worth the performance impact. Sure, everyone tried the ultra settings "just to see", yet most gamers found a mix of low / med / high / stupid settings to tickle their particular fancy. Also remember, even back in 2007, there were a plethora of 1600x1200, 2048x1536, and even a few 2560x1920 CRT monitors in the world -- I was one of the gamers who owned a 21" ViewSonic which supported the 2048x1536 rez at 72Hz, which is roughly 50% more pixels than modern 1080p and about 15% shy of 1440p.

All of this to say, every game that absolutely REQUIRES raytracing today can still be played comfortably by essentially all raytracing-capable hardware so long as you aren't trying to push the envelope to the ragged edge. I suppose if you dig out the lowest-model Turing, an RTX 2060 released five years and one month ago, you might struggle. If you own anything in the 40 series, or probably anything 3060 or above, you'll be absolutely fine.

It all feels very Crysis-complainer to me. If your setup can't play an RT game with all the features turned up to 11, then turn some of them down. It doesn't have to be the fault of raytracing; it can simply be humans expecting more than what makes rational sense.
I find quite amusing the defensive/intense responses and assumptions made, followed by "essays", for any statement that may remotely "offend" a certain belief.

I never stated that running current games with RT features is the equivalent of "can you run games". I am stating that Crysis is an extreme case and it wasn't the norm. Therefore, using it as an example is not the same as all the games with RT features we have today. Otherwise we would never have had the "can you run Crysis" meme in the first place.

Yes, Crysis was different in that it was hard to hit good IQ and good frame rates for years down the line even outside of the high settings, let alone at the highest/Ultra. It was the best way to market the CryEngine but not the best way to sell a game to gamers.
 
I think one needs to look at this from a different angle. I believe there was a similar debate about this on DF, maybe about the Avatar game.
I think it's good for games to have an "extreme" or "experimental" mode designed for future hardware that probably no current hardware is able to run. Of course it's hard to predict the future, so maybe there won't be hardware that can run it for 5 years or so, but that's why it's experimental. It also encourages game developers to try out all technologies, which may or may not be the future.
However, now it seems that game developers are afraid of doing this. This is precisely because people now expect that the expensive PC they bought should be able to run all games at the highest settings. If it can't, then the game must be an "unoptimized piece of shit." I don't see how this can be healthy.

Of course, part of the problem is that many games do not communicate well with the players about what these settings are. Ideally there should be a screenshot or sample of how a setting affects the results. For example, if you set textures to "high" it should display a more detailed texture (preferably along with the lower settings for comparison). Set shadow maps to high and you see a better shadow, etc. Again, I believe this was also discussed on DF.
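As a sketch of how that could be data-driven, something like the following (purely hypothetical names, file paths, and cost numbers) would let the menu show a comparison image and a rough cost next to every level:

```python
# Hypothetical description of one graphics setting; the menu UI could render
# the preview images side by side and show the estimated cost of each level.
texture_quality = {
    "name": "Texture Quality",
    "tooltip": "Higher levels stream more detailed textures.",
    "levels": [
        {"label": "Medium", "preview": "previews/textures_medium.png", "est_vram_mb": 2500},
        {"label": "High",   "preview": "previews/textures_high.png",   "est_vram_mb": 4000},
        {"label": "Ultra",  "preview": "previews/textures_ultra.png",  "est_vram_mb": 6500},
    ],
}
```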
 
However, now it seems that game developers are afraid of doing this
Some developers are deploying these settings smartly now ... Avatar Pandora uses hidden "Unobtainium" settings that can only be accessed through a console command. Star Wars Outlaws did the same with their "Outlaw" preset. Kingdom Come 2 labels these settings "Experimental"; I think a couple of other games (that I don't remember right now) did the same. I believe this is a good trend that needs to be expanded upon.
 
Of course, part of the problem is that many games do not communicate well with the players about what these settings are. Ideally there should be a screenshot or sample of how a setting affects the results. For example, if you set textures to "high" it should display a more detailed texture (preferably along with the lower settings for comparison). Set shadow maps to high and you see a better shadow, etc.

This is such a good point. Not only does it remove anxiety for the end user who fears that they’re missing out on some amazing IQ improvement going from ultra to medium, it also encourages developers to be more thoughtful about the settings they ship. Maybe you don’t include an ultra setting for some feature where the IQ improvement is imperceptible, because you know your settings menu preview will make that obvious.

I am a huge fan of unobtainium settings in games because by the time I get around to playing the game it’s no longer unobtainium. But you got to feed that ultra settings ego trip.
 