Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

If a future GPU that's built for path tracing has raster perf just sitting there, you don't think they'll find some use for it?
You can effectively replace direct visibility rasterization with rays if you can afford to path trace the scene anyway -- at a certain scene complexity it will be cheaper than trying to rasterize everything.

A GPU fast enough to path trace modern games at ~60fps/4k and run hit shaders on every bounce can probably easily simulate a legacy game's rasterization pipeline in compute anyway; it's not like rasterizing a triangle is a magic problem that needs special hardware in principle.
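To make that concrete: the core of triangle rasterization is just three edge-function tests per pixel, which any compute path can run. A minimal CPU-side sketch of the idea (the triangle coordinates are made up, and this is plain C++ for illustration, not any shipping implementation):

```cpp
#include <cstdio>

// Signed area test: positive if point (cx,cy) lies to the left of the
// directed edge (ax,ay)->(bx,by). Three of these give triangle coverage.
static float edgeFn(float ax, float ay, float bx, float by, float cx, float cy) {
    return (bx - ax) * (cy - ay) - (by - ay) * (cx - ax);
}

int main() {
    const int W = 40, H = 20;
    // One hypothetical screen-space triangle; a compute-shader rasterizer
    // would run the same per-pixel test, one thread (or tile) per pixel.
    const float x0 = 5, y0 = 2, x1 = 35, y1 = 8, x2 = 12, y2 = 18;

    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            const float px = x + 0.5f, py = y + 0.5f; // sample at pixel center
            const float w0 = edgeFn(x1, y1, x2, y2, px, py);
            const float w1 = edgeFn(x2, y2, x0, y0, px, py);
            const float w2 = edgeFn(x0, y0, x1, y1, px, py);
            // Inside if all edge functions agree in sign (either winding).
            const bool in = (w0 >= 0 && w1 >= 0 && w2 >= 0) ||
                            (w0 <= 0 && w1 <= 0 && w2 <= 0);
            putchar(in ? '#' : '.');
        }
        putchar('\n');
    }
    return 0;
}
```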
 
The technology to allow performance fast enough to ditch rasterization is nowhere on the horizon.

Exactly this. Path tracing 25+ year old games is something only the top 8-10 GPUs available can do... with the help of upscaling.

We're 10-15 years out from being able to do games with realistic visuals.
 
You can effectively replace direct visibility rasterization with rays if you can afford to path trace the scene anyway -- at a certain scene complexity it will be cheaper than trying to rasterize everything.

A GPU fast enough to path trace modern games at ~60fps/4k and run hit shaders on every bounce can probably easily simulate a legacy game's rasterization pipeline in compute anyway; it's not like rasterizing a triangle is a magic problem that needs special hardware in principle.

That kind of GPU is over a decade away, and until it arrives games will be raster+RT hybrids, so you'd be talking about some monster GPU.

We need a path traced port of, say, a 10-year-old game to see how it runs, to gauge how long we'll be waiting. Bioshock Infinite came out 10 years ago; let's path trace that and see how it performs.

Simulating raster will be tricky, as compatibility drops until your software emulation is bang on.

Is it viable from a marketing POV for either AMD or Nvidia to release a path tracing GPU in, say, 10 years' time with patchy raster emulation?
 
You can effectively replace direct visibility rasterization with rays if you can afford to path trace the scene anyway -- at a certain scene complexity it will be cheaper than trying to rasterize everything.

A GPU fast enough to path trace modern games at ~60fps/4k and run hit shaders on every bounce can probably easily simulate a legacy game's rasterization pipeline in compute anyway; it's not like rasterizing a triangle is a magic problem that needs special hardware in principle.

I think it’s reasonable to expect RT to consume a larger % of transistor budget in future generations from all 3 IHVs. A 4090 is ~6x faster than a 2080 Ti in Portal RTX. If that trend continues a 6090 should easily do 4K/60 pathtraced with geometric complexity of something like GTA V. The next big CP2077 graphics update is supposedly going to attempt raytraced primary visibility with RTXDI. That would be a good sign of things to come.
 
I think it’s reasonable to expect RT to consume a larger % of transistor budget in future generations from all 3 IHVs. A 4090 is ~6x faster than a 2080 Ti in Portal RTX. If that trend continues a 6090 should easily do 4K/60 pathtraced with geometric complexity of something like GTA V. The next big CP2077 graphics update is supposedly going to attempt raytraced primary visibility with RTXDI. That would be a good sign of things to come.
A 6090 could be on the penultimate or possibly even the final node shrink of silicon. The wall of performance will be hit well before we have the performance to path trace modern visuals. We also have to consider that your prediction on 6090 performance may not pan out. The power draw can't just continue to increase. By 4K/60 do you mean with DLSS-P?
 
Nanite and Lumen go hand in hand. Real time lighting makes the assets actually pop. Baked lighting can't show all the detail in the environment.
Real time lighting matters more than most people think, even more than asset detail. Here is Dark Souls 3, a detail-rich but technologically relatively old game, with its assets directly ported from the From Software engine to UE5 with Lumen applied. It's an insane difference with that change alone.

 
A 6090 could be on the penultimate or possibly even the final node shrink of silicon. The wall of performance will be hit well before we have the performance to path trace modern visuals.

This is certainly not true. Transistor counts and performance won’t hit a wall anytime soon. Shrinks will continue well into the angstrom range even if each node takes a bit longer. There are also yet unexplored options for scaling GPU performance including die stacking and MCMs.

We also have to consider that your prediction on 6090 performance may not pan out. The power draw just can’t continue to increase.

I’m not sure what you’re getting at. If 6090 performance doesn’t scale, RT is the least of our worries. Throwing whatever transistors and power are available at raster isn’t going to help. No amount of transistors is going to make raster shadows pixel perfect or eliminate SSR artifacts.
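To put a number on the shadow map point: a fixed-resolution map smears one texel across many screen pixels up close, which is why it can never be pixel perfect. A toy back-of-the-envelope comparison (all numbers made up for illustration):

```cpp
#include <cstdio>

int main() {
    // Toy numbers: a 2048^2 shadow map stretched over 100 m of scene
    // versus a 4K camera with a 90-degree FOV looking at a wall 2 m away.
    const double mapTexels   = 2048.0;
    const double mapCoverage = 100.0;                    // meters
    const double texelSize   = mapCoverage / mapTexels;  // ~48.8 mm

    const double fovWidth  = 2.0 * 2.0;                  // 2*d*tan(45deg) = 4 m
    const double pixelSize = fovWidth / 3840.0;          // ~1.04 mm

    printf("shadow texel %.1f mm vs screen pixel %.2f mm: "
           "~%.0f screen pixels share one shadow texel\n",
           texelSize * 1000.0, pixelSize * 1000.0, texelSize / pixelSize);
    return 0;
}
```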

By 4K/60 do you mean with DLSS-P?

Nope, a 4090 gets around 4K/25fps native in Portal and is 3x faster than a 3090. Like I said, if that trend continues a 6090 would be ~10x faster than a 4090. 250 fps in Portal should be good enough for 60fps in something with a bit more geometric complexity. I fully expect it to continue because I think at least 2 of the IHVs will continue to spend transistors on RT.

DLSS/upscaling is a bonus and in 4 years will be everywhere anyway.
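For what it's worth, the back-of-the-envelope math spelled out; the 25 fps baseline and the ~3x-per-generation factor are the claims above, not measurements:

```cpp
#include <cstdio>

int main() {
    // Poster's figures, not measurements: ~25 fps native 4K in Portal RTX
    // on a 4090, and ~3x RT uplift per generation.
    double fps = 25.0;
    const double perGen = 3.0;
    const char* gpu[] = { "4090", "5090 (projected)", "6090 (projected)" };
    for (int i = 0; i < 3; ++i) {
        printf("%-18s ~%.0f fps\n", gpu[i], fps);
        fps *= perGen;
    }
    // 25 * 3 * 3 = 225 fps -- the "~10x a 4090, ~250 fps" ballpark above.
    return 0;
}
```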
 
Real time lighting matters more than most people think, even more than asset detail. Here is Dark Souls 3, a detail-rich but technologically relatively old game, with its assets directly ported from the From Software engine to UE5 with Lumen applied. It's an insane difference with that change alone.

Meh? Lighting is better in Demon's Souls.
 
Real time lighting matters more than most people think, even more than asset detail. Here is Dark Souls 3, a detail-rich but technologically relatively old game, with its assets directly ported from the From Software engine to UE5 with Lumen applied. It's an insane difference with that change alone.


That looks terrible, all I see is how dated the assets look.
 
I think it’s reasonable to expect RT to consume a larger % of transistor budget in future generations from all 3 IHVs. A 4090 is ~6x faster than a 2080 Ti in Portal RTX. If that trend continues a 6090 should easily do 4K/60 pathtraced with geometric complexity of something like GTA V. The next big CP2077 graphics update is supposedly going to attempt raytraced primary visibility with RTXDI. That would be a good sign of things to come.
Portal RTX seems like an extreme outlier. The 'jump' you're basing things on seems quite exaggerated compared to what we see elsewhere, including in something like Quake 2 RTX.

I would be very hesitant to use that as a basis for the rate of improvement going forward, even ignoring the very real possibilities of diminishing returns on both GPU improvements and ray tracing improvements specifically. Lovelace in general is not going to be a typical leap, far from it. Nvidia have far less room for improvement with their next generation, even if they were to adopt TSMC's absolute latest N3E node in late 2024, which they may well not do for the consumer graphics cards.
 
Worst case, they could reduce the raster performance in favour of increasing the RT.

It's not as if the high-end GPUs can't afford to give up a bit of raster performance, as I feel the level on offer at the moment will easily last the rest of this console generation.

I can't see my 4070 Ti needing to be replaced in a few years because it can't run multiplat games at 60fps.
 
Quake 2 RTX and Minecraft RTX show an approximate 3.7x performance increase going from the 2080 Ti to the 4090. Blender and other rendering apps show about 3.5x to 3.7x too.
The 3DMark Path Tracing feature test shows a 4.3x to 4.5x performance uplift from the 2080 Ti to the 4090.
Portal RTX uses SER (Shader Execution Reordering), pushes path tracing to its limits, and shows about a 5.5x performance uplift going from the 2080 Ti to the 4090.

The trend is obvious: the more ray tracing you push, the bigger the uplift.
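Putting those quoted uplifts side by side makes the correlation easier to see (midpoints taken where a range was given):

```cpp
#include <cstdio>

int main() {
    // The 2080 Ti -> 4090 uplifts quoted above, ordered by how RT-heavy
    // the workload is.
    struct Row { const char* workload; double uplift; };
    const Row rows[] = {
        { "Quake 2 RTX / Minecraft RTX",          3.7 },
        { "Blender & other renderers",            3.6 },  // quoted 3.5-3.7x
        { "3DMark Path Tracing feature test",     4.4 },  // quoted 4.3-4.5x
        { "Portal RTX (full path tracing, SER)",  5.5 },
    };
    for (const Row& r : rows)
        printf("%-38s %.1fx\n", r.workload, r.uplift);
    return 0;
}
```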
 
The Outer Worlds: Spacer's Choice Update looks to be a Witcher 3-level disaster, if not worse.

It's shit on every platform, but I got it on PC, and not surprisingly there's tons of shader stutter. You can lower the settings and use dynamic res to get 60, but the constant shader stutter makes that unsustainable on any hardware combo. Submitted a refund immediately.

It was done by Virtuos Studios btw - the guys responsible for the port of Horizon: Zero Dawn that Guerrilla had to come in and fix. Considering that and the lack of 'modern' (cough) features like DLSS/FSR, I'm not entirely hopeful we'll see patches that address these issues to any significant degree.

Really...why? What is the impetus to push something out this broken right now? It's March 7th my man, not Dec 23rd. It can wait a few more weeks.
 
The Outer Worlds: Spacer's Choice Update looks to be a Witcher 3-level disaster, if not worse.

It's shit on every platform, but I got it on PC, and not surprisingly there's tons of shader stutter. You can lower the settings and use dynamic res to get 60, but the constant shader stutter makes that unsustainable on any hardware combo. Submitted a refund immediately.

It was done by Virtuos Studios btw - the guys responsible for the port of Horizon: Zero Dawn that Guerrilla had to come in and fix. Considering that and the lack of 'modern' (cough) features like DLSS/FSR, I'm not entirely hopeful we'll see patches that address these issues to any significant degree.

Really...why? What is the impetus to push something out this broken right now? It's March 7th my man, not Dec 23rd. It can wait a few more weeks.
JFC
 
The Outer Worlds: Spacer's Choice Update looks to be a Witcher 3-level disaster, if not worse.

It's shit on every platform, but I got it on PC, and not surprisingly there's tons of shader stutter. You can lower the settings and use dynamic res to get 60, but the constant shader stutter makes that unsustainable on any hardware combo. Submitted a refund immediately.

It was done by Virtuos Studios btw - the guys responsible for the port of Horizon: Zero Dawn that Guerrilla had to come in and fix. Considering that and the lack of 'modern' (cough) features like DLSS/FSR, I'm not entirely hopeful we'll see patches that address these issues to any significant degree.

Really...why? What is the impetus to push something out this broken right now? It's March 7th my man, not Dec 23rd. It can wait a few more weeks.
Apparently Virtuos is the team working on the supposed Metal Gear Solid 3 remake as well... yay...
 
Portal RTX seems like an extreme outlier. The 'jump' you're basing things on seems quite exaggerated compared to what we see elsewhere, including in something like Quake 2 RTX.

That’s fair. Portal benefits from optimizations like SER, which tilt the scale in Lovelace’s favor, so maybe it is an outlier. Future architectures could have even more tricks in store though.

I would be very hesitant to use that as a basis for the rate of improvement going forward, even ignoring the very real possibilities of diminishing returns on both GPU improvements and ray tracing improvements specifically. Lovelace in general is not going to be a typical leap, far from it. Nvidia have far less room for improvement with their next generation, even if they were to adopt TSMC's absolute latest N3E node in late 2024, which they may well not do for the consumer graphics cards.

I’m also banking on accelerated investment in RT hardware relative to other parts of the chip.

I have the opposite view on future scalability of raytracing hw. The hardware is in its infancy and the software is even less mature. We’re at the GeForce 3 level of RT right now.

The main thing dampening my optimism is the continued poor utilization of graphics hardware. RT adds yet another source of divergence and latency. In that respect I do agree we’re facing diminishing returns unless transistors are also spent on making better use of the massive hardware resources already available. There’s no reason a 4090 should be getting only 66fps in Dead Space. Absolutely ridiculous.
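On the divergence point, a toy model shows why it hurts utilization so badly: a SIMD wave only retires when its slowest lane finishes, so uneven per-ray work wastes lanes. The 32-wide wave and the step counts below are made up for illustration:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Toy model of a 32-wide wave tracing 32 rays. Each ray needs a
    // different number of BVH traversal steps; the wave's cost is the
    // max over lanes, not the mean.
    const int width = 32;
    long total = 0, slowest = 0;
    for (int lane = 0; lane < width; ++lane) {
        const int steps = 4 + (lane * 7) % 29;  // uneven workload, 4..32
        total += steps;
        slowest = std::max<long>(slowest, steps);
    }
    const double mean = double(total) / width;
    printf("mean steps/ray: %.1f, wave executes: %ld steps, "
           "utilization: %.0f%%\n", mean, slowest, 100.0 * mean / slowest);
    return 0;
}
```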
 