Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Status
Not open for further replies.
Non-RT workloads haven't stayed stuck at cross-gen 2020 levels.

Neither has geometry.

Also, Starfield doesn't use hardware RT units, so the acceleration they can offer to Cyberpunk won't help Starfield. At all.

The idea that a non-RT game cannot, or should not, be more demanding than an RT game is one that people need to dissuade themselves of.

"Ray Tracing" is not a set workload with a set cost.
Path tracing on a CPU versus path tracing on RT hardware. Who would have thought the RT hardware would win out?
 
Yes. I don't know what is going on right now, where a game with an in-house engine and in-house solutions is disparaged for not looking or performing like competing solutions, while people simultaneously complain that consolidation in the middleware market is making every game look and perform the same. So yeah, Starfield's GI doesn't look as good as Lumen or hardware RT solutions. But isn't that what makes Lumen and hardware RT special? I'd also point out that Starfield's performance isn't out of line with recent Unreal Engine releases. Immortals of Aveum, for example, is often sub-60 fps at 1440p on an RTX 3080 and can't maintain 60 fps without upscaling. A 3080 performs roughly the same in Starfield: it needs upscaling to maintain 60 fps.

Parts of Starfield look good. Other parts look good for a Bethesda game. The game uses reasonable amounts of VRAM while maintaining high-quality materials throughout, unlike several other recent releases. And it doesn't have shader-compilation stutter, again unlike several other recent releases. And it launched in a fairly stable state on all platforms. Sure, there were missing features (DLSS/XeSS, HDR settings, FOV, AF), though some of those might not have been added for contractual reasons, and others (like AF) might be missing for technical reasons.
Most people are not complaining that Starfield doesn't look as good as Lumen. Starfield doesn't look as good as Red Dead Redemption 2 on PC once you leave the indoor areas. Is Rockstar also using Lumen? Please, can we stop this? The long and short of it is that Bethesda is free to use their in-house engine. The complaint is that they haven't done nearly enough to modernize it and make it performant. The complaints about its performance and visuals are completely justified. It uses traditional rasterization techniques, which I applaud, but implements them so poorly and in such a non-performant manner that it must be criticized.
 
Cyberpunk 2.0 with ray tracing on a 4070 at 1080p: 58 FPS
Starfield on a 4070 at 1080p: 56 FPS

Twilight zone when ray tracing is cheaper than pure rasterizing.
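For context, converting those quoted averages to frame times (the fps figures come from the post above; the conversion is just 1000/fps) shows the 2 fps gap is well under a millisecond:

```python
# Convert the quoted fps averages to per-frame times in milliseconds.
def frame_time_ms(fps):
    return 1000.0 / fps

cyberpunk_rt = frame_time_ms(58)   # ~17.2 ms per frame
starfield    = frame_time_ms(56)   # ~17.9 ms per frame
print(round(cyberpunk_rt, 1), round(starfield, 1))
```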
Their results can't be right, can they? If you filter out the DLSS results, it says the 4090 gets 41 fps with Ray Tracing Overdrive, without DLSS, at 4K native. Isn't that a drastic improvement over launch?
 
Tracing rays and computing the results is much heavier than any other workload.

And Starfield is stuck at a pre-2020 level:
(attached screenshot)
 
it says the 4090 gets 41 fps with raytracing overdrive without DLSS at 4k native? Isn't that a drastic improvement from launch?
That's in Ray Tracing, which is just Psycho Ray Tracing without Path Tracing.

For Path Tracing, check the other tab.

Also check here.

Sorry about that. Anyway, raster performance didn't change much in Cyberpunk Phantom Liberty (though the game became more VRAM-heavy, so I'm going to focus on 1440p to alleviate that bottleneck).

[chart: performance-2560-1440.png]


Though I think RT performance got a bit heavier; a 3080 beats the 7900 XTX in Ultra Ray Tracing.

[chart: performance-rt-2560-1440.png]



With Path Tracing, yeah, a 2080 Ti has the upper hand.

[chart: performance-pt-2560-1440.png]
 
Don't know why everyone's so stuck on Starfield. It's Bethesda; they haven't given a shit about tech at any point since Oblivion.

Heck, by a lot of measures it's actually much better than their past decade-plus. They actually stream a ton of stuff despite the ancient level design, and the characters just move flatly instead of looking like hideous horror shows. Overall, it's an improvement! Maybe by the time ES6 rolls around they'll hit "competent B- grade" levels all around.
 
RDR2 doesn't use real-time GI, though. That's why I compared it to Lumen. Real time is always going to have a performance penalty, and performance optimizations to real-time work often come with visual trade-offs.

And I'll say this again: Immortals of Aveum is a game running on competing, modern technology (UE5) with much the same frame rate as Starfield on PC. On console, Starfield runs at a higher resolution on Series S (900p) than Aveum runs on Series X or PS5 (720p).
 
This website is unreliable. They don't run all these tests; they run a handful of them and extrapolate the data, or they just make it up. It doesn't make sense how many GPUs they could test at three resolutions in such a short amount of time.

At times I've seen them post hundreds upon hundreds of results for games that had been released only hours earlier, and their data didn't match other, more credible mainstream outlets.
 
Also, I can't be the only one a bit puzzled by the low VRAM usage? A lot of the material and texture work in Starfield looks pretty great, but I've also spotted frequent low-resolution textures and materials.

Surely all that unused VRAM is going to waste, isn't it? Shouldn't the objective (at least on the Series X) be to use as much VRAM as is available for the highest possible quality of assets while maintaining the target performance? Or would higher-quality textures tank the performance even while remaining under the VRAM cap?

The game should be commended for having such high-quality assets/textures at such a low VRAM cost.
 
I suspect there are savings from not having baked light maps and shadow maps, and possibly from SFS micro-streaming only the textures that can be seen, in combination with very long, complex shaders, meaning you're doing more per shader and writing fewer buffers to VRAM.
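A hypothetical sketch of the idea behind SFS-style micro-streaming (this is not the real D3D12 sampler-feedback API, just the concept, with made-up sizes): the GPU reports which mip levels were actually sampled, and the streamer only keeps those resident:

```python
# Hypothetical sketch of sampler-feedback-driven streaming (concept only,
# not the D3D12 API): keep resident only the mips the GPU reported sampling.
def resident_bytes(top_mip_bytes, sampled_mips):
    # Mip n is one quarter the size of mip n-1.
    return sum(top_mip_bytes // (4 ** m) for m in sampled_mips)

top = 16 * 1024 * 1024                     # 16 MB top mip (assumed)
full    = resident_bytes(top, range(13))   # everything resident
# Distant object: feedback says only mips 4 and coarser were sampled.
distant = resident_bytes(top, range(4, 13))
print(full // 1024, distant // 1024)       # resident KB, full vs streamed
```

Under these assumed numbers, the distant object needs well under 1% of the full chain resident, which is the kind of saving the post above is pointing at.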
 
But where I'm going with that comment is: can we assume that Starfield is a "complex workload", or that its approach is the best one given the results (at least across all hardware)?

At least my impression from some of the wider reporting and narratives is that interpretation is being used to judge the overall validity of each hardware approach from a generic standpoint, when it seems, at least to me, to depend on the software involved.

We don't know that it's objectively the best approach. I'm sure better results are possible on Nvidia with the right tweaking. However, I've seen enough profiled games to know that low occupancy on my 3090 is a common problem and isn't unique to Starfield.
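A toy occupancy estimate illustrates the point (the per-SM limits below are Ampere-like values used as assumptions, not profiled numbers): register-heavy shaders cap how many warps can be resident at once, which is one common cause of the low occupancy mentioned above:

```python
# Toy GPU occupancy estimate (assumed Ampere-like per-SM limits):
# the register file is shared, so greedy shaders starve warp residency.
def occupancy(regs_per_thread, regs_per_sm=65536,
              threads_per_warp=32, max_warps_per_sm=48):
    warps_by_regs = regs_per_sm // (regs_per_thread * threads_per_warp)
    active = min(warps_by_regs, max_warps_per_sm)
    return active / max_warps_per_sm

print(occupancy(32))    # light shader: register file is not the limiter
print(occupancy(128))   # register-heavy shader: occupancy drops to a third
```

Low occupancy isn't automatically bad (a long shader may hide latency internally), but it narrows the GPU's options for hiding memory stalls.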
 
Watching a bunch of the content on DLSS ray reconstruction, it seems to have upsides and downsides, at least as integrated into Cyberpunk. Specular reflections are clearly better, but in some cases diffuse reflections are worse. Temporal response is improved. The performance numbers are interesting: Gamers Nexus showed more of a performance improvement with it on (from simplified denoising), whereas Hardware Unboxed found almost no improvement at all.
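For reference, the baseline that hand-tuned denoisers (and now ray reconstruction) build on is temporal accumulation; a minimal sketch with made-up sample values, showing how a noisy signal is blended into a running history:

```python
# Minimal temporal accumulation sketch: blend each noisy frame into a
# running history with weight alpha (the values below are made up).
def accumulate(history, noisy_frame, alpha=0.1):
    return [(1 - alpha) * h + alpha * n for h, n in zip(history, noisy_frame)]

# Noisy samples of a constant signal (true value 1.0) settle over frames.
history = [1.3, 0.6, 1.1, 0.9]
for noisy in ([0.8, 1.2, 0.7, 1.4], [1.1, 0.9, 1.2, 0.6]):
    history = accumulate(history, noisy)
print([round(h, 2) for h in history])
```

The trade-off the videos describe falls out of this structure: heavier history weighting means less noise but slower temporal response, which is exactly what a learned replacement like ray reconstruction tries to balance better.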
 
Performance can only improve when the different workloads are visible, so just direct light or just GI might not be enough.

But the quality of the reflections is ridiculous. Here is a comparison between DLSS Q with and without RR, and DLAA, at 3440x1440: https://imgsli.com/MjA3ODA4/0/1

Reminds me of DLSS 1 on Battlefield 5. It was very good at reconstructing reflections; DLSS 2 was in most cases not on par with DLSS 1.
 
So does anyone know of any games which use Sampler Feedback Streaming on Xbox hardware? They touted it so hard during the lead in to launch.. and we haven't heard anything about it since.
 
So does anyone know of any games which use Sampler Feedback Streaming on Xbox hardware? They touted it so hard during the lead in to launch.. and we haven't heard anything about it since.
Maybe when The Coalition releases something, we'll see it in action. Unfortunately, just like the hardware VRS that certain groups were shouting about pre-launch, I expect its implementation to be bad.
 