Path tracing on a CPU versus path tracing on RT hardware. Who would have thought the RT hardware would win out?

Non-RT workloads haven't stayed stuck at cross-gen 2020 levels.
Neither has geometry.
Also, Starfield doesn't use hardware RT units, so the performance acceleration that they can offer to Cyberpunk won't help Starfield at all.
The idea that a non-RT game cannot be, or should not be, more demanding than an RT game is one that people need to dissuade themselves of.
"Ray Tracing" is not a set workload with a set cost.
Most are not complaining that Starfield doesn't look as good as Lumen. Starfield doesn't look as good as Red Dead Redemption 2 on the PC once you leave the indoor areas. Is Rockstar also using Lumen? Like, please, can we stop this? The long and short of it is that Bethesda is free to use their in-house engine. The complaint is that they haven't done nearly enough to modernize it and make it performant. The complaints about its performance and visuals are completely justified. It uses traditional rasterization techniques, which I applaud, but implements them poorly, in such a non-performant manner that it must be criticized.

Yes. I don't know what is going on right now, where a game with an in-house engine, using in-house solutions, is being disparaged because it doesn't look or perform like competing solutions, while at the same time people complain that consolidation in the middleware market is making every game look and perform the same. So yeah, Starfield's GI doesn't look as good as Lumen, or as good as RT solutions. But isn't that what makes Lumen and the hardware RT solutions special?

I'd also like to point out that Starfield's performance isn't out of line with recent Unreal Engine releases. Immortals of Aveum, for example, is often sub-60fps at 1440p on an RTX 3080 and can't maintain 60fps without upscaling. A 3080 is roughly the same in Starfield: it needs upscaling to maintain 60fps.
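For reference, "needs upscaling" mostly means a lower internal render resolution. Here is a small sketch using the commonly quoted per-axis scale factors for DLSS/FSR2 presets; the exact ratios are approximate and can vary by game:

```python
# Approximate per-axis scale factors for common upscaler quality presets.
MODES = {"Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(out_w, out_h, scale):
    return round(out_w * scale), round(out_h * scale)

for mode, scale in MODES.items():
    w, h = internal_resolution(2560, 1440, scale)
    print(f"{mode:>11}: renders ~{w}x{h}, reconstructed to 2560x1440")
```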
Parts of Starfield look good. Other parts of Starfield look good for a Bethesda game. This game uses reasonable amounts of VRAM while maintaining high-quality materials throughout, unlike several other recent releases. It doesn't have shader stutter, unlike several other recent releases. And it launched in a fairly stable state on all platforms. Sure, there were missing features (DLSS/XeSS, HDR settings, FOV, AF), though some of those might not have been added for contractual reasons, and others (like AF) might be missing for technical reasons.
Their results can't be right, can they? If you filter out DLSS results, it says the 4090 gets 41 fps with ray tracing Overdrive, without DLSS, at 4K native. Isn't that a drastic improvement from launch?
Tracing rays and computing these results is much heavier than any other workload.
That's in Ray Tracing, which is just Psycho Ray Tracing without Path Tracing.
Sorry about that. Anyway, the raster performance didn't change that much in Cyberpunk Phantom Liberty (though the game became more VRAM-heavy, so I am going to focus on 1440p to alleviate that bottleneck).
Though I think RT performance became a little heavier; a 3080 beats a 7900 XTX in Ultra Ray Tracing.
With Path Tracing, yeah, a 2080 Ti has the upper hand.
RDR2 doesn't use real-time GI, though. That's why I compared it to Lumen. Real-time is always going to have a performance penalty, and performance optimizations to real-time work are often going to have visual trade-offs.
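As a toy cost model of that trade-off (illustrative numbers only, not taken from RDR2, Lumen, or Starfield): baked GI pays its cost offline at build time, while real-time GI pays every frame, scaling with how much probe and ray work is done.

```python
# Toy numbers: a baked lightmap is roughly "one extra texture fetch" at runtime,
# while a real-time GI scheme re-traces probe rays every frame.
def realtime_gi_cost_ms(num_probes, rays_per_probe, ns_per_ray=50):
    return num_probes * rays_per_probe * ns_per_ray / 1e6

baked_cost_ms = 0.05  # assumed fixed lookup cost; lighting cannot change
dynamic_cost_ms = realtime_gi_cost_ms(num_probes=4096, rays_per_probe=64)
print(f"baked GI:     ~{baked_cost_ms} ms/frame (static lighting)")
print(f"real-time GI: ~{dynamic_cost_ms:.1f} ms/frame (lighting can change)")
```

Cutting probe counts or rays per probe claws back milliseconds, which is exactly the kind of optimization that shows up as a visual trade-off.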
This website is unreliable. They don't run all these tests; they run a handful of them and extrapolate the data, or they just make it up. It doesn't make sense how many GPUs they could test at three resolutions in such a short amount of time.
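For what it's worth, here is a hypothetical sketch of how a site could fill in results it never measured: benchmark one or two cards in the new game and scale everything else by a pre-existing relative-performance index. I'm not claiming this is what they actually do, and the index numbers below are invented.

```python
# Hypothetical extrapolation from one measured result via a relative index.
relative_index = {"RTX 4090": 100, "RTX 4080": 78, "RX 7900 XTX": 75, "RTX 3080": 52}
measured_fps = {"RTX 4090": 41.0}  # the one result actually benchmarked

def estimated_fps(gpu):
    anchor, anchor_fps = next(iter(measured_fps.items()))
    return anchor_fps * relative_index[gpu] / relative_index[anchor]

for gpu in relative_index:
    tag = "measured" if gpu in measured_fps else "extrapolated"
    print(f"{gpu:>11}: {estimated_fps(gpu):5.1f} fps ({tag})")
```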
I suspect there are savings from not having baked light maps and shadow maps, and possibly SFS to micro-stream only the textures that can be seen, in combination with very long, complex shaders, meaning you're doing more per shader and writing fewer buffers to VRAM (rough numbers sketched below).

Also, I can't be the only one a bit puzzled by the low VRAM usage? A lot of the material and texture work in Starfield looks pretty great, but I've also spotted frequent low-resolution textures and materials.
Surely, all that unused VRAM is going to waste, isn't it? Shouldn't the objective be (at least on the Series X) to use as much VRAM as is available for the highest possible quality of assets while maintaining the target performance? Or would higher-quality textures tank the performance even while remaining under the VRAM cap?
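Here is a rough back-of-envelope on the savings speculated above; every number is an assumption for illustration: block-compressed textures where only the sampled tiles stay resident, plus whatever a baked-lightmap budget would have cost.

```python
# All figures assumed for illustration; BC1-class compression is ~0.5 byte/texel,
# and a full mip chain adds roughly a third on top of the base level.
def texture_mb(width, height, bytes_per_texel=0.5, mip_overhead=1.33):
    return width * height * bytes_per_texel * mip_overhead / 2**20

full_layer = texture_mb(4096, 4096)
print(f"full 4K material layer resident:   ~{full_layer:.0f} MB")

# If feedback-driven streaming keeps only ~15% of tiles at full resolution
# (the rest at a low mip), residency per texture drops sharply.
print(f"streamed residency (15% of tiles): ~{full_layer * 0.15:.0f} MB")

# A hypothetical budget of 300 baked 2K lightmaps that a fully dynamic
# lighting path simply never allocates.
print(f"avoided baked-lightmap budget:     ~{300 * texture_mb(2048, 2048, bytes_per_texel=1):.0f} MB")
```

Whether Starfield actually does any of this is speculation on my part, but it at least shows how a dynamically lit, feedback-streamed setup could undercut a traditional fully resident budget.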
The game should be commended for having such high-quality assets/textures at such a low cost in terms of VRAM.
But where I'm going with that comment is: can we assume that Starfield is a "complex workload", or that its approach is the best one given the results (at least across all hardware)?
My impression from some of the wider reporting and narratives is that results here are being interpreted to judge the overall validity of each hardware approach from a generic standpoint, when it seems, at least to me, to depend heavily on the software involved.
Maybe when The Coalition releases something, we'll see it in action. Unfortunately, just like the hardware VRS that certain groups were shouting about pre-launch, I expect its implementation to be bad.

So does anyone know of any games which use Sampler Feedback Streaming on Xbox hardware? They touted it so hard during the lead-in to launch, and we haven't heard anything about it since.