Star Wars: Jedi Survivor [PS5, XBSX|S, PC]

I think it's worth noting that despite its horrible CPU issues, JS is actually quite forgiving on the GPU. At least on the face of it, based on these TPU benchmarks:


The 3060 is easily averaging over 30 fps at Ultra RT settings / 1080p, which, based on the DF analysis, is presumably quite a reasonable match for the PS5 quality mode.

The VRAM usage also doesn't seem to be hurting performance much at all looking at the 3070/2080Ti and 3080/4070 comparisons, despite the compared settings apparently exceeding 8GB and 10GB requirements. That is, unless it's showing up as additional stutter, but the 1% low charts don't suggest that.
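If you want to check for that kind of stutter in your own captures, a 1% low figure is easy to derive from a frametime log. A minimal sketch, assuming a PresentMon-style CSV with a per-frame msBetweenPresents column (the column name and the "mean of the slowest 1% of frames" convention are my assumptions; tools differ):

```python
# Sketch: average fps and 1% low from a per-frame frametime log (ms).
# Assumes a PresentMon-style CSV; column name and the "mean of the
# slowest 1%" convention vary between tools.
import csv

def fps_stats(path: str, column: str = "msBetweenPresents") -> tuple[float, float]:
    with open(path, newline="") as f:
        frametimes_ms = [float(row[column]) for row in csv.DictReader(f)]

    avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

    # Take the slowest 1% of frames and convert their mean frametime to fps.
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)
    low_fps = 1000.0 / (sum(worst[:n]) / n)
    return avg_fps, low_fps

avg, low = fps_stats("capture.csv")
print(f"avg: {avg:.1f} fps | 1% low: {low:.1f} fps")
```

A big gap between the average and the 1% low is what stutter looks like in these charts, and the TPU lows don't show that gap.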

Also did I read somewhere that it actually loads faster on PC?

So it looks to me like it's the CPU issues really holding this game back on PC. If they can solve those along with the worst of the bugs (is the audio sync issue resolved yet?) then it shouldn't be half bad.

It's quite different from TLOU, for example, where you need a far more powerful GPU than you should to match console performance, and where VRAM limits really do kill performance.
 
They made really poor decisions WRT the visual makeup IMO. The RT is not adding nearly enough to justify the awful IQ. FSR 2 is bad reconstruction tech as well.
 
I may have been wrong to call this patch another "placebo patch"... many people are reporting massive gains. Unfortunately the stuttering is still there, but it's an impressive performance bump to say the least.

Which makes it all the more maddening that it came just a couple days after the game launched... like WTF. This is why we complain about stuff like this.
 
HUB are showing enormous gains for non-RT graphics. Up to 75% on the right (non-GPU-limited) hardware.

It's almost as if the game was still doing a lot of the RT CPU work even with the setting turned off.

Interestingly, HUB are also claiming most of the stutters are gone, although that's on a 13900K, and it didn't appear to be an in-depth assessment.

 
The game still stutters when you enter a new area or when running around this nonsense village thing in the second world.
 
The 4090 is still hilariously CPU-limited at 1080p and 1440p, and perhaps even at 4K with RT. The difference between 1080p RT and 4K RT is only 8fps.
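The tell is the frametime rather than the fps. A rough sketch of the reasoning, with hypothetical numbers rather than the actual HUB figures:

```python
# Hypothetical fps figures for illustration only -- not measured data.
fps_1080p_rt = 70.0
fps_4k_rt = 62.0  # only ~8 fps lower despite ~4x the pixels

ft_1080p = 1000.0 / fps_1080p_rt  # frametime in ms
ft_4k = 1000.0 / fps_4k_rt

# If the GPU were the limit, quadrupling the pixel count should move the
# frametime far more than this; a near-flat frametime across resolutions
# means a CPU-side floor is setting the frame rate.
print(f"1080p RT: {ft_1080p:.1f} ms/frame")
print(f"4K RT:    {ft_4k:.1f} ms/frame (delta {ft_4k - ft_1080p:.1f} ms)")
```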
 
Yeah, looking at that trace, oh man, that's terrible.
Any serious AAA game should have at least one, if not multiple, senior devs looking at Superluminal traces, and the various platform-specific ones, as a nearly full-time role.

Also, I'm hoping for the consoles to get a newer version of FSR soon; it seems the IQ in 2.1+ is a big improvement on the original 2.0, so that should help things a lot too.
 
The game looks really amazing; the detail and lighting are great.

[screenshot attachment]
 
Console vs PC comparison in this game is interesting. At native 1440p max RT settings, a 2070 Super delivers a locked 30fps most of the time while doing significantly more work than the PS5: the PC is using higher settings and a higher resolution, and is plagued by CPU bottlenecks when using RT.

At native 1080p, the 2070 Super is a 43ish fps performer.

 
Console vs PC comparison in this game is interesting. At native 1080p max RT settings, a 2070 Super delivers a locked 60fps most of the time, which is significantly more than PS5, and the PC is using higher settings, higher resolution, and is plagued by CPU bottlenecks when using RT.


That's just non-RT performance. In RT mode the 2070S gets a 43fps average with a 35fps minimum. Still comfortably above the consoles' locked 30fps in quality mode, although we don't know how high they would go if unlocked in that particular test scene. The DRS on the consoles can also go a bit above 1080p, and they have the added overhead of applying FSR2 on top of that. On the other hand, as you say, PC Epic settings are likely higher than those of the console.

So it's difficult to compare directly, but I'd say the takeaway here is that, on the face of it, the game doesn't seem to have any obvious GPU scaling issues on PC. The issues are all on the game code/CPU side.
 
I note VRAM usage is also pretty modest, topping out at 10.2GB at 4K with RT on max settings on a 4090, but interestingly only 7.7GB at the same settings on the 8GB card, which I guess suggests the game simply allocates more memory than it needs if it's available.
 
Yes, and this points to another problem with some of the VRAM usage analysis floating around: people are recording VRAM usage on 16GB and 24GB cards and assuming the game would use the same amount of memory on lower-VRAM cards. The only way to determine if a game is playable on an 8GB card is to test an 8GB card.
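If anyone wants to do that sanity check themselves, reported VRAM usage can be polled while the game runs. A minimal sketch using nvidia-smi's standard query fields; note it reports what's allocated on the whole card (all processes), which, per the above, can be far more than the game strictly needs:

```python
# Sketch: sample reported VRAM usage while the game runs.
# This is total allocation on the card, not what the game strictly needs.
import subprocess
import time

def vram_mib() -> tuple[int, int]:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = out.strip().splitlines()[0].split(", ")
    return int(used), int(total)

for _ in range(60):  # ~5 minutes at one sample per 5s
    used, total = vram_mib()
    print(f"VRAM used: {used} / {total} MiB")
    time.sleep(5)
```

Even then, allocation alone won't show whether the game is quietly dropping texture quality or stuttering to stay in budget, so actually playing on the 8GB card is still the real test.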
 