In combat it is indeed a 15fps difference.

Nah, I watched the video and on average it's not 15fps more on XSX; he probably mistyped and it should be 5fps, as he says in the video (not in the description).
So maybe he should have written that there are some scenes with a 15fps difference, not on average (that's why I like VGTech's stats).
Not like-for-like scenes, not the same number of enemies, etc. The ~5fps delta comes from like-for-like comparisons in a split screen.
There is no point in denying it, the performance difference is big, imo waaaaaay too big keeping in mind that the specs are roughly the same.
This game is (surprisingly) quite good looking, especially the characters! Maybe at 1800p both next-gen consoles should be able to hit a locked 60 FPS. 40-ish/50-ish FPS is just annoying.

They both have a 1620p mode with a steady 60fps as well.
This was a great analysis from him. So to summarize what NXGamer found in RE8 (a game that is properly using the I/O on both consoles) regarding loading times:
- Default internal: 2.3s (5500MB/s)
- Fastest: 2.1s (7000MB/s)
- Slowest (Series X): 2.7s (5000MB/s)
On average he didn't find a big difference from default to fastest (1% average), but there is already a bigger difference from default to slowest (3%). If I look at those numbers it would seem that transfer rate isn't the only factor at play.
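A back-of-envelope way to see why the gap is smaller than the raw bandwidth difference would suggest: if loading were purely transfer-bound, going from 5500MB/s to 7000MB/s should cut the time by roughly 20%, yet it only drops from 2.3s to 2.1s. The sketch below fits a simple, assumed "fixed cost + data/bandwidth" model to those two timings; the data size and fixed cost it derives are illustrative, not measured values.

```python
# Hypothetical two-component model: load_time = fixed_cost + data_size / bandwidth.
# Fitted to the two timings quoted above (2.3s @ 5500MB/s, 2.1s @ 7000MB/s);
# purely illustrative, the derived numbers are not measurements.

t_default, bw_default = 2.3, 5500.0   # seconds, MB/s (default internal drive)
t_fastest, bw_fastest = 2.1, 7000.0   # seconds, MB/s (fastest drive tested)

# Solve the pair of equations:
#   t_default = fixed + data / bw_default
#   t_fastest = fixed + data / bw_fastest
data = (t_default - t_fastest) / (1.0 / bw_default - 1.0 / bw_fastest)
fixed = t_default - data / bw_default

print(f"implied data read: ~{data:.0f} MB")                 # ~5133 MB
print(f"implied fixed (non-transfer) cost: ~{fixed:.2f} s")  # ~1.37 s

# If the load were purely transfer-bound, the faster drive should finish in:
pure_transfer = t_default * bw_default / bw_fastest
print(f"transfer-bound prediction: ~{pure_transfer:.2f} s vs observed {t_fastest} s")
```

On that (assumed) model, most of the load time would not be drive transfer at all, which fits the idea that something other than raw transfer rate sets the floor.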
The thing to think about here is that at some point, the limiting factor is no longer the actual "i/o" as in the drive transfer speed, but probably something else, like the CPU.
Been shouting this at clouds for the last year.
What? On PS5, when properly using the custom I/O, the CPU is completely bypassed.
You're still gated by game code execution that runs on the CPU.

That should not take more than a few frames, ideally one frame. But it's the same loading here with the same data and the same game, so it should take the same time.
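As an illustration of the disagreement here (a toy sketch, not how either console's loader actually works): even if transfer and decompression are fully handled by dedicated hardware and never touch the game thread, the point at which assets become usable still depends on CPU-side game code such as parsing, spawning objects and registering them with the scene. All names and timings below are made up.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def hardware_io(asset: str) -> str:
    """Stand-in for a transfer + decompress path that bypasses the CPU."""
    time.sleep(0.05)                       # pretend the I/O complex handles this
    return f"{asset}-raw"

def cpu_side_setup(blob: str) -> str:
    """Stand-in for game code: parsing, instantiating, registering with the scene."""
    time.sleep(0.10)                       # runs on the CPU no matter how fast the drive is
    return blob.replace("raw", "ready")

assets = ["level_geo", "textures", "enemies", "audio_banks"]

start = time.perf_counter()
with ThreadPoolExecutor() as pool:
    blobs = list(pool.map(hardware_io, assets))   # effectively "free" for the game thread
ready = [cpu_side_setup(b) for b in blobs]        # still serial game-code work
print(f"{len(ready)} asset groups usable after {time.perf_counter() - start:.2f}s")
```

Whether that CPU-side part takes one frame or many is exactly what the two posts above disagree on.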
Does the game use Sampler Feedback Streaming on Xbox? Or would that not make any difference for initial level loads because it (I'm assuming) needs rendered frames to "sample" and get "feedback"?
I've not seen any developer of any title say they're taking advantage of Sampler Feedback yet.
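For background on the question above: Sampler Feedback Streaming (a DirectX 12 Ultimate feature supported on the Series consoles) has the GPU record which texture mips were actually sampled while rendering, and the streaming system then loads only those parts. The sketch below is purely conceptual, not the real D3D12 API, and the texture names and mip numbers are invented; it mainly shows why the technique needs rendered frames before it can contribute, which matches the "initial level load" caveat in the question.

```python
# Conceptual sketch of a sampler-feedback-driven streaming loop (not real D3D12 code).
# Assumption for illustration: only low-resolution mips are resident at first, and the
# GPU reports the finest mip level each texture actually needed last frame.

LOWEST_MIP = 7  # small always-resident placeholder mips (mip 0 = full resolution)
resident_mip = {"rock_albedo": LOWEST_MIP, "cliff_normal": LOWEST_MIP}

def render_frame_and_collect_feedback() -> dict:
    """Stand-in for drawing a frame; returns the finest mip each texture was sampled at."""
    # In a real renderer this information comes back from GPU feedback maps.
    return {"rock_albedo": 2, "cliff_normal": 4}

def stream_requested_mips(feedback: dict) -> None:
    """Queue loads only for the mip levels the last frame actually asked for."""
    for tex, needed in feedback.items():
        if needed < resident_mip[tex]:
            print(f"streaming {tex}: mips {needed}..{resident_mip[tex] - 1}")
            resident_mip[tex] = needed

# The ordering is the point: feedback only exists after at least one frame has been
# drawn, so a cold level load still has to bring in some baseline set up front.
feedback = render_frame_and_collect_feedback()
stream_requested_mips(feedback)
```

So, as the question itself guesses, the win is mainly in steady-state streaming and memory footprint rather than in the very first load.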