Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

I really don't understand the "tflops are arbitrary" meme. Floating point operations are most of what gpus do. It's an extremely relevant metric -- so is bandwidth (xbox wins), bus size (xbox wins), architecture (they're mostly the same).
The reason that teraflops are sometimes called out as not meaningful is that the figure represents the theoretical maximum number of floating point operations the GPU can perform. In the history of modern GPUs, few have ever sustained that peak for more than a few microseconds, because keeping the GPU busy enough to do it relies on the CPU, excellent code, optimal data, excellent cache utilisation, plenty of bandwidth and literally no bottlenecks, and these are all things that are really, really difficult to architect. More so for third party games, where the balance of these things is different on every platform.

If you take fairly crude comparisons like the number of GPU CUs at a given clock speed against cache size (and percentage of cache hits) and the available cache and memory bandwidth per CU, you'll find the current crop of consoles (4 last gen, 3 this gen) are all over the place in terms of 'balance'.

So teraflops is not meaningless, but it's also not a very good indicator of what actual performance you will get from given hardware.
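As a concrete illustration of where the headline figure comes from, here is a minimal sketch (not from any poster above, just the usual arithmetic) that reproduces the quoted peak FP32 numbers from CU count and clock, assuming 64 FP32 ALUs per CU each retiring one FMA (2 flops) per cycle. It says nothing about what a real workload can sustain, which is exactly the point being made.

```python
# Minimal sketch: theoretical peak FP32 throughput for an RDNA2-style GPU.
# Assumes 64 FP32 ALUs per CU, each retiring one FMA (2 flops) per cycle,
# i.e. the idealised case that real workloads rarely sustain.

def peak_fp32_tflops(cus: int, clock_ghz: float,
                     alus_per_cu: int = 64, flops_per_alu: int = 2) -> float:
    return cus * alus_per_cu * flops_per_alu * clock_ghz / 1000.0  # Gflops -> Tflops

# Publicly quoted figures (PS5 clock is the variable-frequency cap).
print(f"PS5:      {peak_fp32_tflops(36, 2.23):.2f} TFLOPs")   # ~10.28
print(f"Series X: {peak_fp32_tflops(52, 1.825):.2f} TFLOPs")  # ~12.15
```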
 
I really don't understand the "tflops are arbitrary" meme. Floating point operations are most of what gpus do. It's an extremely relevant metric -- so is bandwidth (xbox wins), bus size (xbox wins), architecture (they're mostly the same). Microsoft's "most powerful console" claim is pretty safe

The rated theoretical maximum TFLOPs figure isn't the only indicator of the GPU's actual TFLOPs throughput in a real gaming scenario. The total theoretical bandwidth to off-chip memory isn't the only indicator of how often, and for how long, the GPU is stalling while waiting for that memory, much less the bus width. Effective memory bandwidth is what matters here.

But the last mistake here (which I confess I also made at some point) is claiming the PS5's and SeriesX's architectures are "mostly the same" just because they seemingly share the RDNA2 instruction set and Dual-CU arrangement, despite being quite different everywhere else. The PS5 has a lower number of ALUs per shader engine, and on top of that they run at 22% higher clocks, so a higher ALU occupancy on the PS5 should be expected.
The cache scrubbers should increase the proportion of cache hits within the L2 (otherwise why bother) and we don't know exactly how that affects the console's effective bandwidth. (We do know why AMD didn't want the cache scrubbers for their PC RDNA2 GPUs though, and it's because they're already spending a ton of die area on LLC in there)
The PS5 having a considerably faster I/O could also mean they don't need to cache as much data inside the system RAM, with assets coming in "on-the-fly" just a handful of frames before they're needed, and we again don't know how that will impact effective bandwidth.

And this is all before we start considering the differences between the SeriesX's VRS and Sony's own implementation for foveated rendering, the impact of the custom geometry processor, the PS5's higher pixel fillrate, etc.


So despite the SeriesX having the bigger headline numbers (max theoretical TFLOPs throughput and max theoretical bandwidth), the consoles' GPUs are actually substantially different in many of their other resources, so we can't really think that "10 vs 12 means 12 is 20% faster" and "448 vs 560 means 560 is 25% faster".
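For a rough feel of why the raw ratios can mislead, here is a small sketch that only rearranges the publicly quoted figures (SeriesX's 560 GB/s fast-pool bandwidth, PS5's 448 GB/s, and the peak TFLOPs from above) into per-CU and per-TFLOP terms; effective bandwidth, cache hit rates and occupancy are precisely what this arithmetic cannot capture.

```python
# Rearranging the headline figures only; says nothing about effective bandwidth or occupancy.
consoles = {
    "PS5":      {"cus": 36, "peak_tflops": 10.28, "bandwidth_gbs": 448},  # unified 448 GB/s
    "Series X": {"cus": 52, "peak_tflops": 12.15, "bandwidth_gbs": 560},  # 10 GB fast pool
}

for name, c in consoles.items():
    print(f"{name:9s} {c['bandwidth_gbs'] / c['cus']:.1f} GB/s per CU, "
          f"{c['bandwidth_gbs'] / c['peak_tflops']:.1f} GB/s per peak TFLOP")

ratio_tf = consoles["Series X"]["peak_tflops"] / consoles["PS5"]["peak_tflops"]
ratio_bw = consoles["Series X"]["bandwidth_gbs"] / consoles["PS5"]["bandwidth_gbs"]
print(f"Peak TFLOPs ratio: +{(ratio_tf - 1) * 100:.0f}%   Bandwidth ratio: +{(ratio_bw - 1) * 100:.0f}%")
```

Run as-is, this shows the per-CU bandwidth actually favouring the PS5 and the per-TFLOP bandwidth slightly favouring the SeriesX, which is one way of seeing that the two designs are balanced differently rather than one simply being "X% faster".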



The Vega 64 averages above 11 TFLOPs and 480GB/s of bandwidth, which are about the same big numbers as the GTX 1080 Ti. It never reached GTX 1080 Ti performance AFAIK.
 
The PS5 GPU's substantially higher clocks probably help it 'outmatch' the XSX the most: by increasing core clocks everything gets faster, instead of just increasing CU counts. Since the XSX and PS5 are rather close in TFLOPs (~1.9 TF difference?), the additional TFLOPs are not making up for the much higher clocks the PS5 GPU has. It's just that RDNA2 really likes extreme clocks, which was a bit of a surprise I think.
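To illustrate the "everything gets faster with clocks" point, a quick sketch of pixel fillrate, which scales with clock rather than CU count. The 64-ROP figure for both consoles is the commonly quoted one and should be treated as an assumption here.

```python
# Pixel fillrate scales with clock: ROPs * 1 pixel per ROP per clock * clock.
# 64 ROPs for both consoles is the commonly quoted figure (assumption noted above).

def pixel_fillrate_gpixels(rops: int, clock_ghz: float) -> float:
    return rops * clock_ghz  # one pixel per ROP per clock

print(f"PS5:      {pixel_fillrate_gpixels(64, 2.23):.1f} Gpixels/s")   # ~142.7
print(f"Series X: {pixel_fillrate_gpixels(64, 1.825):.1f} Gpixels/s")  # ~116.8
```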
 
I *seriously* doubt the XSX gets these spurious max frame times because it runs at a lower GPU clock. Some rarely used function is either extremely slow, or something spuriously disrupts something else, which I consider a software problem until proven otherwise.
 
We're taking questions from our PlayStation 5 thermal benchmark coverage for this Ask GN segment, trying to provide some insight into thermal testing for outsiders from our videos.
TIMESTAMPS & TOPICS
00:00 - PlayStation 5 Thermal Benchmark Q&A
00:57 - “Broken” Thermal Pads
03:26 - What Can Users Do About Thermals?
04:42 - What if My Ambient Temperature is Higher?
09:44 - Module Side vs. Module Top Measurements
11:42 - Why Does Removing Side Panels Improve Performance?
15:17 - What About A "More Demanding" Game?
17:01 - Defending Sony Needlessly
21:11 - Will These Temps Affect the Lifespan?
23:33 - Should I Mod My PS5?
25:55 - Would Adding an External Fan Help?
27:28 - Gaming Benchmarks
 
I meant exclusives that won't release on PS5.

The problem is that there may not be any pure Series X exclusives at all.

If they have to design around slow PCs + the slower XSS, that means compromises.

And even if they did make Series X-only games, those would still carry the burden of Windows + Xbox One + XSS common tools + DirectX.

I mean, if they had only the Series X, with custom dev tools and hardware that didn't have to care about PCs or slower consoles, maybe they could get "95% out of the hardware"; as it stands it's not as specialized, so maybe "85-90%".

(Compared to the PS5, with more efficient APIs and dev tools made only for PS5.)

Numbers pulled from a hat, as this is speculation.
 
The differences this time around are really minimal on PS5 and Series X. Both run at the same resolution (2160p) at 60fps. I've only found very slight differences in grass draw distance and antialiasing in favor of PS5, but for all practical purposes there are no differences. In the case of Series S, the game runs at a maximum resolution of 1080p. It also appears to have lower hair density on the next-gen character models.
Xbox Series S: 1920x1080p / 60fps | 36.6 GB
Xbox Series X: 3840x2160p / 60fps | 36.6 GB
PlayStation 5: 3840x2160p / 60fps | 42.01 GB

Due to the high number of platforms this game releases on, I will make several videos to cover as many requests as possible. Immortals Fenyx Rising on next-gen offers us 2 game modes. In quality mode, PS5 and Series X run at 2160p at 30fps and Series S at 1440p. The draw distance in this mode is somewhat greater, and the range of light sources and some particle effects also increases. In performance mode, the framerate increases to 60fps. Resolution on PS5 and Series X in this mode is dynamic, between 2160p and 1440p. In some areas, the resolution on PS5 is somewhat higher (1620p) than on Series X (1440p). On Series S, the resolution is locked to 1080p. The draw distance is slightly higher on Series X, and there also seem to be some issues with the shadows on PS5 in quality mode. I definitely recommend using performance mode on any of the three platforms.
Series S: 2560x1440p / 30fps or 1920x1080p / 60fps | 24 GB
Series X: 3840x2160p / 30fps or 1440p~2160p / 60fps | 24 GB
PS5: 3840x2160p / 30fps or 1440p~2160p / 60fps | 21.41 GB


Per VG
"PS5 in Performance Mode currently has a bug that causes stuttering unrelated to the games frame rate.

PS5 in Performance Mode uses a dynamic resolution with the highest native resolution found being 3840x2160 and the lowest native resolution found being approximately 2275x1280. PS5 in Performance Mode rarely renders at a native resolution of 3840x2160.

Xbox Series X in Performance Mode uses a dynamic resolution with the highest native resolution found being 3840x2160 and the lowest native resolution found being 1920x1080. Xbox Series X in Performance Mode rarely renders at a native resolution of 3840x2160.

The scene at 0:46 was where the lowest resolution was found and this scene seems to render at a higher average resolution on PS5 in Performance Mode than Xbox Series X in Performance Mode. However, in other scenes the PS5 and Xbox Series X render at a similar resolution in Performance Mode, and in some cases the Xbox Series X can render at a higher resolution than PS5.

PS5 and Xbox Series X in Performance Mode use a form of temporal reconstruction to increase the resolution up to 3840x2160 when rendering natively below this resolution.

Xbox Series S in Performance Mode uses a dynamic resolution with the highest native resolution found being 1920x1080 and the lowest native resolution found being approximately 1280x720. Xbox Series S in Performance Mode uses a form of temporal reconstruction to increase the resolution up to 1920x1080 when rendering natively below this resolution.

The only resolution found on PS5 in Quality Mode was 3840x2160.

Xbox Series X in Quality Mode uses a dynamic resolution with the highest native resolution found being 3840x2160 and the lowest native resolution found being approximately 3328x1872. Drops in resolution below 3840x2160 on Xbox Series X in Quality Mode seem to be uncommon.

Xbox Series S in Quality Mode uses a dynamic resolution with the highest native resolution found being 2560x1440 and the lowest native resolution found being 1920x1080. Xbox Series S in Quality Mode uses a form of temporal reconstruction to increase the resolution up to 2560x1440 when rendering natively below this resolution."
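To put those dynamic resolution bounds in perspective, a quick sketch that just multiplies out the resolutions quoted above and compares them against a native 3840x2160 pixel count:

```python
# Pixel counts for the DRS bounds quoted above, as a fraction of native 3840x2160.
native = 3840 * 2160

bounds = {
    "PS5 performance, lowest (2275x1280)":        2275 * 1280,
    "Series X performance, lowest (1920x1080)":   1920 * 1080,
    "Series X quality, lowest (3328x1872)":       3328 * 1872,
}

for label, pixels in bounds.items():
    print(f"{label}: {pixels / native:.0%} of native 4K")
# -> roughly 35%, 25% and 75% respectively
```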
 
The first post should really be a list of all the games being compared, the date of each comparison, and a short summary or a link to the post where the differences are explained.

For now it's easy to remember which console runs games the best because one console 'won' everything, but it might change in the future. So while the list would be all blue now, maybe in a while, there will be some green as well?
 
Depends on the mode used; in some titles the SeriesX performs better in 3 out of 4 modes.

Also, it's a technical discussion, not a list war. If you want list wars then go elsewhere.
 
Depends on the mode used; in some titles the SeriesX performs better in 3 out of 4 modes.

Also, it's a technical discussion, not a list war. If you want list wars then go elsewhere.

I highly doubt that, but this is beside the point.
Also, you don't understand: it is not meant as a list war, it's meant as an index giving a summary of the current situation, and linking to highly informative posts, like for example the excellent summary which Shortbread gave a few posts above this.

It's also easier to analyse the results if they're presented in a list: maybe there is a pattern of open world games performing better on a certain architecture, or of high FPS modes favoring slower memory bandwidth.

With the current situation you can have posters like yourself claiming that some titles (plural) perform better in 3 out of 4 modes, which I am not even going to get into.
With a clear list, visitors can immediately see whether such claims are true or just wishful thinking.
 
Just posted about FIFA in the DF thread, so I looked up videos. This one isn't great (no real like-for-like comparisons: different camera angles, lighting situations, cutscenes, etc.), but it looks like a hard game to capture like for like. Going by this video, both consoles seem to dip a little below the 30 target when they switch to "cutscenes", but the PS5 has more tearing and a harsh, prolonged dip to 20 (in what looks like a heavy workload: a bunch of close-up players, and there aren't any in the Xbox Series X cutscenes shown). Both effortlessly lock 60 in the gameplay shown. Also, the Series S is the real star here imo; it has greatly toned down settings but looks like it runs pretty great considering.

(In some shots the grass shader distance also looks farther on Xbox, but it's really impossible to tell without more similar captures and better analysis.)
 