Sony PlayStation 5 Pro

Uncharted wasn't unplayable at 30 fps, and it didn't suffer from low sales. As mentioned before on this board, though: does the prevalence of OLED screens make 30 fps less tolerable than it was on the displays of the past?

It is for me. I can't stand the stutter; it's something that didn't bother me on my old LED.

But I reckon there are people who either don't notice it or for whom it's not an issue.
 
@Dictator @oliemack Can one of you guys confirm which DLSS mode was used? I'm seeing some people online saying it's 4K using DLAA. The image says DLSS, so I'm assuming DLSS Quality, but it doesn't specify.

Thank you.
[attached screenshot: kSMNGQu.jpeg]
 
Not quite how this works. You're assuming bandwidth is unlimited in these scenarios and that it's a pure compute play when you do these calculations.
Firstly, the Ampere series of GPUs is commonly misquoted on tensor throughput; that's not your fault. But the 3070 Ti's 174 tensor TOPS in this case is a with-sparsity figure. It's actually only 87 Tensor TOPS dense INT8.

This point is largely missed, and I think it's a critical marketing issue, frankly.
Tensor Cores, and large matrix-accumulate silicon in general, are measured very differently from what you're measuring on the CUs or SMs. Those figures are 8-bit integer tera operations. If it were 32-bit floating point, it would be called a TFLOP: tera floating-point operations.

So the reason the PS5 Pro is quoted at 300 TOPS is that the figure is really just dual issue, with 32-bit lanes cut down to 8-bit, and sparsity for another 2x on top.
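Just to make that multiplier stacking concrete, here's a back-of-the-envelope sketch in Python. The 60 CU / ~2.18 GHz figures are leaked/assumed numbers, not confirmed specs, and the stack lands near (not exactly at) the quoted 300 TOPS; the exact figure would depend on clocks and whatever ML hardware Sony actually added:

```python
# Back-of-envelope sketch: how a headline "TOPS" figure is built up
# from a base FP32 rating. CU count and clock for the PS5 Pro are
# leaked/assumed values, not confirmed specs.

def fp32_tflops(cus: int, ghz: float, lanes_per_cu: int = 64) -> float:
    """Dense FP32 TFLOPS: lanes * 2 ops per FMA * clock."""
    return cus * lanes_per_cu * 2 * ghz / 1000.0

def int8_tops(tflops: float, dual_issue: bool, sparsity: bool) -> float:
    """Stack the usual multipliers onto the FP32 base:
    4x for packing four 8-bit ops into each 32-bit lane,
    2x for dual issue, 2x for structured sparsity."""
    tops = tflops * 4
    if dual_issue:
        tops *= 2
    if sparsity:
        tops *= 2
    return tops

base = fp32_tflops(cus=60, ghz=2.18)  # ~16.7 TFLOPS with the assumed specs
print(f"dense FP32:            {base:.1f} TFLOPS")
print(f"int8, dual issue:      {int8_tops(base, True, False):.0f} TOPS")
print(f"+ sparsity (headline): {int8_tops(base, True, True):.0f} TOPS")  # ~270, in the ballpark of 300

# The same sparsity halving applies to the 3070 Ti's marketing number:
print(f"3070 Ti dense int8:    {174 / 2:.0f} Tensor TOPS")  # 87
```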

Tensor Cores, and equivalent silicon, are rated in TOPS, but those aren't plain tera ops; they are tensor tera ops. And that little word "tensor" being dropped from the front makes a world of difference. What a tensor core can complete in a single cycle takes a CU _many_ cycles to complete. They are very different pieces of silicon: the CU is a general-purpose, high-performance SIMD/SIMT unit, and that architecture is designed to hold precision.

The tensor core is a large-scale matrix multiply-accumulate unit that is very happy to toss precision in favor of completing as much work as possible in a single cycle. It works so fast that it's always bandwidth limited; it's probably idle most of the time, because there's just not enough data for it to crunch. The catch is that a tensor core is so specialized it runs only one family of AI algorithms, neural networks, out of the many that exist. It cannot be used for anything else; everything else requires the CUs.

It's worth reading about how tensor cores work; I've linked the blog post above. But in case you don't want to:

Tensor Core
  • Global memory access (up to 80 GB): ~380 cycles
  • L2 cache: ~200 cycles
  • L1 cache or shared memory access (up to 128 KB per Streaming Multiprocessor): ~34 cycles
  • Fused multiply-add, a*b+c (FFMA): 4 cycles
  • Tensor Core matrix multiply: 1 cycle
  • Shared memory accesses needed: 1 × 34 cycles
General SM
  • The same matrix multiply: 32 cycles of FFMAs, plus 8 × 34 cycles of shared memory accesses

From a compute perspective, the tensor cores are 32x faster.
The problem is that on both sides there is memory, and latency in getting that memory into caches, to serve the units. And that cost is a flat rate whether the data feeds the general compute path or the tensor path, since the tensor cores sit inside the SM.
So the only reason we don't see more performance out of the tensor cores is, quite simply, that they cannot be fed any faster.
Larger GPUs with more tensor cores only go faster at this because those tensor cores are spread across more SMs, and more SMs come paired with more bandwidth. There's nothing they can really do about it either: memory takes ~200 cycles to arrive, and the tensor cores sit around doing nothing in the meantime.

Quite simply, you're looking at bandwidth limitations here, which is why tensor cores aren't just running away with it. Memory latency keeps them idle, so you're looking at closer to a 2x improvement overall in the worst case; with good latency hiding you're looking at upward of 9x faster.
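To show where those figures come from, here's the arithmetic as a toy Python snippet, using only the cycle counts quoted above (illustrative only; real kernels overlap memory and compute in ways this ignores):

```python
# Toy cycle-count comparison using the latency figures quoted above.
# Ignores real-world pipelining; it's only meant to show where the
# 32x, ~9x, and ~2x figures come from.

SMEM = 34  # shared memory access, cycles

# Tensor core path: one shared-memory access feeding one 1-cycle matmul
tensor_cycles = 1 * SMEM + 1             # 35

# General SM path: 8 shared-memory accesses plus 32 cycles of FFMAs
sm_cycles = 8 * SMEM + 32                # 304

print(f"compute only:      {32 / 1:.0f}x")                    # 32x faster
print(f"with smem traffic: {sm_cycles / tensor_cycles:.1f}x")  # ~8.7x, i.e. "upward of 9x"

# If neither path can hide a ~200-cycle L2 / ~380-cycle DRAM stall,
# both units sit idle waiting on the same memory, and the advantage
# collapses toward the ~2x worst case described above.
```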

But the PS5 Pro shares everything, and it's extremely bandwidth limited: it shares bandwidth with the CPU and loses some of it as a result, it loses bandwidth to rendering, and of course it now has to do AI upscaling on top of that.

So it's not as simple as counting cycle operations and concluding it's anywhere from half the speed to 10x slower than tensor cores. A tensor core can take a memory access in 34 cycles and complete its job one cycle later, all while the SMs do their own work in parallel.

It's very different.
Thank you so much for the in-depth response; that extra level of detail is exactly why I come here. I have to digest all that now.
 
Oh, the circled regions don't matter. It's from this video.

Timestamped.

Curious, but how are you arriving at this number for the Pro? Didn't they say 300 TOPs?
It was just an estimate to answer my own question about worst-possible performance, assuming the CUs are RDNA3 but at the PS5 Pro's clock and CU count. I didn't mean it to reflect the actual hardware; it's only a leak, and we don't have official info on how the ML hardware is designed.
 
Uncharted wasn't unplayable at 30 fps, and it didn't suffer from low sales. As mentioned before on this board, though: does the prevalence of OLED screens make 30 fps less tolerable than it was on the displays of the past?
OLEDs do change things a lot. But also, when Uncharted came out, it didn't have a 60fps option; it was 30 or nothing, so not much of a choice there. Now we do have that choice.
 
30fps being less tolerable is simply because gamers have had a consistent taste of 60fps for the last four years, and they're just used to (and like) how it 'feels'.
Of course people like 60fps versus 30fps, all else being equal. That's not a revelation. I am the same. But the thing is, I can still play 30fps games just fine. And I'd bet 99% of these people can too, if they stop whining about it on paper and just focus on actually playing and enjoying a game instead of acting on some preconceived notions. You get used to it very quickly, especially if it's a solid 30fps. It's not like 60fps is some new thing, either. Plenty of people were playing 60fps games last generation. The PlayStation 2 had tons of 60fps titles. People would regularly switch between 30fps and 60fps games. It was fine.

But with fixed-spec consoles, we're not in an 'all else being equal' situation. We're never gonna know what we're missing if we demand every developer make every game 60fps. Developers last generation would probably have made quite different games if 60fps had been some mandate. We'd have lost out on lots of great experiences. I simply don't think it's reasonable to demand that any developer water down their ambitions, or throw them away entirely, based on this lackluster notion of 30fps somehow not being acceptable anymore. I don't think most of these gamers really understand that this isn't necessarily just a case of scaling resolution or graphics a bit.

EDIT: I will admit I do forget about the OLED argument when talking about 30fps. I've done it a couple of times in this forum alone, lol. I'm still not sure what percentage of the console install base is using OLED, but I'd guess it's still a fairly small minority, though not an insignificant one. I also consider it a straight-up major flaw of the technology for gaming, and I don't think the whole world of game development should have to rearrange its plans just to cater to it. Plus, there is still a platform for those who want both ambitious games and great framerates in any game...
 
I wholeheartedly disagree with everything you write.

We're not 'whining'; we're expressing our preference for 60, just like you are with 30. And no, 99% of 'these people' are not whining about this. Yes, you get used to it if it's the only bloody option! It's still pretty shit. In 2024, 30fps should not be a thing.

OLED is not a 'fairly small minority'. In Europe alone, OLED has a 30% market share of all TVs sold. You can bet that a lot of the people buying the latest generation of consoles (and the Pro) are the same people who would have spent the extra money (not even that much, in some cases) on an OLED TV.

Not being great at 30fps gaming is not a 'major flaw' of OLED technology. It's 30fps that is a 'major flaw' in gaming technology at this point in our lives.
 
Not being great at 30fps gaming is not a 'major flaw' of OLED technology. It's 30fps that is a 'major flaw' in gaming technology at this point in our lives.
Haha, I guess I hadn't thought about it this way, but it certainly feels correct to me. If you're gonna pay $700 for a console, should you be limited to 30FPS? For all the fanboy fighting over which is better, it seems like consoles could figure out a way to squeeze 60fps out of the hardware. At the same time, modern games have a lot of graphical goodies, and the amount of CPU, GPU, memory, storage, and input hardware you can cram into a sub-$800 BOM is certainly limited.

Maybe as a middle ground, wouldn't 40FPS be a reasonable target? It would be more than 30FPS, and it sounds like it would solve a lot of pain for OLED owners. It still surprises me that OLED sets haven't introduced a software setting to "blend" the prior frame into the current frame, giving the panel a somewhat LCD-like quality where pixels take just a moment to transition to the next requested color. I know it's technically feasible; I guess the device manufacturers just had no real reason to do it...
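For what it's worth, the "blend" idea is conceptually just mixing a fraction of the previously displayed frame into the new one, which is roughly what slow LCD pixel response gives you for free. A toy sketch of that idea (purely illustrative; I'm not suggesting any TV firmware actually works this way):

```python
import numpy as np

def blended(prev_frame: np.ndarray, new_frame: np.ndarray,
            persistence: float = 0.3) -> np.ndarray:
    """Mix a fraction of the previously displayed frame into the new one,
    mimicking the slow pixel transitions of an LCD panel.
    persistence=0 is an instant switch (OLED-like); higher values smear more."""
    return persistence * prev_frame + (1.0 - persistence) * new_frame

# Example: two solid 1080p "frames" going from dark to bright
prev = np.full((1080, 1920, 3), 40.0)
new = np.full((1080, 1920, 3), 200.0)
shown = blended(prev, new)
print(shown[0, 0])  # [152. 152. 152.] -- partway between the old and new pixel values
```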
 
We should probably make a thread (a list, or sticky the first post of this one) of such games, I think. I honestly don't believe it will be all that common for 30fps PS5 games to become 60fps games on the 5 Pro.
If we do a straight calculation, let's assume the 45% improvement (as per Cerny): you're still at 23 ms per frame, still 7-ish ms away from the 16.6 ms needed for 60fps, assuming we're GPU limited here.

If you add in PSSR you're sitting at 25 ms, which gets you to the 40fps mark. If it were a little closer to 16.6 ms, I could see it being more common. But needing to chop off 9 ms is significant; it would likely require even more cuts to get to 14.6 ms per frame before adding PSSR in, especially if you also intend to improve RT and graphical quality. With that many significant reductions, one could assume the base PS5 would have been able to do 60fps with those same cuts.

If we look at 60fps games: the PS5 does it in 16.6 ms, so the 5 Pro should do it in about 11.5 ms. Now you have about a 5 ms window to add in PSSR, more RT, higher graphical settings, or even resolution.
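For anyone who wants to plug in their own numbers, the arithmetic above boils down to the sketch below. The 45% speedup is Cerny's claimed figure, and the ~2 ms PSSR cost is what the 23 ms to 25 ms step above implies; treat both as assumptions:

```python
# Frame-time arithmetic from the post above. The 45% GPU speedup is
# Cerny's claimed figure; the ~2 ms PSSR cost is implied by the
# 23 ms -> 25 ms step. Assumes a purely GPU-limited frame.

SPEEDUP = 1.45   # claimed Pro rendering speedup
PSSR_MS = 2.0    # assumed PSSR upscaling cost per frame

def pro_frame_ms(base_ms: float, use_pssr: bool = True) -> float:
    """Estimated PS5 Pro frame time for a game that takes base_ms on PS5."""
    ms = base_ms / SPEEDUP
    return ms + PSSR_MS if use_pssr else ms

for base, label in [(33.3, "30fps PS5 game"), (16.6, "60fps PS5 game")]:
    est = pro_frame_ms(base)
    print(f"{label}: {base:.1f} ms -> ~{est:.1f} ms on Pro (~{1000 / est:.0f} fps)")
# 30fps PS5 game: 33.3 ms -> ~25.0 ms on Pro (~40 fps)
# 60fps PS5 game: 16.6 ms -> ~13.4 ms on Pro (~74 fps)
```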
 
Maybe as a middle ground, wouldn't 40FPS be a reasonable target?
I'd be very happy if every game gave a 40fps option, I love it!
 