PlayStation 5 [PS5] [Release November 12, 2020]

If PS5 can't maintain 10.3 TFLOPs at all times, then the gap is more like 30%.
How exactly did you come to that 30% conclusion, and on what factual basis? As far as the drop goes, according to Cerny it's not much at all; in fact, it'll keep the peak throughput almost all the time. Even if it drops as much as 30%, which Cerny's words argue against, it'll be so rare you'll almost never feel it in gameplay. A 30% gap is certainly not representative of the TF gap at all, like you suggested.
 
This guy has made an even better analysis than DF. Suddenly the PS5 sounds even more impressive.
DF just presented the information without opinions around it, and I feel that was probably the right thing to do.

NX Gamer made some leaps that I don't think DF would have published because they could be wrong; I respect that NX Gamer is trying to clear up confusion, but I also think in doing so he may have spread some misinformation by accident.
I transcribed some of his words:

This means that it will fluctuate all the time throughout its life cycle, but it's under its own control. It doesn't need to worry about power, which means they can build a cooling and acoustic system that is designed around the consistent level of power it's always pushing through the system, the TDP. That means that the performance of the chip is always consistent.
I don't believe this is accurate, and no one has any information on this; we debate this exact point here on B3D. If it were so trivial, I think we would have seen a PS5 box by now.
I would seriously doubt it's going to vary much, maybe 50 megahertz here and there, up and down from that 2.2 (GHz). So a lot of the conversation will be around the fact that that's up to 10.3 TF, but actually that's not the case at all; it's always going to hit that peak theoretical limit when it needs to, and how often it needs to. The same as the Xbox One X and Series X: it's a 12.1 TF machine, it will not be processing twelve point one teraflops at all times, it just means that's the maximum it can hit when it needs to.
There are some interesting claims here if you're following along. He begins with a video showcasing frequency fluctuations on a typical GPU, basically showcasing boost modes and how GPUs will alter frequencies to match the workload. So the assumption he makes here is that the PS5 isn't going to vary much (50 MHz here and there, up and down from the 2.2 GHz) and won't fluctuate like a PC GPU, because the clocks are fixed by power instead. But then he goes on to imply it's the same way on Xbox, that it won't be hitting its 12 TF at all times.

I don't think there is any information on this, and some of those come across as leaps (educated though), but it's something that DF didn't report on because, let's be real, we don't know.

The only difference here is you might have some situations where it could drop, and literally we're talking a couple of percent here, to gain back quite a significant amount of power. That's the disadvantage I've covered before: if you overclock a machine like this, or you run the clocks higher with fewer cores (?), you draw more power, it's less efficient; that's one of its impacts. Therefore, if you take back maybe 50 megahertz, you're going to gain around 10%, maybe more, on the power supply. So it's very minimal; we're talking about 10.3 vs 12.1, that's the gap, it's pretty much consistent, it's not going to cause an issue at all.
He makes some additional claims about the PS5 being able to leverage more of its available system RAM (15.5 GB vs Xbox's 13.5 GB), and that its bandwidth overall, as a total system, is going to be faster because it's not split.
"faster bandwidth overall combined with a much faster SSD"

Sort of a cringe quote for me. You don't average memory pools to get a single memory-pool speed. The GPU has access to only 10 GB; the CPU to both.
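For what it's worth, a quick back-of-envelope sketch of why averaging pools misleads, using the publicly stated figures (Series X: 10 GB at 560 GB/s plus 6 GB at 336 GB/s; PS5: a uniform 16 GB at 448 GB/s). The traffic splits below are invented for illustration, not measurements:

```python
# Why a single "average" bandwidth number misleads. Pool figures are the
# publicly stated ones; the traffic splits are hypothetical.

fast_bw, slow_bw = 560, 336   # Series X: 10 GB fast pool, 6 GB slow pool
ps5_bw = 448                  # PS5: one uniform 16 GB pool
print(f"PS5 uniform: {ps5_bw} GB/s")

# Naive capacity-weighted average (what an "overall bandwidth" claim implies):
naive = (10 * fast_bw + 6 * slow_bw) / 16
print(f"capacity-weighted average: {naive:.0f} GB/s")   # 476 GB/s

# What matters is where the *traffic* goes. Treating the bus as time-shared,
# effective bandwidth is the harmonic mix of the two rates:
for fast_share in (1.0, 0.9, 0.5):
    eff = 1 / (fast_share / fast_bw + (1 - fast_share) / slow_bw)
    print(f"{fast_share:.0%} of bytes from fast pool -> {eff:.0f} GB/s")
# 100% -> 560, 90% -> 525, 50% -> 420 (below PS5's uniform 448)
```

So depending on the access mix, the Series X can come out well above or below the PS5's single figure; one averaged number hides exactly the thing being argued about.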

I respect the commentary about downplaying the TF difference by just using something like dynamic resolution. That's probably pretty close to the truth; I expect it to be largely a wash at higher resolutions.
That being said, I don't necessarily agree that how graphical power was used this generation will be duplicated in the coming generation. That's something we need to see play out over time. It's a safe bet to soothe concern trolls, though.

DF wouldn't have done something as reckless as creating the narrative that the SSD will potentially have higher ceilings and performance than Series X. While I don't know whether that statement will turn out to be true, the reason this video makes the PS5 sound a lot better is that he made it sound better. Within bounds, he tried to make the PS5's deficits appear minimal and its strengths significantly better, while emphasizing some of the weaker parts of the Series X.

And that's fine, but that is the difference between the two, which is why some people are feeling better about the PS5 now. Otherwise you're filling your head with internet narrative, which isn't the best thing either, to be honest.
 
There are some interesting claims here if you're following along. He begins with a video showcasing frequency fluctuations on a typical GPU, basically showcasing boost modes and how GPUs will alter frequencies to match the workload. So the assumption he makes here is that the PS5 isn't going to vary much (50 MHz here and there, up and down from the 2.2 GHz) and won't fluctuate like a PC GPU, because the clocks are fixed by power instead. But then he goes on to imply it's the same way on Xbox, that it won't be hitting its 12 TF at all times.
I think he made it clear that he is talking theoretically, and we have to see it in practice to draw better conclusions.
Regarding those TFs not being hit, I think he made clear the power/performance differences between how GPUs work on PC and how the PS5 functions, and why none will hit their theoretical max. He didn't say the GPU works the same way on Xbox.
 
How exactly did you come to that 30% conclusion, and on what factual basis? As far as the drop goes, according to Cerny it's not much at all; in fact, it'll keep the peak throughput almost all the time. Even if it drops as much as 30%, which Cerny's words argue against, it'll be so rare you'll almost never feel it in gameplay. A 30% gap is certainly not representative of the TF gap at all, like you suggested.

"Almost all the time" is not good enough. These are boost clocks, but what are the base clocks? The clocks where the CPU and GPU operate without any oscillation in power/performance?
 
I think he made it clear that he is talking theoretically, and we have to see it in practice to draw better conclusions.
Regarding those TFs not being hit, I think he made clear the power/performance differences between how GPUs work on PC and how the PS5 functions, and why none will hit their theoretical max. He didn't say the GPU works the same way on Xbox.

I think he actually (accidentally) made the claim that the PS5 will always be hitting its max, with 50 MHz fluctuations.
And then he went on to say that the Xbox will behave more like a PC (in which it will boost as required). This is my understanding from the transcription.

I actually felt this was wrong; there are no boost clocks on Xbox. They are fixed, otherwise you'd get different performance levels based on heat. MS explicitly said they were testing in hot environments to ensure that it would work and that everyone would get the same performance regardless of where you live.
 
It's worth listening to the method used to control the frequency: it's for unforeseen usage patterns they cannot predict or simulate. They have to set clocks for worst cases which are not known; historically they went conservative. It's wasteful, because the clock could be much higher most of the time, yet they'd have to set it permanently lower because of badly coded main menus. The problem is obvious.

They can't do thermal throttling or current sensing throttling because it's not deterministic. This implementation is much more interesting because it's 100% deterministic.
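To make that distinction concrete, here's a toy sketch; the clock values and thresholds are invented for illustration and have nothing to do with Sony's or Microsoft's actual algorithms:

```python
# Toy contrast between temperature-driven boost and workload-driven DVFS.
# All numbers and names here are invented for illustration.

def pc_style_clock_mhz(die_temp_c: float) -> int:
    # Boost depends on die temperature, so the same game clocks
    # differently in a hot room than in a cold one: non-deterministic.
    return 2230 if die_temp_c < 75.0 else 2100

def ps5_style_clock_mhz(modeled_power_w: float, budget_w: float) -> int:
    # Clock depends only on power *modeled from the workload itself*,
    # so every unit, in any room, picks the same clock for the same code.
    return 2230 if modeled_power_w <= budget_w else 2180
```

The second function's inputs come from counting what the chip is doing, not from sensors that vary unit to unit, which is the whole "100% deterministic" point.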

They expect it will run at that frequency (or close to it) the majority of the time, and the potential drop would be small, because dropping 10% of the power costs only about 2% of the clock. That's the official position; it gives us a ballpark.
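As a rough sanity check on that ratio, assume the textbook dynamic-power relation P ∝ f·V² with voltage scaling roughly linearly with frequency, so P ∝ f³. That's a generic DVFS rule of thumb, not a PS5 datum:

```python
# Rough check of "drop 2% clock, save ~10% power", assuming P ~ f^3.
# This is a generic rule of thumb, not a measured PS5 curve.

f_drop = 0.02                        # 2% frequency reduction
p_saved = 1 - (1 - f_drop) ** 3      # fraction of power saved
print(f"power saved at P ~ f^3: {p_saved:.1%}")   # ~5.9%
```

Cerny's claimed ratio is steeper than pure f³, which is plausible near the top of the voltage/frequency curve, where each extra MHz costs disproportionate voltage. Either way the takeaway is the same: small clock concessions buy large power headroom.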

The teardown and cooling solution will provide more info.
 
Are you just pulling numbers out of your butt now? Get a grip, man; the gap is roughly 2 TF for the majority of the time, if not all of it.

Take it easy. There are no facts or data on what the lowest clocks are. I'm again going by the data: GitHub was tested at 2 GHz, and I think that's the comfort zone in every situation, at the least; they probably worked their way up from there, upclocking.
We are free to discuss that, because Cerny didn't want to provide us with more info on it.

2000 MHz is already a very high clock for a console.

DF just presented the information without opinions around it, and I feel that was probably the right thing to do.

NX Gamer made some leaps that I don't think DF would have published because they could be wrong; I respect that NX Gamer is trying to clear up confusion, but I also think in doing so he may have spread some misinformation by accident.
I transcribed some of his words:


I don't believe this is accurate, and no one has any information on this; we debate this exact point here on B3D. If it were so trivial, I think we would have seen a PS5 box by now.

There are some interesting claims here if you're following along. He begins with a video showcasing frequency fluctuations on a typical GPU, basically showcasing boost modes and how GPUs will alter frequencies to match the workload. So the assumption he makes here is that the PS5 isn't going to vary much (50 MHz here and there, up and down from the 2.2 GHz) and won't fluctuate like a PC GPU, because the clocks are fixed by power instead. But then he goes on to imply it's the same way on Xbox, that it won't be hitting its 12 TF at all times.

I don't think there is any information on this, and some of those come across as leaps (educated though), but it's something that DF didn't report on because, let's be real, we don't know.


He makes some additional claims about the PS5 being able to leverage more of its available system RAM (15.5 GB vs Xbox's 13.5 GB), and that its bandwidth overall, as a total system, is going to be faster because it's not split.
"faster bandwidth overall combined with a much faster SSD"

Sort of a cringe quote for me. You don't average memory pools to get a single memory-pool speed. The GPU has access to only 10 GB; the CPU to both.

I respect the commentary about downplaying the TF difference by just using something like dynamic resolution. That's probably pretty close to the truth; I expect it to be largely a wash at higher resolutions.
That being said, I don't necessarily agree that how graphical power was used this generation will be duplicated in the coming generation. That's something we need to see play out over time. It's a safe bet to soothe concern trolls, though.

DF wouldn't have done something as reckless as creating the narrative that the SSD will potentially have higher ceilings and performance than Series X. While I don't know whether that statement will turn out to be true, the reason this video makes the PS5 sound a lot better is that he made it sound better. Within bounds, he tried to make the PS5's deficits appear minimal and its strengths significantly better, while emphasizing some of the weaker parts of the Series X.

And that's fine, but that is the difference between the two, which is why some people are feeling better about the PS5 now. Otherwise you're filling your head with internet narrative, which isn't the best thing either, to be honest.

Completely agree; I'm siding with DF on this one as well. It seems like he's trying to spin and bend it until it sounds better.
Even the SSD in the XSX could be very close, or even as fast, if you theorize enough.
Also, there are many videos out there claiming the opposite; I'll stick to DF though ;)

My personal opinion is that it won't matter so much in the end; whether it's in the 9 TF range or just 10, it's a sizeable difference to me compared to the competition. He talks about resolution, but the XSX might also just do advanced upscaling, leaving headroom for other things (2/3 of a Navi TF equals a One X one). Framerates are more likely to be a CPU thing.
 
You still need something that will ensure 128 GB is fed fully, by something like... an SSD? Otherwise you will have unused RAM.
Also, you would never EVER have gotten 128 GB of RAM in a console this generation, on cost and size alone.
So it was never a solution. So why talk about an unrealistic scenario when the SSD is indeed a realistic, practical, and cost-effective solution?
Ideally we would want everything to be better, but we live in the real world with lots of limitations.
So yes, I'd say the SSD at this point is one of the greatest things ever, because without it you'd be stuck with just 16 GB of RAM.
I am only bringing up 128 GB because that dev mentioned the SSD as the most important thing in his career for game design. Last gen we had a 16x improvement in the amount of RAM; if this gen had the same upgrade, it would be just as big a thing for game designers. So the guy is either BSing or wasn't even around last gen. I fully know that 128 GB of RAM is unrealistic in current-gen consoles. But if you are going to compare the improvement of a new piece of hardware by how much it will change game design, you should be comparing it to the hypothetical increases apparent from previous generational jumps, which is what previous generations actually delivered. How much did game design change last gen when they moved from 512 MB of RAM to 8 GB? Is that somehow smaller than the jump we are going to see this gen? I highly doubt that.
 
How much did game design change last gen when they moved from 512 MB of RAM to 8 GB? Is that somehow smaller than the jump we are going to see this gen? I highly doubt that.
Game design? Actually, not that much, considering the jump! And that's because that RAM still had to be fed by a slow-ass mechanical hard drive, which is exactly what we and others have been saying.
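Some rough numbers behind that point. The SSD figure is the stated PS5 raw rate; the HDD figure is a ballpark assumption for a last-gen mechanical drive, not a spec:

```python
# Ballpark: time to refill the whole RAM pool from storage.
# SSD rate is the stated PS5 raw figure; the HDD rate is an assumption.

ram_gb = 16
ssd_gbps = 5.5         # PS5 raw read rate (8-9 GB/s typical compressed)
hdd_gbps = 0.1         # ~100 MB/s, optimistic for sustained HDD reads

print(f"SSD full refill: {ram_gb / ssd_gbps:.1f} s")   # ~2.9 s
print(f"HDD full refill: {ram_gb / hdd_gbps:.0f} s")   # ~160 s
```

That roughly 50x storage jump is why a mere 2x RAM bump can still change streaming-driven design more than another 16x of RAM fed by an HDD would.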
 
It's worth listening to the method used to control the frequency: it's for unforeseen usage patterns they cannot predict or simulate. They have to set clocks for worst cases which are not known; historically they went conservative. It's wasteful, because the clock could be much higher most of the time, yet they'd have to set it permanently lower because of badly coded main menus. The problem is obvious.

They can't do thermal throttling or current sensing throttling because it's not deterministic. This implementation is much more interesting because it's 100% deterministic.
Thermal throttling is likely there, as with any desktop processor, even those with "guaranteed" fixed clocks, but only for overheat protection. As long as the cooling system is sufficiently provisioned for the target TDP, thermal throttling will not kick in. The declared DVFS ranges are practically guaranteed, and the behaviour is as deterministic as the power-management algorithm defines it.

That has been how both AMD and Intel desktop processors behave, at least. People dissing the PS5 for using DVFS over thermal throttling are effectively speculating that the console maker deliberately skimped on cooling, when they know exactly how much heat needs to be moved away from their chip. It could happen, but then that would be an engineering screw-up. :s
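A sketch of that layering, with hypothetical numbers and names; the point is only the separation between the normal deterministic path and the never-meant-to-fire protection limit:

```python
# Hypothetical layering of DVFS and overheat protection. The trip
# temperature, budget, and cube-root scaling are illustrative only.

TRIP_TEMP_C = 105.0    # emergency limit, far above normal operation

def pick_clock_mhz(modeled_power_w: float, budget_w: float,
                   die_temp_c: float, max_mhz: int) -> int:
    if die_temp_c >= TRIP_TEMP_C:
        return max_mhz // 2              # protection, not DVFS
    if modeled_power_w <= budget_w:
        return max_mhz                   # normal deterministic path
    # Shed just enough clock to fit the budget (assuming P ~ f^3):
    return int(max_mhz * (budget_w / modeled_power_w) ** (1 / 3))
```

With a cooler sized for the TDP, the first branch never executes, so determinism holds in practice even though the protection exists.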
 
Game design? Actually, not that much, considering the jump! And that's because that RAM still had to be fed by a slow-ass mechanical hard drive, which is exactly what we and others have been saying.
Lol, go play a PS3-era open-world game vs a PS4-era one and see how much game design changed. Or, for a bigger difference, a PS2-era open-world game vs a PS3 one. It's as if you guys don't remember how different games used to be and are just hyping yourselves up because you can.

In about a year, come back and try a PS4 game vs a PS5 game. My expectation is that more streamlined fast travel and next to no initial load time are about the only game-design improvements. The other streaming-asset advantages are barely applicable to gameplay.
 
I am only bringing up 128 GB because that dev mentioned the SSD as the most important thing in his career for game design. Last gen we had a 16x improvement in the amount of RAM; if this gen had the same upgrade, it would be just as big a thing for game designers. So the guy is either BSing or wasn't even around last gen. I fully know that 128 GB of RAM is unrealistic in current-gen consoles. But if you are going to compare the improvement of a new piece of hardware by how much it will change game design, you should be comparing it to the hypothetical increases apparent from previous generational jumps, which is what previous generations actually delivered. How much did game design change last gen when they moved from 512 MB of RAM to 8 GB? Is that somehow smaller than the jump we are going to see this gen? I highly doubt that.
All generations felt starved for memory, even with a 16x improvement. We had to read games either from a slow-ass CD, DVD, or BR, or from an HDD. Memory was so low that even that 16x improvement was low: memory was low in itself, plus it had to fill itself with wasteful information.
Even with just double the memory, 16 GB is OK-ish by today's standards, because we are reaching diminishing returns in asset quality. That, coupled with super-fast data transfer, makes it a better improvement than a 16x of memory, plus it eliminates a lot of game-design headaches that have frustrated developers since forever.
 
Lol, go play a PS3-era open-world game vs a PS4-era one and see how much game design changed. Or, for a bigger difference, a PS2-era open-world game vs a PS3 one. It's as if you guys don't remember how different games used to be and are just hyping yourselves up because you can.

In about a year, come back and try a PS4 game vs a PS5 game. My expectation is that more streamlined fast travel and next to no initial load time are about the only game-design improvements. The other streaming-asset advantages are barely applicable to gameplay.

So Cerny lied? Let's wait for some exclusives; this should really release the shackles on the PlayStation devs: no more thinking about designing your game around slow-ass streaming restrictions.

Also, regarding PS4 open-world games, largely we're looking at a bit more life and better graphics; there's little in the way of true innovation. PS3 from PS2 was probably helped by the introduction of the HDD into consoles as standard... funny, that.
 
Lol, go play a PS3-era open-world game vs a PS4-era one and see how much game design changed. Or, for a bigger difference, a PS2-era open-world game vs a PS3 one. It's as if you guys don't remember how different games used to be and are just hyping yourselves up because you can.

In about a year, come back and try a PS4 game vs a PS5 game. My expectation is that more streamlined fast travel and next to no initial load time are about the only game-design improvements. The other streaming-asset advantages are barely applicable to gameplay.

1. Don't "lol" me.
2. I remember perfectly well how open-world games on PS3 compared to those on PS4. They looked much worse, of course, and the worlds were smaller and less detailed.
But the "game design" is exactly the same! Engines trying to hide the slow streaming from the HDD in a variety of ways.
Nothing changed in terms of "game design" from one gen to the next.
 
1. Don't "lol" me.
2. I remember perfectly well how open-world games on PS3 compared to those on PS4. They looked much worse, of course, and the worlds were smaller and less detailed.
But the "game design" is exactly the same! Engines trying to hide the slow streaming from the HDD in a variety of ways.
Nothing changed in terms of "game design" from one gen to the next.

lol
 
Let's imagine what a PS5 version of Ghost of Tsushima will look like.

Mod: Meme removed. Memes were tolerated in the Baseless thread; B3D is about intelligent conversation, not memes.
 