TeamGhobad (Newcomer)

Quote:
"if ps5 can't maintain 10.3 tflops at all times then the gap is more like 30%."

Are you just pulling numbers out of your butt now? Get a grip man, it's roughly 2 TF for the majority of the time, if not all.
Quote:
"if ps5 can't maintain 10.3 tflops at all times then the gap is more like 30%."

How exactly did you come to that 30% conclusion, and on what factual basis? As far as the drop goes, according to Cerny it's not much at all; in fact it'll keep the peak throughput almost all the time. Even if it drops as much as 30%, which Cerny's words argue against, it'll be so rare you'll almost never feel it in gameplay. A 30% gap is certainly not representative of the TF gap at all, like you suggested.

Quote:
"This guy has made an even better analysis than DF. Suddenly the PS5 sounds even more impressive."

DF just presented the information without opinions around it, and I feel that was probably the right thing to do. NX Gamer made some leaps that I don't think DF would have published, because they could be wrong; I respect that NX Gamer is trying to clear up confusion, but I also think in doing so he may have portrayed some misinformation by accident.

I transcribed some of his words:

"This means that it will fluctuate all the time throughout its life cycle, but it's under its own control. It doesn't need to worry about power, which means they can build a cooling and acoustic system that is designed around the consistent level of power it's always pushing through the system, the TDP. That means that the performance of the chip is always consistent."

I don't believe this is accurate, and no one has any information on this, as we debate this exact point here on B3D. If it were so trivial, I think we would have seen a PS5 box by now.

"I would seriously doubt it's going to vary much, maybe 50 megahertz here and there, up and down from that 2.2 (GHz). So a lot of the conversation will be around the fact that that's up to 10.3 TF, but actually that's not the case at all: it's always going to hit that peak theoretical limit when it needs to, and however often it needs to, the same as the Xbox One X and Series X. It's a 12.1 TF machine; it will not be processing 12.1 teraflops at all times. It just means that's the maximum it can hit when it needs to."

There are some interesting claims here if you're following before this. He begins with a video showcasing frequency fluctuations on a typical GPU, basically showcasing boost modes and how GPUs will alter frequencies to match workload. So the assumption he makes here is that PS5 isn't going to vary much (50 MHz here and there, up and down from the 2.2 GHz) and won't fluctuate like a PC GPU, because the clocks are fixed by power instead. But then he goes on to imply it's the same way on Xbox, that it won't be hitting its 12 at all times.

"The only difference here is you might have some situations where it could drop, and literally we're talking a couple of percent here, to gain back quite a significant amount of power. That's the disadvantage I've covered before: if you overclock a machine like this, or you run the clocks higher with fewer cores (?), you draw more power; it's less efficient. That's one of its impacts. Therefore, if you take back maybe 50 megahertz, you're going to gain around 10%, maybe more, on the power supply. So it's very minimal. We're talking about 10.3 vs 12.1, that's the gap; it's pretty much consistent, it's not going to cause an issue at all."

I don't think there is any information on this, and some of those come across as leaps (educated, though), but it's something that DF didn't report on because, let's be real, we don't know.

He makes some additional claims about PS5 being able to leverage more of its available system RAM (15.5 GB vs Xbox's 13.5 GB), and that its bandwidth as a total system, because it's not split, is going to be faster overall.

"faster bandwidth overall combined with a much faster SSD"

Sort of a cringe quote for me. You don't average memory pools to get a single memory-pool speed. The GPU has access to only 10GB; the CPU to both.

I respect the commentary about downplaying the TF difference by just using something like dynamic resolution. That's probably pretty close to the truth. I expect it to be largely a wash at higher resolutions.

That being said, I don't necessarily agree that how graphical power was used this generation will be duplicated in the coming generation. And that's something we need to see play out over time. It's a safe bet to soothe concern trolls, though.

DF wouldn't have done something as reckless as create the narrative that the SSD will potentially have higher ceilings and performance than Series X. While I don't know if the statement will turn out this way, the reason this video makes the PS5 sound a lot better is that he made it sound better. He did, within bounds, try to make PS5's deficits appear minimal and its strengths significantly better, while emphasizing some of the weaker parts of Series X.

And that's fine, but that is the difference between the two, which is why some people are feeling better about the PS5 now. Otherwise you're filling your head with internet narrative, which isn't the best thing either, to be honest.

Quote:
"There are some interesting claims here if you're following before this. He begins with a video showcasing frequency fluctuations on a typical GPU …"

I think he made it clear that he is talking theoretically, and we have to see in practice to conclude better. Regarding those TFs not being hit, I think he made clear the power/performance differences between how GPUs work on PC, how the PS5 functions, and why neither will hit its theoretical max. He didn't say the GPU works the same on Xbox.

Quote:
"'Almost all the time' is not good enough. These are boost clocks, but what are the base clocks? The clocks where the CPU and GPU operate without any oscillation in power/performance?"

These are the base clocks. Any deviation from 3.5GHz and 2.23GHz is, by definition, temporary and load-dependent.
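The teraflop figures being argued over are simple arithmetic (peak FP32 = CUs × 64 shader lanes × 2 ops per clock × frequency), so the effect of a small downclock can be checked directly. A quick sketch using the publicly stated CU counts and clocks:

```python
# Peak FP32 throughput in TFLOPS: CUs * 64 shader lanes * 2 ops (FMA) * clock in GHz.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

ps5 = tflops(36, 2.23)    # ~10.28 TF at PS5's peak 2.23 GHz clock
xsx = tflops(52, 1.825)   # ~12.15 TF at Series X's fixed 1.825 GHz clock

# Gap at peak clocks: PS5 is ~15% below Series X, not 30%.
gap = 1 - ps5 / xsx

# A 50 MHz downclock (the "couple of percent" scenario in the transcript)
# costs very little of PS5's own peak:
ps5_dropped = tflops(36, 2.18)
drop = 1 - ps5_dropped / ps5

print(f"PS5 {ps5:.2f} TF, XSX {xsx:.2f} TF, gap {gap:.1%}, 50 MHz drop {drop:.1%}")
```

So even a sustained 50 MHz deficit only moves PS5 from ~10.28 to ~10.05 TF, about a 2% loss; a 30% gap would require clock behaviour nobody has claimed.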
(2/3 navi tf equals a one x).
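The "you don't average memory pools" point from the transcription discussion above can also be checked with arithmetic, using the publicly stated memory configurations of both consoles:

```python
# Series X splits its 16 GB: 10 GB at 560 GB/s (GPU-optimal) and 6 GB at 336 GB/s.
# PS5 has a unified 16 GB at 448 GB/s.
# A capacity-weighted "average" for Series X looks close to PS5's figure...
xsx_avg = (10 * 560 + 6 * 336) / 16   # 476 GB/s "average"

# ...but no single access stream ever sees that number: a shader reading the
# fast pool gets up to 560 GB/s, one touching the slow pool gets 336 GB/s,
# and the PS5 GPU always sees 448 GB/s. Averaging pools hides which clients
# (CPU vs GPU) are pinned to which pool, which is the actual design question.
print(xsx_avg)
```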
Quote:
"You still need something that will ensure 128GB is fed fully by something like.... an SSD? Otherwise you will have unused ram."

I am only bringing up 128GB because that dev mentioned the SSD as the most important thing in his career for game design. Last gen we had a 16x improvement in the amount of RAM. If this gen had the same upgrade, it would be just as big a thing for game designers. So the guy is either BSing or wasn't even around last gen. I fully know that 128GB of RAM is unrealistic in current-gen consoles. But if you are going to compare how much a new piece of hardware will change game design, you should be comparing to the hypothetical increase that was apparent from previous generational jumps, which is what the previous generations actually delivered. How much did game design change last gen when they moved from 512MB of RAM to 8GB? Is that somehow smaller than the jump we are going to see this gen? I highly doubt it.
Also, you would never EVER have got 128GB of RAM in a console this generation, on cost and size alone.
So it was never a solution. Why talk about an unrealistic scenario when the SSD is indeed a realistic, practical, and cost-effective solution?
Ideally we would want everything to be better, but we live in the real world with lots of limitations.
So yes, I'd say the SSD at this point is one of the greatest things ever, because without it you'd be stuck with just 16GB of RAM.
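The "SSD instead of more RAM" argument is ultimately about refill rate: how quickly can you replace what is resident in memory? A back-of-the-envelope sketch; the 5.5 GB/s figure is PS5's stated raw SSD throughput, while the ~100 MB/s HDD and usable-RAM numbers are assumed ballparks for illustration:

```python
# Time to completely refill the usable RAM pool from storage
# (raw sequential speeds; real streaming is messier).
def refill_seconds(ram_gb: float, storage_gb_per_s: float) -> float:
    return ram_gb / storage_gb_per_s

last_gen = refill_seconds(5.0, 0.1)   # ~5 GB usable RAM over a ~100 MB/s HDD
ps5      = refill_seconds(13.0, 5.5)  # ~13 GB usable RAM over a 5.5 GB/s raw SSD

print(f"HDD refill: {last_gen:.0f} s, PS5 SSD refill: {ps5:.1f} s")
```

On those assumptions the HDD needs most of a minute while the SSD needs a couple of seconds, which is why a fast SSD can make a fixed 16GB pool behave like a much larger one for streaming workloads.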
Quote:
"How much did game design change last gen when they moved from 512MB of RAM to 8GB? Is that somehow smaller than the jump we are going to see this gen? I highly doubt that."

Game design? Actually, not that much, considering the jump! And that's because that RAM still had to be fed by a slow-ass mechanical hard drive, which is exactly what we and others have been saying.
Quote:
"It's worth listening to the method used to control the frequency; it's for unforeseen usage patterns they cannot predict or simulate. They have to set clocks for worst cases which are not known; historically they went conservative. It's wasteful, because if it could be much higher most of the time and they need to make it permanently lower because of badly coded main menus, the problem is obvious. They can't do thermal throttling or current-sensing throttling because it's not deterministic. This implementation is much more interesting because it's 100% deterministic."

Thermal throttling is likely there, as with any desktop processor, even those with "guaranteed" fixed clocks, but only for overheat protection. As long as the cooling system is sufficiently provisioned for the target TDP, thermal throttling would not kick in. The declared DVFS ranges are practically guaranteed, and the behaviour is as deterministic as defined by the power management algorithm.
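The distinction being drawn here (thermal throttling reacts to a temperature sensor, which varies per unit and per room, while PS5-style control reacts to a deterministic activity/power model) can be sketched as a toy governor. The power model and every constant below are invented purely for illustration, not Sony's actual algorithm:

```python
# Toy model-based governor: clock is chosen from the *workload* alone, so
# every console running the same workload picks the same clock, regardless
# of ambient temperature or silicon quality. All constants are made up.

POWER_BUDGET_W = 200.0

def modeled_power(activity: float, clock_ghz: float) -> float:
    # Crude model: static power, plus dynamic power rising with activity
    # and roughly clock^3 (voltage scales with frequency), plus a linear term.
    return 40.0 + 60.0 * activity * clock_ghz ** 3 / 2.23 ** 3 + 50.0 * clock_ghz / 2.23

def deterministic_clock(activity: float) -> float:
    # Highest clock (10 MHz steps, capped at 2.23 GHz) whose modeled power
    # fits the fixed budget. Same input -> same output, always.
    clock = 2.23
    while clock > 0.5 and modeled_power(activity, clock) > POWER_BUDGET_W:
        clock = round(clock - 0.01, 2)
    return clock

print(deterministic_clock(0.5))  # light load: stays at the 2.23 cap
print(deterministic_clock(3.0))  # pathological "power virus": backs off below 2.23
```

The point of the design is the repeatability: a thermal throttler on a hot day behaves differently from one in an air-conditioned room, whereas this function's output depends only on its input.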
Quote:
"I am only bringing up 128GB because that dev mentioned the SSD as the most important thing in his career for game design. …"

All generations felt starved for memory, even with a 16x improvement. We had to read games either from a slow-ass CD, DVD, or BR, or from an HDD. Memory was so low that even that 16x improvement was low. Memory was low in itself, plus it had to fill itself with wasteful information.

Quote:
"Game design? Actually, not that much, considering the jump! And that's because that RAM still had to be fed by a slow-ass mechanical hard drive, which is exactly what we and others have been saying."

Lol, go play a PS3-era open-world game vs a PS4-era one and see how much game design changed. Or, for a bigger difference, a PS2-era open-world game vs a PS3-era one. It's as if you guys don't remember how different games used to be and are just hyping yourselves up because you can.

In about a year, come back and try a PS4 game vs a PS5 game. My expectation is that more streamlined fast travel and next to no initial load time are about the only game-design improvements. The other streaming-asset advantages are barely applicable to gameplay.
1. Don’t “lol” me.
2. I remember perfectly well how open world games on PS3 were compared to PS4. They looked much worse, of course, and the worlds were smaller and less detailed.
But the “game design” is exactly the same: engines trying to hide the slow streaming from the HDD in a variety of ways.
Nothing changed in terms of “game design” from one gen to the next.