PlayStation 5 [PS5] [Release November 12, 2020]

Any statement made to the public is PR.
Thanks brit.

That last sentence, I really needed that.
It just clicked in my head. PR = public relations.

All these years I didn't make that connection. Yikes.

Edit: this post is not sarcasm. Why do I keep making replies that sound like sarcasm? And it's not just on b3d.
 
Jason Schreier's words are worth as much as those of that user who claims to be an industry developer. Don't put too much faith in Jason; he has disappointed many people with his claims pre-PS5-reveal. I kid you not: some people know what's going on, but they won't make a show of it.

This is ridiculous. Jason is one of the most respected journalists in the industry and obviously has sources. The other is a random internet commenter that claims to be a developer with no substantiation. There’s no reason to hold their comments in the same esteem.

If people are disappointed in PS5 based on his comments, they lack reading comprehension.
 
I do have to say - Cerny’s presentation does not look like PR had anything to do with it.

Or it had to be something like: ‘we have no idea what you’re saying, but we like your enthusiasm and we’ll run with it unedited’ ...
 
Let's please keep attacks of any kind out of the discussion.
 
Can we at least argue about the semantics that were actually spoken by Cerny?
The backwards compatibility controversy proves the wisdom of this approach.
 
I do have to say - Cerny’s presentation does not look like PR had anything to do with it.

Or it had to be something like: ‘we have no idea what you’re saying, but we like your enthusiasm and we’ll run with it unedited’ ...
It was prefaced by... this is a GDC presentation which we're putting on the internet because GDC was canceled.

It was identical to all previous GDC presentations from Cerny. But we're now missing the juicy series of interviews we usually get at GDC, which tend to be more PR-ish and more targeted at the public. :(
 
OK, this might be crazy talk, but personally, until I see PS5 holding that GPU clock consistently, I'm going to treat it as PR war-tactic spiel to make the difference from the competition seem smaller than it really is. 9 vs 12 is a problem; 10 vs 12, not so much psychologically. Just like when PS3 was first announced and "supposed" to launch in spring 2006, cutting the X360's advantage in half... RSX was supposed to operate at 550 MHz but was backtracked to 500 MHz, and I don't believe PS3 was ever coming out before the fall of 2006.

It will be easy to backtrack from this position later on; what matters is that they got that 10+ teraflops out there in people's minds. If it really holds that clock, great!

Not saying MS hasn't done its share of similar antics or muddying the waters.
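For anyone who wants the arithmetic behind that 9/10/12 framing, here's a quick back-of-the-envelope in Python from the announced specs (36 CUs for PS5 at up to 2.23 GHz, 52 CUs for XSX at a fixed 1.825 GHz, with 64 FP32 lanes per CU and 2 ops per lane per clock for an FMA):

```python
# Peak FP32 throughput from the announced specs.
# TFLOPS = CUs * 64 lanes * 2 ops (FMA) * clock in GHz / 1000

def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"PS5 @ 2.23 GHz:  {tflops(36, 2.23):.2f} TF")   # ~10.28
print(f"PS5 @ 2.00 GHz:  {tflops(36, 2.00):.2f} TF")   # ~9.22
print(f"XSX @ 1.825 GHz: {tflops(52, 1.825):.2f} TF")  # ~12.15
```

So the gap between holding 2.23 GHz and only holding 2 GHz is exactly the difference between the "10 vs 12" and "9 vs 12" comparisons.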
 
It was prefaced by... this is a GDC presentation which we're putting on the internet because GDC was canceled.

It was identical to all previous GDC presentations from Cerny. But we're now missing the juicy series of interviews we usually get at GDC, which tend to be more PR-ish and more targeted at the public. :(
DF promised a follow-up article. Can’t come soon enough.
 
I’d just like to say, regarding the doubts about the clocks maintaining 2.23 GHz: there was a lot of doubt over the SSD back in the day, but they have delivered.
 
OK, this might be crazy talk, but personally, until I see PS5 holding that GPU clock consistently, I'm going to treat it as PR war-tactic spiel to make the difference from the competition seem smaller than it really is. 9 vs 12 is a problem; 10 vs 12, not so much psychologically. Just like when PS3 was first announced and "supposed" to launch in spring 2006, cutting the X360's advantage in half... RSX was supposed to operate at 550 MHz but was backtracked to 500 MHz, and I don't believe PS3 was ever coming out before the fall of 2006.

It will be easy to backtrack from this position later on; what matters is that they got that 10+ teraflops out there in people's minds. If it really holds that clock, great!

Not saying MS hasn't done its share of similar antics or muddying the waters.
There's real complexity here in predicting how close they will be. Fixating on an arbitrary round number for a single spec is something only technical forum posters do.

The real test will be games performance: if watching clips of games back-to-back indicates an easily perceptible difference, XSX has the edge. If you have to pause and squint, or use image analysis tools, they're close enough.

Same for the SSD advantage: if loading time is easily noticeable, or we see games with noticeably more texture detail, PS5 has the edge. But if we need a stopwatch to figure out which one loads faster, and we need to pause and squint to see the additional texture detail, they're close enough.
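If anyone actually wants to run the "image analysis tools" version of that test, a minimal sketch (assuming two same-resolution captures saved as frame_a.png and frame_b.png, with Pillow and NumPy installed; the filenames are just placeholders):

```python
# Per-pixel comparison of two captured frames via PSNR.
# Very roughly, above ~40 dB is "pause and squint" territory.
import numpy as np
from PIL import Image

a = np.asarray(Image.open("frame_a.png").convert("RGB"), dtype=np.float64)
b = np.asarray(Image.open("frame_b.png").convert("RGB"), dtype=np.float64)

mse = np.mean((a - b) ** 2)
psnr = float("inf") if mse == 0 else 10 * np.log10(255**2 / mse)
print(f"MSE: {mse:.2f}  PSNR: {psnr:.2f} dB")
```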
 
I’d just like to say, regarding the doubts about the clocks maintaining 2.23 GHz: there was a lot of doubt over the SSD back in the day, but they have delivered.
It takes a while for people to process things. The variable clock rate was dropped on us; it's not something they've been communicating over time, unlike their SSD solution.

They can totally maintain the 2.23 GHz clock on the GPU side of things, but it's going to come at the cost of penalized CPU clocks, or at a higher cost for the console in better parts across the board, from the PSU and VRMs to silicon yield for handling higher voltages, if they want both to hold their max capped rates at the same time.

Due to the way they set up the chip to be 'deterministic', there's no magic here; something must give.
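To put rough numbers on "something must give": dynamic power in CMOS scales roughly with f·V², and near the top of the frequency curve voltage has to climb with the clock. A toy model where V is assumed to scale linearly with f (so P ∝ f³; purely illustrative, not Sony's actual voltage curve):

```python
# Toy dynamic-power model: P ∝ f * V^2, with V assumed proportional
# to f near the top of the curve, giving P ∝ f^3. Illustrative only.

def relative_power(clock_ratio: float) -> float:
    return clock_ratio ** 3

# Pushing the GPU from 2.0 GHz to 2.23 GHz is +11.5% clock...
print(f"{relative_power(2.23 / 2.0):.2f}x power")  # ...but ~1.39x power
```

Under that assumption, the last ~11% of clock costs roughly 39% more GPU power, which is why holding both caps simultaneously is the expensive case.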
 
I guess we’ll find out how open Mark Cerny is being when we learn what “a few percent” ends up meaning.
Hoping we get more details in general about PS5's GPU features. Very little was said about RT and things like VRS (or their implementation of VRS), etc.
 
Hoping we get more details in general about PS5's GPU features. Very little was said about RT and things like VRS (or their implementation of VRS), etc.
It should have all the same hardware features as XSX, unless they are custom. But, like they said, RDNA 2.0 is DX12 Ultimate certified, and XSX is DX12 Ultimate certified.

By and large, excluding the application layer of DX12 items, it should contain the same stuff.

Unless this is the bespoke RDNA 1.9 which I have been mocking for some time now, which could be everything but those minor DX12 Ultimate customizations needed to be fully DX12 Ultimate certified.
 
It should have all the same hardware features as XSX, unless they are custom. But, like they said, RDNA 2.0 is DX12 Ultimate certified, and XSX is DX12 Ultimate certified.

By and large, excluding the application layer of DX12 items, it should contain the same stuff.

Unless this is the bespoke RDNA 1.9 which I have been mocking for some time now, which could be everything but those minor DX12 Ultimate customizations needed to be fully DX12 Ultimate certified.
I agree it will most likely be the same, but I just found it strange that Sony didn't talk about RT more in their presentation. For most people, including myself, that's one of the most exciting features of the next-gen systems.
 
I agree it will most likely be the same, but I just found it strange that Sony didn't talk about RT more in their presentation. For most people, including myself, that's one of the most exciting features of the next-gen systems.
They needed to focus on what was different and unique that Sony was bringing to the field.
 
OK, this might be crazy talk, but personally, until I see PS5 holding that GPU clock consistently, I'm going to treat it as PR war-tactic spiel to make the difference from the competition seem smaller than it really is. 9 vs 12 is a problem; 10 vs 12, not so much psychologically. Just like when PS3 was first announced and "supposed" to launch in spring 2006, cutting the X360's advantage in half... RSX was supposed to operate at 550 MHz but was backtracked to 500 MHz, and I don't believe PS3 was ever coming out before the fall of 2006.

It will be easy to backtrack from this position later on; what matters is that they got that 10+ teraflops out there in people's minds. If it really holds that clock, great!

Not saying MS hasn't done its share of similar antics or muddying the waters.
We struggled with the idea of 2 GHz; now we're looking at a little more than 10% higher than that, so there's good reason to have questions.
 
It takes a while for people to process things. The variable clock rate was dropped on us; it's not something they've been communicating over time, unlike their SSD solution.

They can totally maintain the 2.23 GHz clock on the GPU side of things, but it's going to come at the cost of penalized CPU clocks, or at a higher cost for the console in better parts across the board, from the PSU and VRMs to silicon yield for handling higher voltages, if they want both to hold their max capped rates at the same time.

Due to the way they set up the chip to be 'deterministic', there's no magic here; something must give.
Given how much more CPU is available to developers this time, it seems to me that this had more to do with just the overall heat generation.

If the CPU was using 60% of its resources but the GPU was at 100%, a 10% increase in GPU clock might prevent some latency, while the proportional reduction in CPU speed should simply result in the CPU using more (but not all) of the available resources at the lower clock rate.

So something like ND's analyzer tool, which looks at overall utilization, would allow developers to seesaw the clock speeds as needed. My concern is future compatibility with PS6, and potentially the time spent playing around with optimization.
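As a hypothetical sketch of that seesaw (the total budget, the demand numbers, and the split rule here are all invented for illustration; Cerny gave no such figures):

```python
# Hypothetical fixed-power-budget seesaw between CPU and GPU.
# All numbers are invented for illustration.

TOTAL_BUDGET_W = 200.0  # made-up SoC power budget

def split_budget(cpu_demand: float, gpu_demand: float) -> tuple[float, float]:
    """Split the fixed budget proportionally to demand (0..1 each)."""
    total = cpu_demand + gpu_demand
    cpu_w = TOTAL_BUDGET_W * cpu_demand / total
    return cpu_w, TOTAL_BUDGET_W - cpu_w

# The example above: CPU at 60% demand, GPU fully saturated.
cpu_w, gpu_w = split_budget(0.6, 1.0)
print(f"CPU: {cpu_w:.0f} W, GPU: {gpu_w:.0f} W")  # CPU: 75 W, GPU: 125 W
```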
 
Given how much more CPU is available to developers this time, it seems to me that this had more to do with just the overall heat generation.

If the CPU was using 60% of its resources but the GPU was at 100%, a 10% increase in GPU clock might prevent some latency, while the proportional reduction in CPU speed should simply result in the CPU using more (but not all) of the available resources at the lower clock rate.

So something like ND's analyzer tool, which looks at overall utilization, would allow developers to seesaw the clock speeds as needed. My concern is future compatibility with PS6, and potentially the time spent playing around with optimization.
Perhaps; you made some good points earlier, and you could definitely be right about that.

But to move the GPU clock up 10% by taking power from the CPU, which uses significantly less, I expect to see a large swing.
Removing 40% of the CPU's power to move the GPU up by 10%, for instance (illustrative numbers).

If you remove 40% of the power from the CPU, what frequency are we left with?
I don't believe this is a hypothetical scenario that may 'rarely' show up. Everyone loves Sony 1P for its graphics and how immersive their games are. That should by default push the GPU to full saturation, as they always have done. So what's left for the CPU in those situations? Very curious, because this was the number that wasn't talked about. Cerny only talked about best-case scenarios. He did talk about the CPU maybe pulling power from the GPU (and said it wouldn't be much), but he gave us no hints the other way around.

CPU matters!
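For what it's worth, under the same toy P ∝ f³ assumption as earlier in the thread (voltage scaling with clock; purely illustrative, not Sony's real curve), those illustrative numbers have a concrete answer: frequency scales with the cube root of power, so removing 40% of the CPU's power leaves about 84% of its clock.

```python
# If P ∝ f^3, then f ∝ P^(1/3). Toy model, illustrative only.
power_left = 0.60                      # after removing 40% of CPU power
clock_ratio = power_left ** (1 / 3)
print(f"{clock_ratio:.3f}")            # ~0.843
print(f"{3.5 * clock_ratio:.2f} GHz")  # a 3.5 GHz CPU would land near 2.95 GHz
```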
 