PlayStation 5 [PS5] [Release: November 12, 2020]

Sony come across as a bit shell shocked, and their repeated mantra of SSD and 3D Audio is telling to me.
Personally, I find those improvements more exciting than yet more graphics, and a better hook to get me to buy. Every generation we get better visuals; we take that for granted. I think you no longer need to tell the public "it's going to look better than last gen". Basically, Sony could just say "PS5!", leave their marketing at that, and everyone would expect another better-looking PlayStation. If you're trying to make a difference elsewhere, on top of the predictable 'better graphics', it's important to really push that message to show it's not just an afterthought.
 
Exactly. Graphics today are already great. Not that I wouldn't like even better graphics, but what bothers me more now, with such detailed graphics, are all the streaming / pop-in issues we still see, which are even more pronounced. When playing CoD Warzone, for example, sometimes when you land on the ground textures take a few seconds to load, so you land in an N64 game for a few seconds.
 
I think it's pretty obvious Sony got caught out with the XSX being so powerful this next gen. Nobody would design a system from scratch to have variable frequency. It's not a preferred thing, and is evidence of a reactionary step.
The PS4 and PS4 Pro were quite ordinary from a specs point of view. At the time the PS4 released, a 1.8 TFLOP GPU was pretty average, and the Jaguar cores were weak. Even when the PS4 Pro came out, it was quite average spec-wise. It was only because the Xbox stuffed up with Kinect that the PS4 looked good.
So a 36 CU GPU at 9.2 TFLOPs for the PS5 seems in line with the PS4 approach to me. However, MS went balls out to make sure they got the power crown again.
Sony got wind of the XSX power levels and worked out they could not go into next gen with such a power difference, especially since it was single digit vs double digit (always looks worse).
It may appear that way, but it might have been designed this way from the outset. I don't think anyone quite expected the gains we've seen from GCN to RDNA1 and now RDNA2. It's possible Cerny knew the PS5 being only ~3TF more than the X (obviously on paper only) would look bad, and thought this would be a good way to eke as much as possible out of the available hardware. This (in my mind) is a bit like overclocking in the PC realm, where you can get (figures from my ass) a £50 CPU to run as fast as a £100 CPU. So maybe it was always the plan, and maybe this will be the way forward for console design... remember the complaints against the Cell, and now multi-core CPUs are the norm.

Because they were using one of the excess CUs for their Tempest Engine, it meant they couldn't enable the extra four CUs on the die to take it up to 40 CUs, so the only option was to push the clocks as far as they could.
Sony come across as a bit shell shocked, and their repeated mantra of SSD and 3D Audio is telling to me.
This makes no sense, you're essentially saying that Sony gave up 4 CUs for the Tempest engine...it would have been far better to go up to 40CUs and rethink the Audio options...just because it's based on a CU doesn't mean it's cost them the option to have 40CUs.

However, this shortfall in power has a good upside for people like me who love tech: it is impressive to see a console clocked so high that nobody thought it possible. I am also really interested to see what cooling solution they have come up with. To cool a smaller APU with such high clocks is going to take some smarts. I don't doubt Sony's ability here.
So while I have no doubt the XSX is a far superior machine, in almost every way, I am actually far more interested to learn more about the PS5 than XSX now.
lol, what is this!? Concern trolling? 'It's crap but wow it's running so fast so that's good - shame the other console is far superior'.

I would be pretty sure developers would rather not have to deal with variable clock speeds. If one had to choose, I would assume they would rather know up front what they were working with, and they would rather avoid having to divert power from the CPU to the GPU etc.
I'm sure devs would rather not have a split RAM pool.

The fact that they do have variable clocks shows that it was an add on.

We all know that power usage isn't linear with clock speed. The PS5 GPU at 2.23GHz is going to be anything but efficient. It will affect yields in production, and will require a more expensive cooling system as well.
Yes, because Cerny told us so. What is your point?

We also know that initially they were going with a 2.0GHz native clock speed.
Old data on old tech.

I'm not sure there will be a refresh this time.
Last gen there was the jump to 4K TVs, and when they did the upgrade they were able to increase GPU power by 2.25x in the PS4 Pro and 4.25x in the One X. In a few years' time they won't be able to release a new console with those kinds of increases.
Refresh for better RT with 4K60 native - why not?

I don't think they have sorted everything out just yet. They are probably still finalizing designs etc. I mean, who shows their controller off before the console?
They hadn't shown PS4 by now either. Who shows random pics of various parts of a console rather than the whole thing? It's how they are revealing their product...no need for the 'concern'.

I totally agree that there will be little perceivable difference between the two consoles side by side.
So one is "far superior" but with hardly noticeable differences side by side? OK.

The developer will matter more than the hardware will this gen.
Just thought I'd quote the only bit I agree with, how are those Sony devs...any good?

My point isn't that 12.15 TFLOPs will be a big difference from 10.28 TFLOPs. My point is that initially Sony aimed lower than MS did with their consoles. Sony was originally running with a 9.2 TFLOP GPU and then made the decision to try and ramp that up to a number closer to the XSX. They have been reacting to MS, and that shows.
Based on (again)?

They are certainly pushing the SSD and 3D Audio, but the fact that they have variable clocks also shows they have tried to up the power of their console more than originally planned.
The end result is a better console for PS owners than they may have originally ended up with.
Based on (again, again)?

The truth will likely come out one day, but I don't think you can just assume so much - you're essentially saying Sony sacrificed 4 CUs to have the Audio chip so decided to just ramp the shit out of the hardware instead. They may have planned this from the outset, we have the weird cooling system on the dev kit as well as the cooling patent that suggests Sony have been potentially going down this path for a while.
 
All this is raising a lot of questions for me. For instance, we know the memory bandwidth, but what I am wondering is how much of the work that is traditionally done in memory can now be replaced or bypassed by streaming data from the SSD straight into the GPU?

There are many hints that suggest the PS5’s SSD implementation can solve some important bottlenecks, and I can’t wait to get actual data from developers and see games making use of it.
 
Amen to that. My brother's PS4 Pro sounds like a jumbo jet when playing COD.

On the PS4 Pro it was a sourcing issue and a TIM (thermal paste) issue. If Sony does that again with the PS5, you'll be in a lottery to get a quiet or noisy console, like with the PS4 and PS4 Pro.

In my case my PS4 Pro got the quiet Nidec fan, but the TIM is fucking horrible. After I replaced all the thermal pads and paste, I can no longer hear my PS4 Pro even in quiet scenes.

It used to scream like a jet taking off. Now I only hear a slight whoosh if the room temperature is so hot that I'm sweating buckets.
 
Not to mention the split pool of memory might get in the way of reaching maximum throughput on the XSX, which might drag the difference down further.

The bandwidth situation seems to favor the XSX over the PS5; I dunno if it's the right call to say MS made the worse decision there. I think we had a thread on it.

The 20% TFLOP advantage to the XSX assumes the PS5 GPU is running at 2.23GHz. Besides that, DF mentioned in one of their videos that higher clocks don't gain as much as going wider. It also remains to be seen how much graphics an SSD can provide.
To say Sony made the better choice... I dunno, I don't think either made a 'better' choice. They just had different visions and power targets from the beginning. A faster CPU, a more powerful GPU (2+ TF of Navi 2 equals what, a PS4 Pro?), higher bandwidth to prevent bottlenecks like the Pro had, an SSD solution that will be close. Audio is unknown really; they are both equipped with hardware 3D audio.
 
All this is raising a lot of questions for me. For instance, we know the memory bandwidth, but what I am wondering is how much of the work that is traditionally done in memory can now be replaced or bypassed by streaming data from the SSD straight into the GPU?

There are many hints that suggest the PS5’s SSD implementation can solve some important bottlenecks, and I can’t wait to get actual data from developers and see games making use of it.


From Cerny's presentation, it would make devs' lives easier:

- no more designing "loading obscurers" like winding roads, super tall objects, zooms, slow walks, etc.
- no more designing efficient data dupes. For example, there are thousands of duplicated post boxes in Spider-Man.
- no more trade-offs over which objects get high-quality textures

I suspect devs can probably even dump a 3D-scanned room / set into the game and it'll just work, with no asset-streaming hitching. Instant realistic-looking graphics, at least for static time of day or interior levels. (On my PC, currently with 6GB VRAM, the max I could get was a 3D scan exported as 4 parts of 4096 x 4096 textures. Higher than that, crash.)
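To put toy numbers on the dupes point (all figures below are invented, just to show the shape of the trade-off): on an HDD, assets get copied next to every level chunk that uses them so they can be read contiguously without seeks; a fast SSD can fetch one shared copy on demand.

```python
# Toy illustration of HDD-era asset duplication vs SSD dedup.
# All figures are hypothetical, not real Spider-Man numbers.

asset_size_mb = 2.0       # one post box model + textures (assumed)
chunks_using_it = 1000    # level chunks that reference it (assumed)

hdd_layout_mb = asset_size_mb * chunks_using_it  # one copy per chunk
ssd_layout_mb = asset_size_mb                    # single shared copy

print(f"duplicated for HDD streaming: {hdd_layout_mb:.0f} MB")  # 2000 MB
print(f"deduped for SSD streaming:    {ssd_layout_mb:.0f} MB")  # 2 MB
```

Same asset, three orders of magnitude less disk, and the authoring work of planning the duplication goes away.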
 
If you believe Jason Schreier (yeah, I don't, but anyway), devs are very happy with the PS5 anyway, sometimes more than the XSX.

Or if you believe the reactions of many non-anonymous third-party devs on Twitter too. There have been a lot of reactions, from general praise for the console (like from some people at id Software) to tons of audio engineers on Tempest and third-party devs talking about the SSD.
 
If you believe Jason Schreier (yeah, I don't, but anyway), devs are very happy with the PS5 anyway, sometimes more than the XSX.
Why wouldn't you believe him out of interest?

All this is raising a lot of questions for me. For instance, we know the memory bandwidth, but what I am wondering is how much of the work that is traditionally done in memory can now be replaced or bypassed by streaming data from the SSD straight into the GPU?

There are many hints that suggest the PS5’s SSD implementation can solve some important bottlenecks, and I can’t wait to get actual data from developers and see games making use of it.
Both the PS5 and XSX SSDs will. People seem eager to write off the PS5 advantage, but it's almost DDR3 speed (apparently), so that's some fast asset access... I can't wait for the first DF face-offs.
 
Both the PS5 and XSX SSDs will. People seem eager to write off the PS5 advantage, but it's almost DDR3 speed (apparently), so that's some fast asset access... I can't wait for the first DF face-offs.

I think it's too early to write off either console; both are being written off depending on who you talk to. In the end it comes down to games and ecosystem. Most people won't care about a more powerful GPU/CPU or better sound.
 
I wonder if there are also hardware features in the PS5 designed especially for VR? Or maybe there will be some hardware inside PSVR2 itself.
 
Why wouldn't you believe him out of interest?


Both the PS5 and XSX SSDs will. People seem eager to write off the PS5 advantage, but it's almost DDR3 speed (apparently), so that's some fast asset access... I can't wait for the first DF face-offs.

Yes, the XSX will likely have similar features, but I very much on purpose left the XSX out of the discussion here, and I recommend everyone do the same in this thread.

Anyway, it will be very interesting to know what bandwidth (and memory) savings can be made and where, and what the differences will be for different types of games.
 
It may appear that way, but it might have been designed this way from the outset. I don't think anyone quite expected the gains we've seen from GCN to RDNA1 and now RDNA2. It's possible Cerny knew the PS5 being only ~3TF more than the X (obviously on paper only) would look bad, and thought this would be a good way to eke as much as possible out of the available hardware. This (in my mind) is a bit like overclocking in the PC realm, where you can get (figures from my ass) a £50 CPU to run as fast as a £100 CPU. So maybe it was always the plan, and maybe this will be the way forward for console design... remember the complaints against the Cell, and now multi-core CPUs are the norm.
In no way does cell act like multicore CPUs.
The 360 actually had a multicore CPU.
This makes no sense, you're essentially saying that Sony gave up 4 CUs for the Tempest engine...it would have been far better to go up to 40CUs and rethink the Audio options...just because it's based on a CU doesn't mean it's cost them the option to have 40CUs.
You do know that at a certain point in the design of a next-gen GPU (like 18 months before release) you can't just make those changes. You can choose to enable extra CUs and take the yield hit, but you can't go redesigning the chip.
lol, what is this!? Concern trolling? 'It's crap but wow it's running so fast so that's good - shame the other console is far superior'.
Way to overreach. I said I like that Sony pushed the chip further than anyone thought they could. That has nothing to do with what the XSX does. That's like me saying I like the PS5 controller, and you replying with the same post you did.
I'm sure devs would rather not have a split RAM pool.
The XSX doesn't have a split RAM pool. Might want to look back into that.
"The Xbox Series X uses unified memory, but it still splits that 16GB into two conceptual pools. 10GB of RAM runs at 560GB/s, while the remaining 6GB offer 336GB/s"
https://www.extremetech.com/gaming/307903-playstation-5-vs-xbox-series-x-which-is-better


Yes, because Cerny told us so. What is your point?
I'm not sure what the point of you quoting my post is at this point. But if all you need is "Cerny says", I guess I can work out your motivation.
Old data on old tech.
Lol. Point went well over your head.
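To spell the point out: dynamic power scales roughly with voltage squared times frequency (P ~ C·V²·f), and higher clocks generally need more voltage, so the last few hundred MHz are disproportionately expensive. A toy calculation (the voltages are invented for illustration, not Sony's figures):

```python
# Toy model of dynamic power: P ~ C * V^2 * f.
# Voltages below are assumed, purely to show the non-linearity.

def dynamic_power(freq_ghz, voltage, capacitance=1.0):
    """Relative dynamic power for a given clock and supply voltage."""
    return capacitance * voltage ** 2 * freq_ghz

base = dynamic_power(2.0, 1.00)    # hypothetical 2.0 GHz at 1.00 V
boost = dynamic_power(2.23, 1.10)  # hypothetical 2.23 GHz at 1.10 V

print(f"clock gain: {2.23 / 2.0 - 1:.1%}")    # 11.5% more clock
print(f"power cost: {boost / base - 1:.1%}")  # 34.9% more power
```

An ~11% clock bump plausibly costing ~35% more power is exactly why cooling and yields become the story at 2.23GHz.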
Refresh for better RT with 4K60 native - why not?
If you think it would be a viable option to create new consoles with a 30% increase in power, then I hope you don't run any business.
They hadn't shown PS4 by now either. Who shows random pics of various parts of a console rather than the whole thing? It's how they are revealing their product...no need for the 'concern'.
If you thought my post was made out of "concern", you are mistaken. And Sony's way of revealing their product has been such a success. Lol
So one is "far superior" but with hardly noticeable differences side by side? OK.
Where did I write "far superior"? Stop creating strawmen. You are only arguing with yourself.
Just thought I'd quote the only bit I agree with, how are those Sony devs...any good?
Sony's developers are excellent. Your point?
Based on (again)?
Go back and re-read my posts. It's in there.
Based on (again, again)?
Again, see above.
The truth will likely come out one day, but I don't think you can just assume so much - you're essentially saying Sony sacrificed 4 CUs to have the Audio chip so decided to just ramp the shit out of the hardware instead. They may have planned this from the outset, we have the weird cooling system on the dev kit as well as the cooling patent that suggests Sony have been potentially going down this path for a while.
You are free to believe what you wish.
 
In no way does cell act like multicore CPUs.
It's a multi-core processor though, and a very powerful one. The point being that Sony were trying something new; MS took their X360 CPU off the back of the Cell.

You do know that at a certain point in the design of a next-gen GPU (like 18 months before release) you can't just make those changes. You can choose to enable extra CUs and take the yield hit, but you can't go redesigning the chip.
That still doesn't explain why Sony would just 'throw away' 4 CUs for Tempest. It makes no sense whatsoever; I'd rather have the 4 extra CUs and rethink the audio solution.

XSX doesnt have a split RAM pool. Might want to look back into that.
"The Xbox Series X uses unified memory, but it still splits that 16GB into two conceptual pools. 10GB of RAM runs at 560GB/s, while the remaining 6GB offer 336GB/s"
https://www.extremetech.com/gaming/307903-playstation-5-vs-xbox-series-x-which-is-better
Except somehow the CPU needs to know to only use the slower RAM, likewise the GPU the faster - how does that work? - Genuine question

Lol. Point went well over your head.
They were targeting 2GHz at that point in time - which wasn't on RDNA2. What's to say they weren't assuming 2.2GHz for RDNA2?

If you think it would be a viable option to create new consoles with a 30% increase in power, then I hope you don't run any business.
What makes you think we will only see a 30% increase in 3 years? - Genuine question.

Where did I write "far superior"?
Here;
I have no doubt the XSX is a far superior machine, in almost every way
 
Where does this 30% number come from?

From what we have seen, I think the difference is not as big as between the PS4 and XB1, or the PS4 Pro and Xbox One X, but I think that outside of loading speed the advantage will be on the XSX side later in the gen for multiplatform games.

If there is an advantage to the PS5's SSD streaming, it will only be used for exclusive games; same probably for 3D audio.
 
That still doesn't explain why Sony would just 'throw away' 4 CUs for Tempest - it makes no sense whatsoever, rather have the 4 extra CUs and rethink the audio solution.
They did what they did. I think using redundant CUs for a purpose like an audio chip is excellent thinking. I wonder what else could be done with them.




Except somehow the CPU needs to know to only use the slower RAM, likewise the GPU the faster - how does that work?
The same way the OS knows to only use the slower pool.
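For what it's worth, the published XSX memory config explains where the two bandwidth numbers come from, and as I understand it the OS simply hands out allocations from whichever region the developer requests ("GPU-optimal" vs standard). A sketch using the public figures (ten GDDR6 chips at 14Gbps, each on a 32-bit channel; six chips are 2GB, four are 1GB):

```python
# Where the XSX's 560 GB/s and 336 GB/s figures come from, using the
# published memory configuration.

CHANNEL_GBPS = 14 * 32 / 8  # 56 GB/s per 32-bit channel at 14 Gbps

# The first GB of every chip is interleaved into one 10 GB region,
# so reads stripe across all ten channels:
fast_pool_gb = 10
fast_bw = 10 * CHANNEL_GBPS  # 560 GB/s

# The second GB exists only on the six larger chips, so that 6 GB
# region can only stripe across six channels:
slow_pool_gb = 6
slow_bw = 6 * CHANNEL_GBPS  # 336 GB/s

print(fast_pool_gb, fast_bw)  # 10 560.0
print(slow_pool_gb, slow_bw)  # 6 336.0
```

So neither the CPU nor the GPU "knows" anything special; the speed is a property of which physical address range an allocation lands in.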

They were targeting 2GHz at that point in time - which wasn't on RDNA2. What's to say they weren't assuming 2.2GHz for RDNA2?
Are you saying that Sony was originally going to use RDNA1 for their GPU?
Companies like AMD make short runs of future chips for testing well before they release them to the market. When they were testing RDNA1 chips, they didn't do it with GCN chips.

What makes you think we will only see a 30% increase in 3 years? - Genuine question.
The Xbox One X was released 4 years after the OG One, and it provided a 4.5x increase in power over the One.
The XSX will be released 3 years after the One X and will only give a 2.0x increase over the X.
If things followed the One X pattern, you would expect a mid-gen refresh to come in at 48 TFLOPs, which of course it never will. In four years they may be able to release a console of about 16-18 TFLOPs. Things are not increasing by the same percentages as before, otherwise the XSX would have been an 18 TFLOP machine.
At that point it's not worth bothering with.
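Working those multipliers through with the commonly cited headline TFLOP figures (One: 1.31, One X: 6.0, XSX: 12.15):

```python
# Generation-over-generation compute jumps, using the commonly cited
# headline TFLOP figures for each Xbox.

one, one_x, xsx = 1.31, 6.0, 12.15

print(f"One X over One: {one_x / one:.2f}x in 4 years")  # 4.58x
print(f"XSX over One X: {xsx / one_x:.2f}x in 3 years")  # ~2x

# A mid-gen refresh repeating the rough 4x jump behind the 48 TF figure:
print(f"refresh at ~4x: {xsx * 4:.1f} TF")  # 48.6 TF
```

The multipliers really are shrinking each cycle, which is the whole argument against a conventional mid-gen refresh.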
Here;

(Bolded and underlined as I'd already quoted it once)
Damn, I owe you an apology. I don't know why I said far superior, as I actually don't think it is. It's superior in a number of ways, but "far" superior? No.
 
It’s an 18% difference in theoretical compute; mitigate some of that with the PS5’s faster clock and you might end up with only a 10-15% difference.
Hmm, not sure why this keeps coming up here and in different forums. A lot of people who try to sell the 'clocks' argument haven't done the comparison in a way I think makes sense. You've got to compare the same architecture, where one part has more cores with lower clocks versus fewer cores with higher clocks. On the CPU side of things, this works out as you say. On the GPU side, it doesn't. The 2080 Ti is one of the lowest-clocked Turings, and it just smashes the heck out of every other Turing card in every benchmark, new and old, and those other cards are clocked up to 300-350MHz higher.
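For reference, both headline numbers fall out of the same formula (CUs × 64 shaders × 2 FLOPs per clock for FMA × clock speed), which is exactly why the 'narrow and fast vs wide and slow' framing matters:

```python
# The standard TFLOP formula for these GPUs:
# CUs x 64 shaders x 2 FLOPs per clock (FMA) x clock in GHz.

def tflops(cus, ghz):
    return cus * 64 * 2 * ghz / 1000

ps5 = tflops(36, 2.23)   # ~10.28 TF
xsx = tflops(52, 1.825)  # ~12.15 TF

print(f"PS5: {ps5:.2f} TF, XSX: {xsx:.2f} TF, gap: {xsx / ps5 - 1:.0%}")
```

Same formula, two different trade-offs: the XSX buys its lead with 16 extra CUs despite a ~400MHz lower clock.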

Damn, I owe you an apology. I don't know why I said far superior, as I actually don't think it is. It's superior in a number of ways, but "far" superior? No.

You're new here (and at the same time, not new), so I suggest you just slow down a bit and explore the threads a bit more. We've had a lot of tech discussion on PS5 and XSX hardware in their respective threads. A lot of what you've asked/stated has been answered repeatedly and dissected to death here. This isn't GAF or ResetEra; incorrect technical information and stretched wording is going to be spotted and called out pretty fast.
 
Hmm, not sure why this keeps coming up here and in different forums. A lot of people who try to sell the 'clocks' argument haven't done the comparison in a way I think makes sense. You've got to compare the same architecture, where one part has more cores with lower clocks versus fewer cores with higher clocks. On the CPU side of things, this works out as you say. On the GPU side, it doesn't. The 2080 Ti is one of the lowest-clocked Turings, and it just smashes the heck out of every other Turing card in every benchmark, new and old, and those other cards are clocked up to 300-350MHz higher.



You're new here, so I suggest you just slow down a bit and explore the threads a bit more. We've had a lot of tech discussion on PS5 and XSX hardware in their respective threads. A lot of what you've asked/stated has been answered repeatedly and dissected to death here. This isn't GAF or ResetEra; incorrect technical information and stretched wording is going to be spotted and called out pretty fast.
If I didn't talk about things that have already been discussed, I wouldn't have anything to talk about lol
I'm just talking about things that interest me. I ask questions about things I want to learn about, and I say what I do because I believe it to be true. If it's wrong, then I'm happy to have the correct information after it's given to me.
I don't try to start fanboy wars, but I do like to dissect tech.
But thanks for the advice.
 