PlayStation 5 [PS5] [Release: November 12, 2020]

All this time talking about 3D audio, yeah... sounds like an "oh shit, we have to talk about something MS didn't do"... I was getting curious, until the end: "well, that's the goal, and right now we have this: *headset required*". Yeah, so it's like any other 3D audio solution, but with a nicer presentation... No mention of VRS, btw?

I'll still buy a PS5 for the exclusive games, but it seems, power-wise, MS has the advantage, and I really like the fixed clocks, btw.
Cerny said it won't be a problem with PS5, but I doubt it...
 
Likely the difference between Sony's boost and PC boost is that Sony's boost works independently of temperature - it depends only on power draw. At least that's how I understood what Cerny said. The same boost is always achievable on all consoles, but if some game hammers the CPU really hard, the GPU has less power available and hence lower clocks. This means every console boosts the same way, independent of ambient temperature/chip quality.
 
From DF ...

Introducing boost for PlayStation 5
It's really important to clarify the PlayStation 5's use of variable frequencies. It's called 'boost' but it should not be compared with similarly named technologies found in smartphones, or even PC components like CPUs and GPUs. There, peak performance is tied directly to thermal headroom, so in higher temperature environments, gaming frame-rates can be lower - sometimes a lot lower. This is entirely at odds with expectations from a console, where we expect all machines to deliver the exact same performance. To be abundantly clear from the outset, PlayStation 5 is not boosting clocks in this way. According to Sony, all PS5 consoles process the same workloads with the same performance level in any environment, no matter what the ambient temperature may be.

So how does boost work in this case? Put simply, the PlayStation 5 is given a set power budget tied to the thermal limits of the cooling assembly. "It's a completely different paradigm," says Cerny. "Rather than running at constant frequency and letting the power vary based on the workload, we run at essentially constant power and let the frequency vary based on the workload."

An internal monitor analyses workloads on both CPU and GPU and adjusts frequencies to match. While it's true that every piece of silicon has slightly different temperature and power characteristics, the monitor bases its determinations on the behaviour of what Cerny calls a 'model SoC' (system on chip) - a standard reference point for every PlayStation 5 that will be produced.
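The "constant power, variable frequency" idea can be sketched as a tiny governor loop. Everything below - the power budget, the coefficients, the trimming policy - is an illustrative guess at the concept, not Sony's actual algorithm; the point is only that the power model is evaluated against a fixed reference ("model SoC"), so every console computes identical clocks regardless of temperature or silicon quality.

```python
# Hedged sketch of the "constant power, variable frequency" paradigm.
# Budget, coefficients, and trimming policy are made-up illustrations.

POWER_BUDGET_W = 200.0   # hypothetical fixed SoC power budget

def modelled_power(cpu_activity, gpu_activity, cpu_ghz, gpu_ghz):
    """Toy power model. Crucially, it is evaluated against a fixed
    reference ("model SoC"), so every console computes the same answer
    regardless of its own silicon quality or ambient temperature.
    Dynamic power scales very roughly with f^3 (f * V^2, with V ~ f)."""
    return 1.4 * cpu_activity * cpu_ghz**3 + 14.4 * gpu_activity * gpu_ghz**3

def pick_frequencies(cpu_activity, gpu_activity):
    """Start at the advertised caps and downclock the hungrier domain
    until the modelled power fits inside the fixed budget."""
    cpu_ghz, gpu_ghz = 3.5, 2.23   # PS5's stated frequency caps
    while modelled_power(cpu_activity, gpu_activity, cpu_ghz, gpu_ghz) > POWER_BUDGET_W:
        cpu_term = 1.4 * cpu_activity * cpu_ghz**3
        gpu_term = 14.4 * gpu_activity * gpu_ghz**3
        if cpu_term > gpu_term:     # trim the bigger consumer (made-up policy)
            cpu_ghz -= 0.01
        else:
            gpu_ghz -= 0.01
    return round(cpu_ghz, 2), round(gpu_ghz, 2)

# Light load: both caps hold. Heavy load: the GPU sheds a little clock.
print(pick_frequencies(0.5, 0.5))   # (3.5, 2.23)
print(pick_frequencies(1.0, 1.0))   # GPU dips slightly below 2.23
```

The real mechanism presumably runs continuously in firmware with far more inputs; the sketch just shows clocks as a deterministic function of workload, never of temperature.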

I'm not buying it. They are just saying "it is not the same thing because...."
 
The wonderful thing about having no allegiances is that you can switch from one console to the other.

I'm definitely going Xbox for the coming generation. PS5 sounds disappointing. I was expecting some BC for PS1, PS2, and PS3 (optimistic, I know); looks like we get none of that. What's the point in adopting a hardware solution to BC when a software one can cover more hardware types?
Again, too early to say. This is a hardware view, and PS4 BC is in hardware. Earlier PS BC would be performed by emulation, which is a software-and-services discussion for a later date. Seriously, no-one should be choosing a console based on these incomplete reveals - not least because, for all you know, XBSX costs $1000! Just wait and see and enjoy the ride!
 
No, they tried the top 100 games and most worked. That could mean many other titles work; they just hadn't been validated.
I knew Sony's GNM would bite them for compatibility. This is the problem you create when you have little-to-no abstraction layer in a complex system.
 
Likely the difference between Sony's boost and PC boost is that Sony's boost works independently of temperature - it depends only on power draw. At least that's how I understood what Cerny said. The same boost is always achievable on all consoles, but if some game hammers the CPU really hard, the GPU has less power available and hence lower clocks. This means every console boosts the same way, independent of ambient temperature/chip quality.
Ok thank you that makes sense.
 
No. This was their GDC presentation. Not a consumer presentation. No case. No games.
With how everyone is looking at this and waiting, they should have adapted at least a little bit of it for the gamer audience...

I mean, dumping this much technical detail before even showing the console box is kind of over most people's heads.
 
What was the news about BC? I missed it.
AMD designed the GPU to be BC. It's hardware BC with PS4. Kinda lame. Other platforms weren't talked about.

And what pisses me off?! He mentions that the SPUs in Cell would be ideal for audio processing but they modified a CU instead for Tempest. Bastards! Why didn't Cerny include Cell?! Does he not even read my posts here?!?!
 
The difference in power is negligible. It might even be that PS5, with its faster I/O, has slightly better geometry/textures/less pop-up in games. And who knows how much audio processing adds to the experience, especially in VR. At least for me, the optimal path is: get a PS5 for Sony content, and in 2-3 years' time upgrade the PC to enjoy Game Pass/multiplats in a config consoles can't touch.

Also, PS5 gets 448GB/s BW all the time. Series X gets 560, but only to 10GB. Then it gets 336 to other 6GB.

This calculation is probably dumb, but: 448×16 = 7168; 560×10 + 336×6 = 7616. That's only 6.25% more weighted BW - or relatively less per flop.

Would there be anything to the claim it kind of evens out and PS5 has greater relative average bandwidth?
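Redoing that arithmetic as a quick script. The capacity-weighted totals (GB/s × GB) are a crude heuristic, not a real measure of effective bandwidth - access patterns decide what you actually get - and the TF figures are the ones quoted in this thread:

```python
# Redoing the thread's bandwidth arithmetic with a crude
# capacity-weighted heuristic (GB/s x GB per pool).

ps5_total = 448 * 16              # one 448 GB/s pool across all 16 GB
xsx_total = 560 * 10 + 336 * 6    # 10 GB fast pool + 6 GB slow pool

advantage = xsx_total / ps5_total - 1
print(ps5_total, xsx_total, f"{advantage:.2%}")   # 7168 7616 6.25%

# Per-teraflop, using the capacity-weighted average for Series X and
# the TF figures quoted in this thread (10.28 and 12.1):
xsx_avg = xsx_total / 16          # 476 GB/s weighted average
print(round(448 / 10.28, 1), round(xsx_avg / 12.1, 1))   # 43.6 vs 39.3 GB/s per TF
```

Under this (admittedly crude) weighting, PS5 does come out ahead per flop, which is one way to read the "evens out" claim.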

Just looking for positives for PS5. Overall it certainly seems in a MASSIVELY better position than One was vs PS4. You can point to:

- Same fast memory architecture, where One had to deal with a deficient memory setup built around ESRAM.
- 2x SSD throughput.
- The bandwidth considerations listed above.
- Even if you disregard the boost clock and assume a real figure of 9.2, a significantly closer FLOP spec (One: 28% fewer flops, 1.3 vs 1.8; PS5: 24% fewer, 9.2 vs 12.1). Boost clocks, for whatever they're worth, can narrow it to a 15% deficit (10.28 vs 12.1).
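For what it's worth, the deficit percentages in that list check out; a quick sanity check using the TF figures as quoted:

```python
# Quick sanity check of the deficit percentages above, using the TF
# figures as quoted in the post.
def deficit(lower, higher):
    """Fractional shortfall of `lower` relative to `higher`."""
    return 1 - lower / higher

print(f"{deficit(1.3, 1.8):.0%}")     # 28% - One vs PS4
print(f"{deficit(9.2, 12.1):.0%}")    # 24% - PS5 non-boost vs XSX
print(f"{deficit(10.28, 12.1):.0%}")  # 15% - PS5 boost vs XSX
```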



But not with CPU at 3.5 GHz at exactly the same time, most likely.

I'd easily dial back the CPU to like 3.2 and pump the GPU then.
 
Some sort of odd verbiage relating to 'about 100 of the top games available' or some such.
Not quite - they tested the top 100 PS4 games (by playtime) and found most worked - or most will work by launch. I'd love to know more about what is problematic and why.
 
So, in summary:
- That I/O design looks to be very innovative. I wonder how efficient PS5 is compared to XBSX.
- Having compatibility with off-the-shelf NVMe drives will help competition and keep prices down.
- The 2.23 GHz for the GPU seems quite high. If this was a last-minute decision, I wonder whether it will become an issue in the long term.
- 3D audio seems a more refined version of what the XBX will offer.
- No Box.
- Not a single demo.

Sony has lost the raw power battle; they know it, and they decided to use GDC to reveal the specs: only tech talk, no games, no demos, nothing.

From now on, I don't think you will see Sony talking much about specs - only about real performance, and showcasing as many games as they can.
 
Some sort of odd verbiage relating to 'about 100 of the top games available' or some such.
They tested the top 100 most-played games and most worked. As a sample, one would assume that if 90/100 of the top titles work, ~90% of the entire library would work. There's no reason why the most-played games would be inherently easier to run in BC mode (unless they're mobile games :p).
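That sampling argument can be made slightly more precise. A standard normal-approximation confidence interval turns the point estimate into a range; this sketch takes 90/100 as the hypothetical figure and assumes the top 100 behave like a random sample of the library, which is itself an assumption:

```python
# Confidence interval for the library-wide compatibility rate, given a
# hypothetical 90-of-100 sample. Normal approximation; assumes the
# top-100 sample is representative of the whole library.
import math

def proportion_ci(successes, n, z=1.96):
    """95% CI for a proportion via the normal approximation."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return p - margin, p + margin

lo, hi = proportion_ci(90, 100)
print(f"{lo:.0%} to {hi:.0%}")   # 84% to 96%
```

So even granting the sampling assumption, "90 of 100" supports a library-wide rate anywhere from the mid-80s to the mid-90s percent.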
 
- Evidently it's faster than Series X? Cool, but will it matter? Will anyone notice? Will it be one-second vs two-second load times? Will DF's analysis show noticeable load-time differences? Will it be a huge edge in system throughput? Yet to be determined.

I think a difference between no perceptible load time and a fast load time is noticeable.

10.3 vs 12 won't even result in a resolution drop; maybe some settings higher on Xbox, and that's it?
Not sure how much benefit the SSD would bring for PS5 in graphics, though.

The clockspeed difference means certain things like geometry rate and fill rate could be a good bit faster on PS5, too.
 
I think it's unfortunate these companies burden future designs with backwards compat.

I'm curious about that IO performance. Their storage is pretty much like a RAM pool now.
 
AMD designed the GPU to be BC. It's hardware BC with PS4. Kinda lame. Other platforms weren't talked about.

And what pisses me off?! He mentions that the SPUs in Cell would be ideal for audio processing but they modified a CU instead for Tempest. Bastards! Why didn't Cerny include Cell?! Does he not even read my posts here?!?!
So it's not time to...

...Cell-ebrate?
 
Just finished watching
1. Cerny's voice sounds so clear and his explanation is surprisingly easy to understand despite the topic

2. He openly admits their mistake with PS4 cooling, and they really fixed it with PS5

3. PS5 basically runs in boost mode almost all the time and the cooling solution is already designed to properly handle that

4. There will be a teardown video (they also did for PS4)

5. Storage is expandable with a USB HDD and an internal slot for an extra M.2 SSD, but not all SSDs will be compatible. No info on whether the OS will be able to automatically juggle data from HDD to SSD, but it's confirmed that you'll be able to manually move games to the SSD.

6. Real 3D audio for everyone: headphones, TV speakers, sound bars, surround speakers. The quality will still be a work in progress even after launch. Hopefully it'll still support Dolby/AC3 encoding over optical...

7. The audience are cardboard cutouts?
 