Sony PlayStation 5 Pro

The PS6 gen may actually be the last to offer big improvements to visuals.
They can build a chip that brings small improvements to rasterization while concentrating on ray tracing and ML.
Teraflop numbers may not even be a marketing point. After that, it's going to be like smartphones, small incremental improvements.

And there is nothing they can do to convince the Fortnite and live service crowd to move on, unless they stop supporting the PS4.

But thinking that Sony will stop making PlayStation consoles is misguided. It's like thinking that Apple will stop making new iPhones because sales have plateaued. Like the iPhone for Apple, it's Sony's flagship product; they'll keep trying to make it better and sell more for as long as they exist.
For the PS6 they'll barely talk about TFLOPS. They'll talk mainly about TOPS and ray (path) tracing tech.
So there's no dedicated AI block in this at all? Is it like XeSS on non-Intel hardware?

Has there been any info on exactly what the RT improvements actually are? I've seen people saying it will perform like a 4070 in RT, but that doesn't help much either: do they mean light RT loads, where AMD does OK, or what a 4070 can do with heavy RT loads, like Cyberpunk 2077 path tracing with features such as shader execution reordering?

Come on now, quoting that guy here? He's a known Xbox troll. Cerny confirmed during his presentation that it's custom hardware for machine learning, and an AMD guy said the same thing. This is the "PS5 has no RT hardware" narrative all over again from the usual suspects. Very disappointing to see some of you quoting that salty MS friend.
 
For the PS6 they'll barely talk about TFLOPS. They'll talk mainly about TOPS and ray (path) tracing tech.


Come on now, quoting that guy here? He's a known Xbox troll. Cerny confirmed during his presentation that it's custom hardware for machine learning, and an AMD guy said the same thing. This is the "PS5 has no RT hardware" narrative all over again from the usual suspects. Very disappointing to see some of you quoting that salty MS friend.
Sorry, I don't do Twitter or the leak scene or whatever you call it; snc posted it and I just took it at face value.
 
So there's no dedicated AI block in this at all? Is it like XeSS on non-Intel hardware?

Has there been any info on exactly what the RT improvements actually are? I've seen people saying it will perform like a 4070 in RT, but that doesn't help much either: do they mean light RT loads, where AMD does OK, or what a 4070 can do with heavy RT loads, like Cyberpunk 2077 path tracing with features such as shader execution reordering?

It's 2-3x what PS5 can do, that's the benchmark.

PS5 in heavy loads gets beaten by an RTX 2060.

So 2-3x RTX 2060 in RT performance on average.

Best case it's 2-3x RTX 2070.

So still a good way off Nvidia imo.

I'm expecting 30fps RT modes on PS5 to be 60fps on Pro, and a 30fps RT mode on Pro with more RT.
 
I'm surprised by the lack of a CPU upgrade. Simply put, Zen 2 isn't very good. It wasn't good when it came out, and it certainly isn't today.

This is critical when discussing real-world RT capabilities. Our group has spent a reasonable amount of time examining RT on PC. At lower rendered resolutions such as 1080p (pre-upscaling), you can enable/disable RT and monitor CPU usage via HWiNFO, or, if you want to go deeper, Intel VTune Profiler. You'll see the core load distribution go up, effective clocks per core hit their peak, and power draw shoot up. We're talking CPU utilization that's almost double when comparing raster to heavy RT in titles such as Cyberpunk 2077, Spider-Man, etc.

With RT enabled, even at 3440x1440 with DLSS Quality and maxed-out settings in a game like Cyberpunk, the lows are 20-30% better heavily tuned versus stock. And that's on a stock configuration that's many times faster than a Zen 2 CPU: a 14700K with DDR4-3600.

If there's a magic bullet for working around CPU limitations in heavy RT workloads, it'll be the biggest breakthrough in RT since it was introduced with Turing. I'm betting there isn't.
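To make the measurement concrete, here's a minimal sketch of the raster-vs-RT CPU comparison described above. It only illustrates the arithmetic: sample per-core (busy, total) time counters at the start and end of a capture window (the way Linux exposes jiffies in /proc/stat) and compute utilization over the interval. HWiNFO and VTune do this, and far more, on Windows; the function and the sample numbers below are hypothetical.

```python
# Utilization over an interval from two (busy, total) counter samples,
# modelled on the cumulative per-core jiffy counters in /proc/stat.

def cpu_utilization(prev, curr):
    """prev/curr are (busy, total) cumulative samples for one core.
    Returns the percent of the interval the core spent busy."""
    busy = curr[0] - prev[0]
    total = curr[1] - prev[1]
    if total <= 0:
        return 0.0
    return 100.0 * busy / total

# Hypothetical captures of one core over equal windows:
raster = cpu_utilization((1000, 2000), (1400, 3000))  # 400 busy / 1000 total
rt     = cpu_utilization((1000, 2000), (1780, 3000))  # 780 busy / 1000 total
print(f"raster: {raster:.0f}%  RT: {rt:.0f}%  ratio: {rt / raster:.2f}x")
```

With numbers like these the RT capture shows roughly double the core load, which is the kind of delta the post is describing.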
 
No. The PS5 GPU is based on RDNA 2, and wave64 gives no advantage in the number of operations because it executes over two cycles and each CU has a single SIMD32 path.
The PS5 Pro GPU is based on RDNA 3 CUs (very likely, not official), where wave64 executes in one cycle. Wave64 is preferred on RDNA 3 because of the new CU structure.
In practice, however, AMD talks about a ~9% performance improvement in real workloads; theoretical numbers are very hard to reach.
So the 30 TF PS5 Pro GPU is a number we will never see in reality.
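For anyone wondering where the "30+ TF" paper figure comes from, here's the arithmetic. RDNA 3 CUs can dual-issue FP32, which doubles the theoretical number without doubling real throughput. The CU count (60) and clock (~2.17 GHz) below are reported figures for the Pro, not official specs, and are only there to illustrate the counting.

```python
# Paper-TFLOPS arithmetic: lanes per CU x FLOPs per lane (FMA = 2)
# x issue rate x clock. Dual-issue doubles the theoretical figure.

def tflops(cus, clock_ghz, dual_issue=False):
    lanes = 64          # two SIMD32 units per CU
    flops_per_lane = 2  # an FMA counts as a multiply plus an add
    issue = 2 if dual_issue else 1
    return cus * lanes * flops_per_lane * issue * clock_ghz / 1000.0

base = tflops(60, 2.17)                   # classic counting: ~16.7 TF
dual = tflops(60, 2.17, dual_issue=True)  # dual-issue paper number: ~33.3 TF
print(f"{base:.1f} TF classic vs {dual:.1f} TF with dual-issue")
```

So the "30+ TF" figure is just the classic number doubled on paper; with AMD's own ~9% real-world gain from dual-issue, effective throughput sits far closer to the ~16.7 TF counting.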
Thank you for the very good explanation. So this means the PS5 Pro can't be counted as a 30 TF console?

I think Sony should've cut the price of the base model to $400 and made the Pro $600. That would've been a lot better.

About FF7R: I completed the game at 30 fps. Amazing graphics, a top-tier game overall.
 
This is critical when discussing RT capabilities in the real world.

The PC port of Spider-Man pretty much proved that console CPUs punch above their weight in RT performance, as it required a much faster CPU on PC to keep up with the PS5 in Spider-Man's performance RT mode.

I had a Ryzen 3600 at the time and it couldn't lock to 60 fps and keep up with the PS5 with RT enabled; my RTX 3060 Ti wasn't the issue, as it was well underutilised.
 

At least, among all the other negative news, PSSR looks great. Not sure if I was interpreting it correctly, but a lot of the footage looked better than native with PSSR (maybe a higher native resolution thanks to the bigger GPU? But even TLOU2 looked better, and that's native 4K on PS5). Also, Spider-Man 2 has RT shadows now?
 
From gaf
[comparison GIFs]

TLOU2 in fidelity mode on PS5 runs at 4K, so it looks like PSSR is really good, as it's sharper than native (though I'm quite sure the internal resolution for PSSR in this example is 1440p, and we'll probably more often see internal 1080p).
 
I think that's the point ;D Performance mode on the PS5 Pro should look like resolution mode on the PS5.

Most of the ugliness was in the low-res textures, randomly plopped-together assets, and poor GI lighting anyway. If they didn't fix that, then I'd hardly call it a night-and-day difference just from a resolution boost in performance mode.
 
Most of the ugliness was in the low-res textures, randomly plopped-together assets, and poor GI lighting anyway. If they didn't fix that, then I'd hardly call it a night-and-day difference just from a resolution boost in performance mode.
I wouldn't expect such changes, especially in titles already released. And performance mode vs quality mode in Rebirth was absolutely night and day; that's why I finished it at 30 fps.
 
TLOU on PC didn't have great IQ with TAA, to be honest, so it wouldn't surprise me if TLOU2 also has poor IQ, which is why PSSR looks good.

Insomniac's own upscaler is very, very good, so that will be a more interesting comparison with PSSR.
 
From gaf

[comparison GIF]


TLOU2 in fidelity mode on PS5 runs at 4K, so it looks like PSSR is really good, as it's sharper than native (though I'm quite sure the internal resolution for PSSR in this example is 1440p, and we'll probably more often see internal 1080p).
The 3D grass looking more detailed and not oversharpened is impressive vs native 4K. I'm even more impressed that everything stays clean in motion. Compare that to the pixel soup seen in FSR2 games...
 
The PC port of Spider-Man pretty much proved that console CPUs punch above their weight in RT performance, as it required a much faster CPU on PC to keep up with the PS5 in Spider-Man's performance RT mode.

I had a Ryzen 3600 at the time and it couldn't lock to 60 fps and keep up with the PS5 with RT enabled; my RTX 3060 Ti wasn't the issue, as it was well underutilised.

The reason there, though, was less the console CPU punching above its weight and more it simply having less work to do: on PC the CPU also handles real-time BVH updates, which can be streamed from disk on a console, as well as texture decompression, which runs on the console's hardware decoder.
 