PS5 Pro *spawn

It was in the FTC leaks: the Series X GPU was supposed to have a few more CUs, but they failed to secure the silicon, and Phil apologized for it. I will provide it here for context once I find it.

That makes no sense to me. The GPU as it is already has too many WGP per shader engine. Any more would have bottlenecked the GPU further.

I would like to see the exact quote.
 
Pure spec-ulation, but I wonder if the "2180MHz/16.75TF" reporting may be another case of people reporting the low end of the Continuous Boost scheme, just like folks were doing before the base PS5 was released, when we weren't aware of the implementation.

For example, the vast majority of the time it could run at something like 2430MHz/18.64TF and then deterministically drop under heavy load/utilisation/power-draw, as far as 2180MHz/16.73TF in a worst-case scenario.


Seems odd to run 53MHz shy of the PS5's 2233MHz otherwise. If it is 2180MHz in Pro mode, I'd expect that in compatibility mode it'd shut down 24 CUs and boost back up to 2233MHz (or higher for a boost mode).
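For what it's worth, here's the arithmetic behind those TF figures, assuming the rumoured 60 CU GPU and the standard FP32 formula (CUs × 64 lanes × 2 ops/clock × clock); the CU count is from the leak, the rest is just the usual calculation:

```python
# TF sanity check, assuming the rumoured 60 CU PS5 Pro GPU.
# FP32 TFLOPS = CUs * 64 shader lanes * 2 ops/clock (FMA) * clock (MHz) / 1e6
def tflops(cus: int, mhz: float) -> float:
    return cus * 64 * 2 * mhz / 1e6

for mhz in (2180, 2430):
    print(f"{mhz} MHz -> {tflops(60, mhz):.2f} TF")
# 2180 MHz -> 16.74 TF (in line with the ~16.7TF reporting)
# 2430 MHz -> 18.66 TF (close to the 18.64TF figure above)
```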
 
That makes no sense to me. The GPU as it is already has too many WGP per shader engine. Any more would have bottlenecked the GPU further.

I would like to see the exact quote.
Spencer admits that Microsoft had a “yield miss for Scarlett,” the codename for Xbox Series X / S consoles, but that it and the delay to Halo Infinite weren’t the “main factors in our console scarcity” during the initial launch period. Microsoft had cut the compute units on its Xbox Series X silicon down to 52 from 56 to improve production yield, but the company still missed a target for hardware production of the console. Microsoft also had a strategic bet on cloud that it appears to have been holding back chips for.
This is from this Verge article
I remember reading more about it in the actual leaks; unfortunately, I deleted the emails and only kept the PDFs with the roadmap. But yes, Sony has had a much better roadmap than MSFT this gen. They have made better decisions, and I have confidence the PS5 Pro will meet consumers' expectations, especially in time for GTA 6 and Death Stranding 2.
 
This is from this Verge article
I remember reading more about it in the actual leaks; unfortunately, I deleted the emails and only kept the PDFs with the roadmap. But yes, Sony has had a much better roadmap than MSFT this gen. They have made better decisions, and I have confidence the PS5 Pro will meet consumers' expectations, especially in time for GTA 6 and Death Stranding 2.
That's a yield miss. If anything, that suggests they should have lowered the clocks on the X or disabled even more CUs (down to 48).

Yeah, there was nothing here that suggested a more powerful Series X was contemplated.
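To illustrate why binning down from 56 to 52 CUs moves the needle on yield at all, here's a toy Poisson/binomial defect model; the per-CU defect rate is an invented number purely for illustration, not anything from the leaks:

```python
import math

# Toy model: defects hit CUs independently (Poisson), and a die is sellable
# if at least `enabled` of its `total` CUs are defect-free.
# defects_per_cu = 0.02 is an invented illustrative figure.
def sellable(total: int, enabled: int, defects_per_cu: float) -> float:
    p = math.exp(-defects_per_cu)  # P(a given CU has no defect)
    # P(at least `enabled` good CUs), via the binomial distribution
    return sum(
        math.comb(total, k) * p**k * (1 - p)**(total - k)
        for k in range(enabled, total + 1)
    )

for enabled in (56, 52, 48):
    print(f"{enabled}/56 enabled -> {sellable(56, enabled, 0.02):.1%} of dice sellable")
# Requiring all 56 CUs perfect is far harsher than tolerating 4 (or 8) dead ones.
```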
 
So going by this, can we assume that the PS5 Pro will have an NPU-like hardware block? It says "fully custom" on the slide, and "Accessed only by libraries" suggests they want to keep it completely isolated from coders.


BTW "PSSR" seems to have little to nothing to do with the "Spectral Upscaling" patent posted a few pages back. This is definitely talking about using similar inputs to DLSS and FSR.

"PlayStation Spectral Super Resolution" seems to be kind of new marketing name for something that had been called "PSML MFSR" in the SDK documentation.

It was in the FTC leaks: the Series X GPU was supposed to have a few more CUs, but they failed to secure the silicon, and Phil apologized for it. I will provide it here for context once I find it.

I think this is the document you're referring to...?


Phil definitely took responsibility for the below-expected yields (which hurt Series sales), and also talked about diverting silicon to building up their vast cloud infrastructure (which hurt Xbox sales even further).* Attributing the redundant CUs (there to improve yields) specifically to this system's capacity issue, rather than treating it as a normal decision made for all home consoles (e.g. Series S, PS5, PS5 Pro), seems to be conjecture on the part of websites though, afaics.

IIRC, a couple of years back an MS engineer (maybe at Hot Chips or something like it) did talk about a couple of configurations being possible to hit 12TF, including one using all the CUs. Disabling a minimal number of units is normal though.

IMO it would be very odd for the machine with the largest chip of the generation to *not* have a disabled DCU per shader engine. Even Sony, with a smaller chip and much higher fab capacity, went with this, as did MS with the Series S, which as the smallest chip should have the best yield.

The thing that would have helped MS most is moar clocks for both Series S and X. Using boosting and also saying "fuck it!" with regards to power efficiency has really paid off for Sony. They got the best out of their hardware relative to the competition early on, when it mattered most.

*Hilariously, in the same email, you can see that Phil talks about how "Amazon Luna and Google Stadia do not have the console strength we have giving us developer engagement, gaming community and catalog of content."

Well, thanks to the way you've handled it, you don't have the same console strength now either. [LOL / FFS - delete as appropriate]
 
Considering that MS needs to respond to the PS5 Pro in some way, a tuned Series X/S is obvious. Judging by how quiet the Series X is and how slowly its fan rotates, in my opinion there is headroom for a significant clock-speed increase across the entire APU. I think a 15% increase is achievable, so we can already talk about 14 Tflops in the case of the Series X. If they make use of the INT4/INT8 processing that is in the Xboxes, this could be an elegant answer.
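On the clock maths (the 15% here is a guess rather than anything confirmed; the baseline is the Series X's stock 52 CUs at 1825 MHz):

```python
# Series X baseline: 52 CUs * 64 lanes * 2 ops/clock * 1825 MHz
base_tf = 52 * 64 * 2 * 1825 / 1e6   # ~12.15 TF
boosted_tf = base_tf * 1.15          # hypothetical 15% clock bump (~2099 MHz)
print(f"{base_tf:.2f} TF -> {boosted_tf:.2f} TF")  # 12.15 TF -> 13.97 TF, i.e. ~14 TF
```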
 
Considering that MS needs to respond to the PS5 Pro in some way, a tuned Series X/S is obvious. Judging by how quiet the Series X is and how slowly its fan rotates, in my opinion there is headroom for a significant clock-speed increase across the entire APU. I think a 15% increase is achievable, so we can already talk about 14 Tflops in the case of the Series X. If they make use of the INT4/INT8 processing that is in the Xboxes, this could be an elegant answer.
What you're saying in terms of boosting the clocks is plausible. I just think presenting the Series X as an alternative to the PS5 Pro is going to be a tough sell; it would be better placed against the base PS5. Consumers actually look at the comparative output, and despite the better hardware on paper, the Series X hasn't shown a reason why it's better than the PS5 in real-world games. If the PS5 Pro releases and its HW-accelerated AI upscaling doesn't perform well, that would be a marketing opportunity for MSFT with the Series X. But with the specs leak and Sony's track record, I think the PS5 Pro is going to knock it out of the park and make this positioning quite difficult.
 
So who is the PS5 Pro aimed at?
The PS4 Pro I can understand: 4K TVs were arriving mid-cycle, so a Pro version of the console that could take advantage of this new display format made sense. But the PS5 Pro?

There is already a subset of gamers who find the consistency of reflections in puddles, or the quality of shadows, very important and are prepared to spend a lot of money to achieve this. They're already gaming on high-end PCs. Are they going to be interested in this? Probably not; it's a downgrade.

So that leaves existing console owners. Is an advertising pitch of "spend £600 and upgrade to a machine that plays exactly the same game, but now, with better reflections" going to attract a market that probably doesn't find that very important in the first place and could probably find something more useful to spend £600 on?
 
The Tflops war is almost over. The TOPS war has begun! :yep2:
I'm really curious to see how PSSR stacks up. There's that image of a Ratchet and Clank: Rift Apart comparison from MLID.. and while I'm not going to get into the validity of it... it does show a decent improvement over the game's TAAU mode and FSR2.

It actually prompted me to fire up Ratchet again on PC to test them out; since I believe FSR2 got an improvement in a later patch, I was curious to see if it was appreciably better at all.. and, well.. whatever the case may be, FSR2 in Ratchet is still complete trash. Actually even worse than I remember lol.

I made a quick vid very clearly demonstrating its issue. I don't really even need to mention this since it's so obvious.. but focus on the power lines swinging. First I show TAA, then IGTI, then XeSS, DLSS, and finally FSR2. Res was set to my monitor's native resolution of 3840x1600, and all were tested in performance mode.. so essentially 50% res on each axis.. meaning a 1920x800 input res.. and then finally at the very end I show that even Quality FSR2 is worse than any other reconstruction technique at performance.


My main reason for posting this is that Sony's solution will undoubtedly provide higher quality than the terrible FSR2 which is plaguing console image quality atm. So even if that's all it did, it will be a massive improvement for IQ in upcoming games.
Oh yes. I am used to comparing IQ using compressed sources, and I was really impressed by PSSR compared to both TAAU and FSR2. And the pic used was actually one of the best-case scenarios for FSR2. But the best comparisons will be against DLSS2.

So going by this, can we assume that the PS5 Pro will have an NPU-like hardware block? It says "fully custom" on the slide, and "Accessed only by libraries" suggests they want to keep it completely isolated from coders.



BTW "PSSR" seems to have little to nothing to do with the "Spectral Upscaling" patent posted a few pages back. This is definitely talking about using similar inputs to DLSS and FSR.

"PlayStation Spectral Super Resolution" seems to be kind of new marketing name for something that had been called "PSML MFSR" in the SDK documentation.
PSSR is another name for PSML MFSR (PlayStation Machine Learning Multi-Frame Super Resolution **cough). They needed a short name to market it (against DLSS), obviously.

Considering that MS needs to respond to the PS5 Pro in some way, a tuned Series X/S is obvious. Judging by how quiet the Series X is and how slowly its fan rotates, in my opinion there is headroom for a significant clock-speed increase across the entire APU. I think a 15% increase is achievable, so we can already talk about 14 Tflops in the case of the Series X. If they make use of the INT4/INT8 processing that is in the Xboxes, this could be an elegant answer.
They'll react the way they reacted with the X1X, 17 months after the first serious Pro leak: they'll come up with bigger specs... likely one year later. Not sure how they'll react to PSSR though, as this is software, not hardware.
 
That wouldn't be a refresh; that would be a near-Pro-style console.

If we're keeping the "refresh" to 52CUs and going to RDNA4 + Zen4/5 + increased clocks, you might have your Series X "refresh" outperform the PS5 Pro.
They'd probably trade blows in much the same way as the PS5 and Series X do now.

MS could even use fully-enabled 60CU SoCs for a hypothetical Series XX and use any imperfect SoCs with >=52 good CUs for base Series X instances in the cloud.

Measured like this rumoured PS5 Pro (i.e. counting dual-issue), and assuming a clockspeed of 2.5GHz just coz, that'd put it at 38.4TF. Not bad ¯\_(ツ)_/¯

I suppose it just comes down to the question of how expensive it is to incorporate new RDNA-generation features into a 5nm refresh?
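Working that 38.4TF out explicitly, with the assumptions above (60 CUs, 2.5GHz) and counting dual-issue FP32 the way the rumoured PS5 Pro figure does:

```python
# Hypothetical 60 CU "Series XX" at 2.5 GHz.
cus, mhz = 60, 2500
single_issue = cus * 64 * 2 * mhz / 1e6  # 19.2 TF, classic counting
dual_issue = single_issue * 2            # 38.4 TF, RDNA3-style dual-issue counting
print(f"{single_issue:.1f} TF single-issue / {dual_issue:.1f} TF dual-issue")
```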
 
I'm really curious to see how PSSR stacks up. There's that image of a Ratchet and Clank: Rift Apart comparison from MLID.. and while I'm not going to get into the validity of it... it does show a decent improvement over the game's TAAU mode and FSR2.

It actually prompted me to fire up Ratchet again on PC to test them out; since I believe FSR2 got an improvement in a later patch, I was curious to see if it was appreciably better at all.. and, well.. whatever the case may be, FSR2 in Ratchet is still complete trash. Actually even worse than I remember lol.

I made a quick vid very clearly demonstrating its issue. I don't really even need to mention this since it's so obvious.. but focus on the power lines swinging. First I show TAA, then IGTI, then XeSS, DLSS, and finally FSR2. Res was set to my monitor's native resolution of 3840x1600, and all were tested in performance mode.. so essentially 50% res on each axis.. meaning a 1920x800 input res.. and then finally at the very end I show that even Quality FSR2 is worse than any other reconstruction technique at performance.


My main reason for posting this is that Sony's solution will undoubtedly provide higher quality than the terrible FSR2 which is plaguing console image quality atm. So even if that's all it did, it will be a massive improvement for IQ in upcoming games.
IGTI is the best option outside of DLSS IMO. It's crazy to me that people claim FSR2 is superior. Controversial take, but FSR2 is worse than FSR1.
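For anyone wanting to reproduce the test in the quoted post, the input resolutions work out as below; the 50% per-axis Performance scale is stated in the post, and the ~67% Quality scale is the usual DLSS/FSR2 convention, assumed here for comparison:

```python
# Input resolution per upscaler quality mode for a 3840x1600 output.
native_w, native_h = 3840, 1600
for mode, scale in (("Performance", 0.5), ("Quality", 1 / 1.5)):
    w, h = round(native_w * scale), round(native_h * scale)
    frac = (w * h) / (native_w * native_h)
    print(f"{mode}: {w}x{h} input ({frac:.0%} of native pixels)")
# Performance: 1920x800 input (25% of native pixels)
# Quality: 2560x1067 input (44% of native pixels)
```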
 
I don't think Microsoft should respond to Pro with another machine.

They've been absolutely wrecked by Sony this generation in terms of hardware sales and from that point of view the generation is already over.

So why waste money on a 3rd SKU when it's not really going to allow them to recover some ground?

The best attack would be to price-cut the Series X as much as possible (which might actually cost less than the R&D for a new machine) to get as many machines installed as possible, and look to start the next generation a year earlier than Sony.
 
According to a known dev at Era, 300 8-bit TOPS is equivalent to a 3090 Ti and about 20% lower than a 4080. He thinks that's well enough to properly reconstruct to 4K similarly to those GPUs using DLSS.
 
IGTI is the best option outside of DLSS IMO. It's crazy to me that people claim FSR2 is superior. Controversial take, but FSR2 is worse than FSR1.
Because the very low rendering resolution used in these tests, and in game performance modes in general, is responsible for the poor image quality, not FSR or any other upscaler per se!

Again, there is good evidence that FSR2 gives very good image quality in practice in Starfield on Series X. It uses 3.7 megapixels to create something that closely resembles an 8.3-megapixel image, all with very little waste of resources! Obviously, the quality suffers if you only have, say, 1.5 megapixels to work with... Furthermore, it's not a switch you press and the image quality is just there. It all depends on a correct implementation.

After what I've seen (a well-implemented FSR2), I'm so convinced of this that I'd question the need for PSSR at all if it didn't involve unique frame generation. With frame generation, you can already have an advantage in new hardware.

However, it would be foolish to think that MS and AMD won't have an answer for this...
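The megapixel figures above do check out, for what it's worth, assuming the commonly reported 1440p internal / 4K output for Starfield on Series X:

```python
# 1440p -> 4K reconstruction, as described above.
internal = 2560 * 1440  # ~3.7 MP rendered
output = 3840 * 2160    # ~8.3 MP presented
print(f"{internal / 1e6:.1f} MP -> {output / 1e6:.1f} MP "
      f"({output / internal:.2f}x the pixels)")  # 3.7 MP -> 8.3 MP (2.25x)
```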
 
According to a known dev at Era, 300 8-bit TOPS is equivalent to a 3090 Ti and about 20% lower than a 4080. He thinks that's well enough to properly reconstruct to 4K similarly to those GPUs using DLSS.
He is wrong; the RTX 3090 is weaker, at 285 TOPS 😁 All the INT8 TOPS numbers for the 4xxx series are doubled, and I have no idea how Nvidia calculates them; could be marketing. Edit: my mistake, the Ti version is 320 TOPS.
 
He is wrong; the RTX 3090 is weaker, at 285 TOPS 😁 All the INT8 TOPS numbers for the 4xxx series are doubled, and I have no idea how Nvidia calculates them; could be marketing. Edit: my mistake, the Ti version is 320 TOPS.
We have no idea how AMD are calculating their numbers either.
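Putting the figures from the last few posts side by side (these are the thread's numbers; whether each vendor quotes dense or sparse throughput is exactly the ambiguity being discussed, so treat the comparison loosely):

```python
# INT8 TOPS figures as cited in this thread.
tops = {"PS5 Pro (rumoured)": 300, "RTX 3090": 285, "RTX 3090 Ti": 320}
baseline = tops["PS5 Pro (rumoured)"]
for name, value in tops.items():
    print(f"{name}: {value} TOPS ({value / baseline - 1:+.0%} vs the rumoured 300)")
# PS5 Pro (rumoured): 300 TOPS (+0% vs the rumoured 300)
# RTX 3090: 285 TOPS (-5% vs the rumoured 300)
# RTX 3090 Ti: 320 TOPS (+7% vs the rumoured 300)
```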
 