Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Status
Not open for further replies.
Can you share a link from B3D with people celebrating an 8TF machine after we got 12TF confirmed? Because I remember people being pretty damn happy with 9-10TF when the 5700XT delivered Radeon VII-beating performance at 220W in 2019 (when we knew nothing about RDNA1/XSX).

There is a weird sense here that some people are playing victims for some reason.


We had those? Because the 5700XT pretty much matches the 2070S, so I'm not sure something like this happened either. Maybe you're thinking of 2060-2060S performance in RT? This could still be the case for both consoles.

Can we at the very least keep drive-by posts such as these to a minimum? Sometimes I'm not sure if we're referring to a Twitter rant when talking about B3D or to actual posts on here.

Maybe the 'PS5 struggles' reports weren't from great sources, but the fact is the noise was out there - it wasn't a blanket of love for the PS5.

Regarding expected performance, our very own iroboto suggested a 2070 - and last time I checked, a 2070 running at 2080 speeds could be considered 'punching above its weight'.

I don't have time to trawl the forum, but I know what I read - found the above within a couple of minutes... if people were expecting 2080+ performance from the PS5, it's the first I've heard of it.
 
I don't know how people couldn't see the PS5 being an equal. You take a 5700 with the RDNA 2 architectural changes, then boost it to 2.2GHz, and you have a really crazy GPU. I have only seen clocks boost that high on 5700s with power-limit soft mods, and only then does it play tag with 2080s.


At stock speeds in the latest cross-gen games it's competing with either a 2080 or a 2080 Ti depending on the title.
 
We've had 'reports' also stating the PS5 struggles, and there were plenty here expecting performance around 2060 levels based off the spec sheet. I read a comment on this forum that the PS5 in AC:V is actually performing at around a 2080 level, so...

Please provide examples where developers have actually stated such. Not corporate mouthpieces and shills; not armchair-speculating quarterbacks; and especially not biased, wishful-thinking fanboys. Actual developer claims of such issues.

2060 expectation vs 2080 real world results? I'd call that punching above the expected weight...but I guess that's just me.

When factoring in RT, both PS5/XBSX fall below an RTX 2080 in performance. If anything, both systems fall between an RTX 2060 and 2070 when factoring in RT (which DF has provided many examples of). And if people had listened more closely to actual developer mumbles (e.g., Matt, etc.) and/or sources close to them (e.g., Jason Schreier), they mentioned (directly and indirectly) that PS5/XBSX performance was close to an RTX 2080... not that they performed like an RTX 2080.

But yes, feel free to believe PS5 is punching above its expected performance...
 
PSA: I'm not going to argue about whether people being happy with the PS5 at 8TF was a thing or not.
There's a search function in the forum (hint: "8TF") and everybody can draw their own conclusions from that, so there's no need to derail the thread over this.

To think how much 'fun' could be had if we could be bothered to go back and quote all the BS fake-concern crap we've had to put up with for so long. Github was right, but here we have the XSX performing as low as the PS5 - that XSX is clearly very broken... if I were an Xbox fan or XSX owner right now I'd be burning it in protest. /s
Fun? Yes (see how we're playing victim now).
Particularly useful for this thread or generally useful for the forum? Not really.

So I suggest we just leave it at that and call it a day.



The one inside the PS5 is MT61K512M32KPA-14C:B
https://www.micron.com/support/tools-and-utilities/fbga?fbga=D9XKV#pnlFBGA
From the FBGA code you can tell its production date is the 26th week of 2020.

You can see the Speed Grade Mark "-14C" in the PDF of Micron's numbering system:
https://media-www.micron.com/-/medi...nts/products/part-numbering-guide/numdram.pdf

So what could be the difference between -14 and -14C? Lower CAS latency?
Maybe lower voltage / power consumption?
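For anyone wanting to poke at the part number themselves, here's a rough splitter for that Micron GDDR6 string. The field meanings in the comments are my reading of the numbering guide linked above - treat them as assumptions, not an official decoder:

```python
import re

def parse_micron_gddr6(part: str) -> dict:
    """Rough split of a Micron GDDR6 part number into its fields.

    Field boundaries/meanings are assumptions based on Micron's
    part-numbering guide, not an official decode.
    """
    m = re.match(
        r"^(MT61)"       # GDDR6 product family prefix
        r"([A-Z])"       # configuration letter
        r"(\d+M\d+)"     # organization, e.g. 512M32 = 512 Meg x 32
        r"([A-Z]{3})"    # package/feature codes
        r"-(\w+)"        # speed grade, e.g. 14 or 14C
        r":([A-Z])$",    # die revision
        part,
    )
    if not m:
        raise ValueError(f"unrecognized part number: {part}")
    family, config, org, pkg, speed, rev = m.groups()
    megs, width = org.split("M")
    return {
        "family": family,
        "config": config,
        "organization": org,
        "density_gbit": int(megs) * int(width) // 1024,  # 512M x 32 -> 16 Gb
        "package": pkg,
        "speed_grade": speed,
        "die_rev": rev,
    }

fields = parse_micron_gddr6("MT61K512M32KPA-14C:B")
# speed_grade comes out as "14C", density_gbit as 16 (i.e. a 16Gb device)
```

The "-14C" vs "-14" question is exactly the part this can't answer - the suffix letter is whatever bin Micron says it is in that PDF.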



You take a 5700 with RDNA 2 architectural changes then boost it to 2.2ghz then you have a really crazy GPU.
The argument was always "looking at Navi 10 clocks, there's no way the PS5 can ever reach 2.2GHz, so it's probably just 1.8GHz most of the time".
The same sources who told RGT about the unified L3 on the PS5's CPU also said "95% of the time it's at 2.23GHz; when it lowers, it's towards 2.1GHz", so depending on what we see in the PS5 SoC's X-ray we might get some validation of those clock speeds.

Of course, back in March those 2.23GHz on a 7nm RDNA2 GPU sounded completely unrealistic to many people, but now that we're seeing >2.5GHz overclocks on the much larger Navi 21 cards it doesn't sound that preposterous anymore.
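As an aside, the headline TF figures follow directly from CU count and clock, so the numbers being thrown around are easy to sanity-check. A minimal sketch, using the publicly stated CU counts/clocks and the standard RDNA rate of 128 FLOPs per CU per clock (64 ALUs × 2 for FMA):

```python
def tflops(cus: int, clock_ghz: float, flops_per_cu_per_clock: int = 128) -> float:
    # Peak FP32 throughput: CUs x clock (GHz) x FLOPs/CU/clock, in TFLOPS.
    # 128 = 64 stream processors per RDNA CU x 2 ops for fused multiply-add.
    return cus * clock_ghz * flops_per_cu_per_clock / 1000

ps5 = tflops(36, 2.23)   # ~10.28 TF at the 2.23GHz cap
xsx = tflops(52, 1.825)  # ~12.15 TF at the fixed clock
```

Which is why the variable clock matters so much to the comparison: drop the PS5 to 2.1GHz and the same arithmetic gives ~9.7 TF.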



Please provide examples where developers have actually stated such. Not corporate mouthpieces and shills; not armchair-speculating quarterbacks; and especially not biased, wishful-thinking fanboys. Actual developer claims of such issues.
I believe @goonergaz is specifically talking about the statements made by that Capcom leaks guy (dusk golem?) who kept mentioning on Twitter and ResetEra how Resident Evil 8 on the PS5 was running terribly but the Xbox version was running very well.
Not a developer for sure, but apparently he is (was?) pretty solid as far as Capcom leaks were concerned. He lost credibility once he complained about it because he felt the general discussion was being too biased towards the PS5.
 
When factoring in RT, both PS5/XBSX fall below an RTX 2080 in performance. If anything, both systems fall between an RTX 2060 and 2070 when factoring in RT (which DF has provided many examples of). And if people had listened more closely to actual developer mumbles (e.g., Matt, etc.) and/or sources close to them (e.g., Jason Schreier), they mentioned (directly and indirectly) that PS5/XBSX performance was close to an RTX 2080... not that they performed like an RTX 2080.

But yes, feel free to believe PS5 is punching above its expected performance...

I guess I must have missed all the posts you and other forum members here made saying that they expect the PS5 to perform around the 2080 level - and of course, there's just too many examples for you to link to.
 
PSA: I'm not going to argue about whether people being happy with the PS5 at 8TF was a thing or not.
There's a search function in the forum (hint: "8TF") and everybody can draw their own conclusions from that, so there's no need to derail the thread over this.
But many of you are making some rather flimsy accusations that people on this forum were happy with 8TF in light of the 12TF XSX, which is not true at all. In fact, it's a total fabrication that there is some kind of anti-Sony crusade on this forum that has been pushed for years now.

If you come here and start lamenting about a negative FUD campaign after a few 3rd-party games, bring some evidence. You are putting people in buckets without actually being precise about what you are referring to. I know you guys have been all over RGT, secret insiders and Twitter posts for years, so maybe some of you got confused as to where the "happy with 8TF while Xbox is 12" notion comes from, but it ain't B3D.

I guess I must have missed all the posts you and other forum members here made saying that they expect the PS5 to perform around the 2080 level - and of course, there's just too many examples for you to link to.
See - victim complex. Who said the consoles are performing like a 2080? There is no universal benchmark that tells us how these consoles perform versus similar PC hardware, especially not after a few cross-gen games. The 5700XT, for what it's worth, performs better than the 2070S in COD and Valhalla, so someone saying they expect similar performance to a 5700XT still wouldn't be wrong (based on the evidence we have). Setting just one option on PC to ultra vs consoles can bring FPS down 15-20% (volumetric clouds, shadows, RT), so we don't actually know how these consoles stack up against PC hardware.
 
I guess I must have missed all the posts you and other forum members here made saying that they expect the PS5 to perform around the 2080 level - and of course, there's just too many examples for you to link to.

Because I never did. I was predicting systems under 9TF, and any possibility of RT wasn't even in my thinking process at the time. But once actual sources and certain developer mumbles became available, a clearer picture of these systems became apparent: systems with RT, performing close to an RTX 2080. And thanks to DF, especially Alex, we have a clearer picture of how these systems actually perform with RT, which is between an RTX 2060 and 2070 (maybe a 2070 S tops).
 
Because I never did. I was predicting systems under 9TF, and any possibility of RT wasn't even in my thinking process at the time. But once actual sources and certain developer mumbles became available, a clearer picture of these systems became apparent: systems with RT, performing close to an RTX 2080. And thanks to DF, especially Alex, we have a clearer picture of how these systems actually perform with RT, which is between an RTX 2060 and 2070 (maybe a 2070 S tops).
There is no doubt that once devs get to grips with the HW, à la Insomniac, RT results will be very good, but we should not forget that going by their own words (Sony and MS), heavy use of RT is not going to be the focus this gen. Nvidia dedicated more HW to RT than AMD has; what AMD has done is speed up RT in HW by a monumental amount just by using a small extension to the TMUs, but it's still not going to match dedicated HW from Nvidia. Same with the XSX shaders with INT4/INT8 - they are at a lower level than the 2060.
 
Still too early to draw conclusions.
The first "real" next-gen comparison may be Cyberpunk.
Still, it's nice to see the PS5 doing well for now. Can't wait to get mine at the end of the week, hopefully!
Cyberpunk is still a cross-gen game. We already have next-gen-only cross-platform games in the form of the newer sports titles, which are not the same as their previous-generation versions, and from what I've seen there's pretty close to parity in those titles.

Despite the RAM bandwidth differential, there is still bandwidth compression in the ROPs plus the higher clock rate to contend with.
So I got busy with real-world stuff and stepped away from this forum for a bit, and I'm sure I missed some things. Do we know the ROP and TMU counts for Series S, X and PS5? I asserted a few months ago that we may have a situation where both consoles have the same number of ROPs, so while the Xbox has a compute and bandwidth advantage, the PS5 might have a higher fillrate. But some websites, none that I would consider trustworthy, are reporting Series X as having 80 ROPs while the PS5 has 64; others have them both at 64. Even at 80 ROPs, Series X essentially equals PS5, depending on the clock speed, of course.

To dive back into the pre-launch FUD wars, I can admit that I may have contributed to some of that, especially early on. But later I came to the conclusion that, at least in the long term, these consoles are going to be GPU-limited, and any variability in GPU clocks is probably going to be minimal. That said, launch games are almost never great indicators of a system's real limits. And I think this generation, more than any other, we are going to have an extended cross-generation period. It may be years before we get any real separation in performance between the flagship consoles. Right now we are looking at games running 5% apart at over 100fps. Everyone who got a next-gen console should be happy with that.
 
Cyberpunk is still a cross-gen game. We already have next-gen-only cross-platform games in the form of the newer sports titles, which are not the same as their previous-generation versions, and from what I've seen there's pretty close to parity in those titles.


So I got busy with real-world stuff and stepped away from this forum for a bit, and I'm sure I missed some things. Do we know the ROP and TMU counts for Series S, X and PS5? I asserted a few months ago that we may have a situation where both consoles have the same number of ROPs, so while the Xbox has a compute and bandwidth advantage, the PS5 might have a higher fillrate. But some websites, none that I would consider trustworthy, are reporting Series X as having 80 ROPs while the PS5 has 64; others have them both at 64. Even at 80 ROPs, Series X essentially equals PS5, depending on the clock speed, of course.

To dive back into the pre-launch FUD wars, I can admit that I may have contributed to some of that, especially early on. But later I came to the conclusion that, at least in the long term, these consoles are going to be GPU-limited, and any variability in GPU clocks is probably going to be minimal. That said, launch games are almost never great indicators of a system's real limits. And I think this generation, more than any other, we are going to have an extended cross-generation period. It may be years before we get any real separation in performance between the flagship consoles. Right now we are looking at games running 5% apart at over 100fps. Everyone who got a next-gen console should be happy with that.
They confirmed it with the 116Gpix/s fill rate: 64 ROPs × 1.825GHz.

It has 2 shader engines with 5MB of L2 cache. This is similar to the PS5, which has 2 shader engines with 4MB of L2 cache.

What is interesting is that Navi 21 has 4MB of L2 cache for the full-fat 80CU chip, so the consoles went with more L2 per CU than the top-end PC chips.
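The fill-rate figure is easy to sanity-check: peak pixel fill is just ROPs × clock. A quick sketch (theoretical ceilings, assuming 64 ROPs for both consoles as discussed above):

```python
def gpix_per_s(rops: int, clock_ghz: float) -> float:
    # Peak pixel fill rate: one pixel per ROP per clock, in Gpix/s.
    return rops * clock_ghz

xsx = gpix_per_s(64, 1.825)  # 116.8 Gpix/s, matching the confirmed figure
ps5 = gpix_per_s(64, 2.23)   # ~142.7 Gpix/s at the max clock
```

Which is why, at equal ROP counts, the PS5's clock advantage translates directly into a fill-rate advantage despite the XSX's compute lead.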
 
https://www.resetera.com/threads/vg...-s-frame-rate-comparison.326939/post-51689744

Surprisingly, the normal mode at native 4K in DMC5 SE without RT runs pretty fast on XSX and PS5. It runs better than a non-overclocked 2080 Ti.




[chart: Devil May Cry 5, 3840×2160 benchmark]
 
https://www.resetera.com/threads/vg...-s-frame-rate-comparison.326939/post-51689744

Surprisingly, the normal mode at native 4K in DMC5 SE without RT runs pretty fast on XSX and PS5.




[chart: Devil May Cry 5, 3840×2160 benchmark]
What are the settings for the TechPowerUp bench? Because dropping a few can easily result in a big frame-rate gain.

But what I am impressed by is the number of games pushing 60-120fps even though resolution can vary; we haven't had that since gen 6. What that proves is that there is plenty there for eye candy! I say shoot for 1440p 30fps heh, I want CGI level :)
 
What are the settings for the TechPowerUp bench? Because dropping a few can easily result in a big frame-rate gain.

But what I am impressed by is the number of games pushing 60-120fps even though resolution can vary; we haven't had that since gen 6. What that proves is that there is plenty there for eye candy! I say shoot for 1440p 30fps heh, I want CGI level :)

That was not the final result; it was probably pre-release. But it performs very well on PS5 and XSX. I found the final DMC 5 test:

https://www.techpowerup.com/review/devil-may-cry-5-benchmark-performance-test/4.html

[chart: Devil May Cry 5 final benchmark, 3840×2160]


Better than a 2080 at ultra but not as good as a 2080 Ti - probably around a 2080 Super, which is not bad at all. The next-generation consoles are probably on ultra too, I suppose.

EDIT: A surprising result for the PS5; on paper the XSX seems between a 2080 Super and a 2080 Ti.
 
They confirmed it with the 116Gpix/s fill rate: 64 ROPs × 1.825GHz.

It has 2 shader engines with 5MB of L2 cache. This is similar to the PS5, which has 2 shader engines with 4MB of L2 cache.

What is interesting is that Navi 21 has 4MB of L2 cache for the full-fat 80CU chip, so the consoles went with more L2 per CU than the top-end PC chips.
Is there a source for this? Just want to read more.
 