Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Status
Not open for further replies.
All Will Be Fine!




Though some may have to worry about cats and other pets catching on fire... :runaway:

As long as the pet is blocking the fan noise I will be fine, heck I will even consider duct taping the pet in place! Nah! I love animals don't worry.

I am also a strong believer the PS5 will manage just fine, I even think that we will see both a little bit ahead of each other under different circumstances and games with insignificant on screen results during game play. Only visible under DF microscopes. Let the games begin!!:yes:
 
Well, I don't know about that. I'd say the CPU is actually virtually identical on both: 3.5 vs 3.66GHz?

Developers are free to opt for the 3.8GHz mode (sustained) without SMT, or 3.66GHz (sustained) with SMT. No idea if that choice is dynamic, though; can devs switch between the modes in-game?

VRS and ML haven't been mentioned for PS5, according to DF? Those are RDNA2 GPU features.
 
Well, I don't know about that. I'd say the CPU is actually virtually identical on both: 3.5 vs 3.66GHz? That's under a 5% difference. In the Pro vs X1X case, even with a ~9% faster CPU on the X1X, many games run better on the Pro, even in cases where the resolution is the same or very similar.

I think the CPU (along with the GPU and RAM specs) is actually the only area where the specs are great and expected (we all expected a 3.2GHz CPU). But Sony dropped the ball on RAM bandwidth, again after the Pro (we know AMD 10TF GPUs are heavily bottlenecked with only 448GB/s, and that's without CPU contention), and on the GPU.
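For what it's worth, the bandwidth-per-teraflop gap is easy to put a number on from the publicly announced specs (the XSX figure below uses its 560GB/s fast pool only; this is just back-of-envelope, ignoring CPU contention and the XSX's slower 336GB/s pool):

```python
# Rough bytes-per-FLOP comparison from the publicly stated specs.
# PS5: 448 GB/s shared GDDR6, 10.28 TF FP32.
# XSX: 560 GB/s on the 10 GB fast pool, 12.15 TF FP32.

def bytes_per_tflop(bandwidth_gbs: float, tflops: float) -> float:
    """GB/s of memory bandwidth available per teraflop of FP32 compute."""
    return bandwidth_gbs / tflops

ps5 = bytes_per_tflop(448, 10.28)
xsx = bytes_per_tflop(560, 12.15)
print(f"PS5: {ps5:.1f} GB/s per TF")   # ~43.6
print(f"XSX: {xsx:.1f} GB/s per TF")   # ~46.1
```

So per teraflop the two are closer than the headline numbers suggest, which is part of why the bandwidth complaint applies to both boxes.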

This is Cerny's Krazy Ken moment. Engineering hubris.
The variable clock stuff and the BC situation are the Krazy Ken aspect of PS5; it looks like they are reacting to the MS offering.
 
Developers are free to opt for the 3.8GHz mode (sustained) without SMT, or 3.66GHz (sustained) with SMT. No idea if that choice is dynamic, though; can devs switch between the modes in-game?

VRS and ML haven't been mentioned for PS5, according to DF? Those are RDNA2 GPU features.

VRS was highlighted by AMD as one of the key features of RDNA2, so I would guess it's part of PS5. No idea about the fp16, int8, int4 modifications.
 
Yes 15% is 2160p against 2050p.

Yep. Resolution differences should be negligible. SSD access will probably be the most noticeable difference between the two systems. All of the other graphics stuff can be hidden. Dynamic res, variable rate shading, texture space shading with variable shading rates... just so many ways to hide bottlenecks. Loading times are something that will just feel better on PS5, and there's not really anything you can do to mask the difference.
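As a rough illustration of the "hide it with dynamic res" idea, a minimal frame-time-driven resolution controller might look like this (purely a sketch; the thresholds and step sizes are invented, and real engines use far more sophisticated GPU-time estimators):

```python
# Minimal dynamic-resolution sketch (illustrative, not any engine's actual
# controller): nudge the render scale toward a target frame time.

TARGET_MS = 16.7            # 60 fps frame budget
MIN_SCALE, MAX_SCALE = 0.6, 1.0

def adjust_scale(scale: float, frame_ms: float) -> float:
    """Shrink resolution when over budget, grow it back when under."""
    if frame_ms > TARGET_MS:
        scale *= 0.97       # over budget: drop ~3% per frame
    elif frame_ms < TARGET_MS * 0.9:
        scale *= 1.01       # comfortably under: creep back up
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Example: a heavy frame pushes the scale down, a light frame recovers it.
s = 1.0
s = adjust_scale(s, 20.0)   # over budget -> 0.97
s = adjust_scale(s, 12.0)   # under budget -> creeps back toward 1.0
```

The point is that a few percent of GPU deficit just disappears into the scale factor, which is why it's so hard to spot without pixel counting.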
 
The variable clock stuff and the BC situation are the Krazy Ken aspect of PS5; it looks like they are reacting to the MS offering.

I honestly don't know why they bothered if the clocks only change a few percent. It seems like it's just to make the specs look better. I think it would be better to lower them by a few percent and have them fixed. It'll mean a difference of a few percent in resolution, which no one will notice anyway.
 
I honestly don't know why they bothered if the clocks only change a few percent. It seems like it's just to make the specs look better. I think it would be better to lower them by a few percent and have them fixed. It'll mean a difference of a few percent in resolution, which no one will notice anyway.
Because power scales superlinearly with clock: a small drop in frequency buys a disproportionately large drop in power.
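Under a typical DVFS model, power goes roughly with the cube of clock near the top of the curve (P ∝ f·V², with voltage rising roughly linearly with frequency), so shaving a few percent of clock frees up a lot of power budget. A quick sketch of that approximation:

```python
# Back-of-envelope DVFS power model: P ~ f * V^2, and near the top of the
# frequency curve V rises roughly linearly with f, so P ~ f^3. This is an
# approximation, not measured silicon behaviour.

def relative_power(clock_ratio: float) -> float:
    """Power relative to baseline for a given clock ratio, cubic model."""
    return clock_ratio ** 3

# A ~3% clock drop buys roughly 9% power under this model:
print(f"{(1 - relative_power(0.97)) * 100:.1f}% power saved")  # ~8.7%
```

That asymmetry is the whole argument for variable clocks: the "couple of percent" of frequency you give back is worth far more than a couple of percent of thermal headroom.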
 
2160p vs 1800p is a 44% difference in resolution (assuming the same settings).
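The percentage comes straight from the pixel counts; for a fixed 16:9 aspect ratio the maths reduces to the square of the height ratio:

```python
# Pixel-count ratio between two 16:9 resolutions. The "44%" figure comes
# from comparing total pixels, which scale with the square of the height.

def pixel_ratio(height_a: int, height_b: int) -> float:
    # For a fixed 16:9 aspect ratio, width = height * 16/9, so the
    # pixel-count ratio reduces to (height_a / height_b) ** 2.
    return (height_a / height_b) ** 2

print(f"{(pixel_ratio(2160, 1800) - 1) * 100:.0f}%")  # 44%
print(f"{(pixel_ratio(2160, 2050) - 1) * 100:.0f}%")  # ~11%, not 15%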

How do you reach that conclusion?


In fact, Mark Cerny highly praises the narrow-and-fast approach. How do you compare PS5 and XSX game performance
if the PS5 GPU runs at 2.23GHz with a faster front-end?

Yes 15% is 2160p against 2050p.

My mistake. I've just been casually using the general difference between the PS4 Pro and X1X.

I honestly wouldn't be surprised if the piss-poor memory bandwidth ends up impacting the resolution by more than 15%, though.

The variable clock stuff and the BC situation are the Krazy Ken aspect of PS5; it looks like they are reacting to the MS offering.

You're seeing faces in clouds.

I honestly don't know why they bothered if the clocks only change a few percent. It seems like it's just to make the specs look better. I think it would be better to lower them by a few percent and have them fixed. It'll mean a difference of a few percent in resolution, which no one will notice anyway.

I suspect that it's because their cooling solution can handle it, and the need to drop GPU performance at all is predicted to only happen when the CPU is being hammered.
 
The variable clock stuff and the BC situation are the Krazy Ken aspect of PS5; it looks like they are reacting to the MS offering.
This is what I thought too, but then I figured that would have been too much of a risk, and they wouldn't have had a proper cooling solution ready by now. They would have been in a panic trying to fix the shitshow and the possibility of the console burning out after long use. Leaks of an expensive cooling solution came a month ago (or was it two?), and that's not a simple thing to do considering the form factor was defined earlier.
 
Well, I don't know about that. I'd say the CPU is actually virtually identical on both: 3.5 vs 3.66GHz? That's under a 5% difference....
Yes. I was just pointing out that any advantage XBSX has from the CPU will apply to cross-platform as well as exclusive titles, whereas Sony's advantage is more likely only for exclusives. CPU utilisation will scale naturally on modern engines, whereas first-gen SSD-streaming titles will likely just cap at the lowest common denominator.
 
Again though, I don't see the point of it. It can only add complication, and for what?

Does it add complications, though? I suspect there was a mandate from on high not to increase the CU budget above 36, and to push clocks as high as possible instead. They achieved a solution that could allow them to maintain 2.23GHz, but knew there was scope in the future for the CPU to hamper that. So they've played it safe and implemented a system to allow the GPU to drop its performance in such instances.
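A toy sketch of how such a shared-power-budget scheme could work (loosely in the spirit of the SmartShift-style approach Cerny described; the wattage numbers, the policy, and gpu_clock itself are all invented for illustration):

```python
# Illustrative fixed-power-budget arbitration. All figures are invented;
# this is not Sony's actual implementation.

TOTAL_BUDGET_W = 200          # hypothetical shared SoC power budget
GPU_MAX_HZ = 2.23e9
GPU_MIN_HZ = 2.0e9

def gpu_clock(cpu_draw_w: float) -> float:
    """Give the GPU whatever power the CPU leaves; scale clock to fit.

    Uses the cubic power~clock approximation, so a modest power deficit
    costs only a small amount of frequency.
    """
    gpu_budget = TOTAL_BUDGET_W - cpu_draw_w
    gpu_full_load_w = 160.0   # invented: GPU draw at max clock
    if gpu_budget >= gpu_full_load_w:
        return GPU_MAX_HZ
    ratio = (gpu_budget / gpu_full_load_w) ** (1 / 3)
    return max(GPU_MIN_HZ, GPU_MAX_HZ * ratio)

# Light CPU load: full 2.23 GHz. Heavy CPU load: a few percent lower.
print(gpu_clock(30))   # max clock
print(gpu_clock(55))   # slightly reduced
```

Run deterministically against workload rather than temperature, a scheme like this gives every console identical behaviour, which is presumably the whole appeal.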

I'm interested to see how it fares. It might fare horribly. Hopefully it'll fare well, as it's the platform on which I'll be gaming.

I sort of agree with you though: I would still have much preferred 40-48 CUs @ a solid 2.1GHz or something.
 
@mrcorbo I agree about giving developers tools to track and debug latency. It's one of the most critical aspects of gaming, graphics and everything else aside. Games have to feel good and responsive. Unfortunately for both XSX and PS5, I'm expecting a lot of 30fps titles, which means I have no interest in playing them. I hope some of this stuff comes over to the PC side.
 
Again though, I don't see the point of it. It can only add complication, and for what?

Because when the systems are really being pushed in the longer term, it's not just going to be the occasional "couple of percent" drop that you're talking about.

I think the gains from boosting - and the drops from being loaded - are going to become more and more substantial over time.

You don't get owt fer nowt!
 
https://www.chiphell.com/forum.php?mod=viewthread&tid=2201057&page=2&mobile=2#pid44551026

From Zoo; I think it's AquariusZi, but on Chiphell.

PS5 clocks very high, and therefore the price is high and yields are bad. A move to counter the 12TF XSX.
But what was Sony thinking MS would launch? They deserve the bad marketing they are getting. The APU is a fudge, with no custom development in the CPU or GPU that could at least make us think it could punch above its weight. The only thing they paid for was an I/O chip, useless for 90% of games compared to MS's one.
 
But what was Sony thinking MS would launch? They deserve the bad marketing they are getting. The APU is a fudge, with no custom development in the CPU or GPU that could at least make us think it could punch above its weight. The only thing they paid for was an I/O chip, useless for 90% of games compared to MS's one.

Not much love in Rio?:rolleyes:
 
If the XSX SSD solution isn't fast enough for that (a previous-gen game), then PS5's won't be either for true next-gen games. Most likely it's because Gears 5 isn't tailored for those speeds, much like most PC games on NVMe right now.

The Spider-Man 0.8-second fast travel and high-speed traversal through the city say otherwise.
 