General Next Generation Rumors and Discussions [Post GDC 2020]

They aren't separate pools; it's the same hardware and bus shared between the two, just slower when you access part of it. So the CPU accessing memory is likely to be slower than on the PS5 and to have a larger impact on overall bandwidth as well. This isn't an optimisation, it's a trade-off to avoid having to spend on 20GB of GDDR6.

Now that I think about it
I am very curious how much is reserved for the OS. MS will want the fast area prioritised so it doesn't affect overall bandwidth too much. Assuming 2GB are reserved for the OS, I presume that will be in the fast area? That means there are 8GB left in the fast area and 6GB in the slower area for games.
A game that requires 9GB will surely access the slower area too, reducing the bandwidth further then?
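To put rough numbers on that: with the announced Series X figures (10GB at 560GB/s and 6GB at 336GB/s on the same shared bus), the effective bandwidth works out to a traffic-weighted harmonic mean. This is only a back-of-envelope sketch, not a model of how the memory controller actually arbitrates:

```python
# Back-of-envelope blended bandwidth for a split-speed memory pool.
# Announced Series X figures: 10GB "fast" at 560 GB/s, 6GB "slow" at 336 GB/s.
FAST_BW = 560.0  # GB/s
SLOW_BW = 336.0  # GB/s

def blended_bandwidth(fast_fraction):
    """Traffic-weighted harmonic mean: time to move 1GB when
    `fast_fraction` of the traffic hits the fast region."""
    slow_fraction = 1.0 - fast_fraction
    time_per_gb = fast_fraction / FAST_BW + slow_fraction / SLOW_BW
    return 1.0 / time_per_gb

# All traffic in the fast pool vs. a 90/10 split:
print(round(blended_bandwidth(1.0)))  # 560
print(round(blended_bandwidth(0.9)))  # 525
```

So even a modest 10% of traffic landing in the slow region pulls the average down noticeably, which is the trade-off being discussed.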
 

I'd suspect the OS reserves to be entirely in the slow region so that it does not affect the GPU too much.
 
Where is the information about the base clock? I've only seen the max 2.23GHz.
I can't find where I saw it, perhaps a corrected error or just my imagination. I guess it wouldn't be a base clock anyway, but I had thought there was a hard minimum.
 
13.5GB for games, and that includes all 10GB of the faster memory, according to Digital Foundry.
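Taking DF's 13.5GB figure at face value, the split works out to a 2.5GB OS reservation and 3.5GB of the slow region left for games; simple arithmetic:

```python
TOTAL = 16.0        # GB of GDDR6 on Series X
FAST = 10.0         # GB at 560 GB/s
GAME_BUDGET = 13.5  # GB available to games (per Digital Foundry)

os_reserve = TOTAL - GAME_BUDGET     # OS lives entirely in the slow region
slow_for_games = GAME_BUDGET - FAST  # slow-region GB games can still use
print(os_reserve, slow_for_games)    # 2.5 3.5
```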
 
It'll lower bandwidth, but not as much as forcing the GPU to access the slower region.

This is an interesting area as it also ties now to the SSD.

Memory used to hold lots of "what if" data; that will be less the case now.

Will the SSD mean system RAM usage is spread across more of its area, or will ray tracing be a total memory thrasher that uses most of the bandwidth but focuses on a very small portion of the data?


Architecture-wise it seems like ESRAM again. Instead of 32MB they have 10GB and it's much faster, and the slow memory is as fast as the Xbox One X's.
 
I'm not sure why people are expecting price parity for XBSX and PS5.

XB1 was 499 USD, but it had a ~150 USD Kinect
PS4 was 399 USD
PS4 pro was 399 USD
XB1X was 499 USD, and came out a year later

So I'm fully expecting something like 399 for PS5, and ~499 for XBSX.
The specs for PS5 seem like a natural fit for a 399-minded console, and XBSX really seems like a continuation of the XB1X mindset. Yes, there is a (relatively) smaller power gap in the PS5/XBSX generation, but the XB1X also had an extra year.
XB1X and PS4/PS4 pro had price tactics that basically worked for both companies, so I don't expect them to deviate that much from past strategies.


The question then is whether the 1.8 or so TF difference is worth a 100 USD / 25% difference in price.
Personally the TF difference feels like what insiders have been hinting all along: that the difference is really negligible.

We're talking about a ~15% difference in performance in an age where pixel counting yields at least 1440p+ or even 1800p+ (PS4 Pro/XB1X) numbers. PS5 and XBSX should at least run in the realm of 2160p, probably easily hitting 60fps and shooting for 120fps.

The PS3/XB360 and PS4/XB1 era was when pixel counting yielded 500p~720p (countable even by amateurs) and 720p~1080p (countable by experienced people). Now it's to the point where pretty much nobody counts them anymore and we rely on DF to just feed us the numbers.
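For reference, the ~15% figure falls out of the announced shader specs (52 CUs at 1.825GHz for XBSX vs 36 CUs at up to 2.23GHz for PS5). Scaling pixel counts linearly with FLOPs, which is an assumption real games won't follow cleanly, a 2160p target on XBSX maps to only slightly under 2000p on PS5:

```python
import math

def teraflops(cus, clock_ghz, lanes=64, flops_per_cycle=2):
    """Peak FP32 TFLOPS: CUs x 64 shader lanes x 2 FLOPs (FMA) x clock."""
    return cus * lanes * flops_per_cycle * clock_ghz / 1000.0

xbsx = teraflops(52, 1.825)  # ~12.15 TF
ps5 = teraflops(36, 2.23)    # ~10.28 TF (at max boost clock)

# If pixel throughput scaled linearly with TF, a 2160p XBSX target
# would map to this vertical resolution on PS5:
ps5_height = 2160 * math.sqrt(ps5 / xbsx)
print(round(xbsx, 2), round(ps5, 2), round(ps5_height))  # 12.15 10.28 1987
```

That kind of gap (2160p vs ~1980p) is exactly the range where nobody can spot it without DF's tooling.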
 
It's ray tracing and those other features that I am interested to see. These might be more apparent than pixel counting.
 

Power difference there doesn't seem to be much more than the GPU difference, no?

It doesn't look realistic to me that a ~15% difference will make certain ray tracing features doable on one console and undoable on another after some tweaked parameters.
 
Thing is, maybe the Xbox SX solution can do that too. Which would be nice of course. Then we have to wait until game engines are built with that in mind. With MS maintaining backwards compatibility for a while, that could be a problem for them.

And you can't always do SSD => RAM transfers; you have to let the GPU/CPU access it too...
 
Things like DXR, VRS etc. RT will be doable on both, but the quality and extent of the implementation may differ. We will see.
There are unknown parameters here: the PS5's variable frequencies, differences in RAM, and a different approach to raising performance make the real performance gap a little harder to understand.
There was zero mention of the extra feature sets MS announced for Series X, and the question is how much further these affect performance. The CPU and GPU frequencies on the PS5 are best-case figures; most likely they will never run that high simultaneously. So the performance gap may be larger or smaller.
 
Because of the different approaches we could have a situation where the faster console (and PCs) is, at the same time, the lowest common denominator, actually limiting what 3rd-party developers can do on PS5. Can't wait to see the actual difference in performance.
 

Judging from the current demos of XB1 games loading in ~6 seconds and Spider-Man in 0.8 seconds, I'd say they're quite far apart and aren't really in the same class.
 
Maybe that's because loading a level within a game isn't the same as switching between games on the fly from the dashboard.....
 
I agree that PS5 definitely has super sweet SSD, but comparing PS4 Pro Spiderman (8s load time) and State of Decay on Xbox One S (with 38s load time) is not apples to apples.
 


35 => 8.5 seconds loading (not switching) going from XB1X to XBSX.

That's what, 1/4 of the time?

PS5 is doing more like 1/10th.
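Sanity-checking those ratios against the numbers quoted in the thread (35s => 8.5s for the XB1X-to-XBSX demo, 8s => 0.8s for Spider-Man):

```python
print(round(35 / 8.5, 1))  # 4.1x faster load on XBSX
print(round(8 / 0.8, 1))   # 10.0x faster load in the PS5 Spider-Man demo
```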
 