Current Generation Hardware Speculation with a Technical Spin [post launch 2021] [XBSX, PS5]

The SSD is critical. I/O into memory is the slowest part of the pipeline for just about everything. Compute has scaled at a significantly faster pace than bandwidth, so compute often sits idle waiting on data.
Moving to SSDs, and ideally increasing bandwidth, was an absolutely critical part of moving forward into the future.
 

Everything had to align: faster NVMe, faster CPU, faster GPU, etc.
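As a rough back-of-envelope illustration of that bottleneck, here is a small sketch comparing how long a 1 GB chunk of asset data takes to read at different raw speeds. The HDD and SATA figures are generic ballpark numbers and the NVMe figures are the consoles' quoted raw rates, so treat the whole thing as illustrative rather than measured.

```cpp
// Back-of-envelope: time to read 1 GB of assets at different raw read speeds.
// HDD/SATA numbers are generic ballpark figures; the NVMe numbers are the
// quoted raw (uncompressed) rates for the two consoles.
#include <cstdio>

int main() {
    struct Device { const char* name; double mbPerSec; };
    const Device devices[] = {
        {"Last-gen HDD (~100 MB/s)",  100.0},
        {"SATA SSD (~500 MB/s)",      500.0},
        {"XSX raw NVMe (~2.4 GB/s)", 2400.0},
        {"PS5 raw NVMe (~5.5 GB/s)", 5500.0},
    };
    const double chunkMB = 1000.0;   // 1 GB of asset data
    for (const Device& d : devices) {
        std::printf("%-26s -> %5.2f s to read 1 GB\n", d.name, chunkMB / d.mbPerSec);
    }
    return 0;
}
```

At HDD speeds that read takes ten seconds and compute just waits; at the NVMe rates it drops to a fraction of a second, which is the whole point of moving the I/O tier up.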
 
Do you think the PS5 has functionality similar to SFS?
If yes, why does the PS5 need such SSD speed?
And if not, it could actually have a significant impact on the difference in image quality and performance between the consoles.
 
I'm sure it has something akin to Sampler Feedback, but not Sampler Feedback Streaming.
I'm not sure it will have much of an impact on the quality of graphics or textures; it should just make RAM usage more efficient.
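To make the "more efficient RAM usage" point concrete, here is a toy model of what feedback-driven streaming buys you. This is not the D3D12 Sampler Feedback API, just a made-up sketch: the feedback says which mip levels were actually sampled on screen, and only those stay resident. All the numbers (texture size, sampled mips, 1 byte per texel) are invented for illustration.

```cpp
// Toy model of the idea behind Sampler Feedback Streaming: the GPU records
// which mip level each screen region actually sampled, and the streamer only
// keeps those mips resident instead of the whole chain. Not the D3D12 API.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    const int mipCount = 12;                        // a 2048x2048 texture's mip chain
    // Pretend the feedback map reported these minimum sampled mips for a few
    // screen tiles (values invented for illustration).
    const std::vector<int> sampledMinMips = {4, 5, 6, 7, 2, 5, 6, 4};
    const int neededMip =
        *std::min_element(sampledMinMips.begin(), sampledMinMips.end());

    auto mipBytes = [](int mip) {
        const long long dim = 2048 >> mip;          // per-mip dimension
        return dim * dim;                           // assume 1 byte/texel for simplicity
    };
    long long fullChain = 0, residentOnly = 0;
    for (int m = 0; m < mipCount; ++m) {
        fullChain += mipBytes(m);
        if (m >= neededMip) residentOnly += mipBytes(m);
    }
    std::printf("Full chain: %lld KiB, resident from mip %d up: %lld KiB (%.1f%%)\n",
                fullChain / 1024, neededMip, residentOnly / 1024,
                100.0 * residentOnly / fullChain);
    return 0;
}
```

Even in this crude model, keeping only the sampled tail of the mip chain cuts the resident footprint to a small fraction of the full texture, which is why it mostly shows up as a memory-efficiency win rather than an image-quality one.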
 
Nah, it will probably be very similar at the end of the generation, as a 20-25% theoretical compute/RT intersection/bandwidth advantage will not increase magically over time.
It depends on how hardware support for ML, VRS, Mesh Shaders and SFS will affect the difference in performance between the consoles, and on how multiplatform developers will use it.

Also, we do not know how much the PS5's GPU frequency drops when the CPU is used at full throttle, or how multiplatform developers will use SmartShift.
According to Mark Cerny, they could not get the GPU to run stable at 2.0 GHz and the CPU at 3.0 GHz at the same time.
+ There was a questionable leak, which may well be close to the truth:
[attached image: leaked table allegedly showing PS5 CPU/GPU clock combinations]
 
lmao, it's from the Xbox Discord and has nothing to do with the truth ;d
 
+ There was a questionable leak, which may well be close to the truth:
[attached image: leaked table allegedly showing PS5 CPU/GPU clock combinations]
If that is correct then I don't even want to know what the XSX's actual clock speed is, since its performance is similar to the PS5's. Yes, the XSX clock is fixed, thus making that post questionable.
Also, "According to Mark Cerny, they could not get the GPU to run stable at 2.0 GHz and the CPU at 3.0 GHz at the same time."... I don't remember Cerny saying something like that. Is there a source for this?

edit: just think for a moment... If the XSX can run at 3.6/1.8 at fixed clocks, why would the PS5 need to SmartShift to 3.2/1.8? Why 3.5/1.2?

OT
This is why I've been posting here less: there is too much noise because of the console launch. Even if there was a post like this, nobody would like that post because it is questionable, but here we are, someone actually liked it. I just don't understand.
 
They can both run at max frequency during normal usage; SmartShift is there to downclock one or the other when it is not fully used, to reduce power usage.

In the worst-case scenario, when the power envelope is exceeded, it will downclock.
On the XSX side, since it can't downclock, it will throttle.
 

Since we're still new to the generation and most titles are still cross-platform, it's unknown what normal usage will be like over the course of the generation.

It's more accurate to just say that as long as a game's code doesn't cause the PS5 to exceed its power limit, it can operate at max frequency. However, if power use exceeds the limit, then one or the other will have to downclock.

The hope here is that this will encourage developers to find more power-efficient, rather than merely time-efficient, ways of doing things. But if a developer wants or needs to really push either the CPU or GPU, they have that option if they don't mind reducing the frequency of the GPU or CPU.

Regards,
SB
 
On the XSX side, since it can't downclock, it will throttle.
Could you dig up where you heard that?

As far as I'm aware, I don't remember Xbox ever saying it will throttle.
I.e. if it can't run at its fixed frequency for some reason, then it's operating outside of its window and will probably exit the game or shut down.
 
On the XSX side, since it can't downclock, it will throttle.
It can downclock, but it only does so when there is nothing happening, to save power.

As for downclocking under load, it would shut down before doing that, IIRC, just like all other consoles. Just expect your XSX to get really hot first.
 
They can both run at max frequency during normal usage; SmartShift is there to downclock one or the other when it is not fully used, to reduce power usage.
...
Sorry, that is not exact enough. SmartShift is there to reduce the power envelope of one component and deliver that power to the other, so the SoC has a more or less stable power envelope. Power is only saved if both components are idle most of the time. This is also what we've seen so far in power-consumption tests: the PS5 is more or less stable in the ~200W area, while the Xbox Series X sits in a 150-175W area but can go up to over 200W (I think Gears was one of those games).
But the data is a bit incomplete so far, because power-consumption tests are not done very often. @Dictator, you and your team should really add this to multiplatform tests.
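To make the shared-envelope idea concrete, here is a toy model: one SoC power budget, with whatever one side doesn't need available to the other, and clocks only dropping when the combined demand exceeds the budget. The 200W budget, the wattage inputs and the assumption that frequency scales linearly with granted power are all invented for illustration; only the 3.5 GHz / 2.23 GHz maximums are real PS5 figures.

```cpp
// Toy model of a SmartShift-style shared power budget: CPU and GPU draw from
// one SoC envelope, and clocks only drop when combined demand exceeds it.
// Budget, demands and the power->frequency scaling are invented.
#include <cstdio>

struct Clocks { double cpuGHz, gpuGHz; };

Clocks allocate(double cpuDemandW, double gpuDemandW) {
    const double budgetW   = 200.0;   // assumed shared SoC budget
    const double cpuMaxGHz = 3.5;     // PS5 CPU max clock
    const double gpuMaxGHz = 2.23;    // PS5 GPU max clock

    const double total = cpuDemandW + gpuDemandW;
    if (total <= budgetW)             // within budget: both run at max frequency
        return {cpuMaxGHz, gpuMaxGHz};

    // Over budget: grant each side a proportional share and (very crudely)
    // assume frequency scales with the power it is granted.
    const double scale = budgetW / total;
    return {cpuMaxGHz * scale, gpuMaxGHz * scale};
}

int main() {
    const Clocks light = allocate(40.0, 150.0);   // fits in the budget
    const Clocks heavy = allocate(80.0, 160.0);   // exceeds the budget
    std::printf("Light CPU load: CPU %.2f GHz, GPU %.2f GHz\n", light.cpuGHz, light.gpuGHz);
    std::printf("Heavy CPU load: CPU %.2f GHz, GPU %.2f GHz\n", heavy.cpuGHz, heavy.gpuGHz);
    return 0;
}
```

The point the sketch captures is the one above: the budget itself stays roughly fixed, and frequency is the adjustable variable when the combined demand runs past it.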
 
That's not true either. In gameplay it usually fluctuates between 175W and 195W. The 200W figure is reached in specific scenes, mostly in cutscenes or in one specific spot of the map.
 
Also "According to Mark Cerny, they could not get the GPU to run stable at 2.0 Ghz and the CPU at 3.0 Ghz at the same time."... I don't remember Cerny said something like that. Is there a source for this?
«Running a GPU at 2 GHz was looking like an unreachable target with the old fixed frequency strategy.»
«Similarly running the CPU at 3 GHz was causing headaches with the old strategy.»

If XSX can run at 3.6 1.8 at fixed clock, why PS5 need to smart shift to 3.2 1.8? Why 3.5 1.2?
Because they have different architectures, designs and cooling systems.
MS spent a lot of time to make the CPU run stable at 3.6/3.8 Ghz. And this is the hottest point on the chip.

If that is correct then I don't even want to know what XSX actual clock speed since its performance is similar to PS5. Yes, XSX clock is fixed, thus making that post questionable.
For most pastgen or crossgen games the 2.2 Ghz GPU and 2.0 Ghz CPU (with multithreading) mode should be enough to double the frame rate (and improve the graphics).
+ xbox has some problems with the toolset at the moment
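The underlying reason a small clock reduction buys back a disproportionate amount of power (and why a variable-frequency strategy makes clocks like these reachable at all) is the usual CMOS dynamic-power relation, roughly P ∝ C·V²·f, with voltage able to come down alongside frequency near the top of the curve. A quick illustrative calculation, where the voltage scaling at each operating point is made up:

```cpp
// Rough sketch: dynamic power scales roughly as P ~ C * V^2 * f, so dropping
// clock slightly (and voltage with it) reduces power much faster than it
// reduces performance. The voltage scale factors below are invented.
#include <cstdio>

int main() {
    const double baseFreqGHz = 2.23;    // PS5 GPU max clock
    const double baseVolt    = 1.0;     // normalized voltage at max clock (assumed)

    // Example operating points: {frequency scale, assumed voltage scale}.
    const double points[][2] = {{1.00, 1.00}, {0.98, 0.97}, {0.95, 0.93}};
    const double basePower = baseVolt * baseVolt * baseFreqGHz;   // C folded out

    for (const auto& p : points) {
        const double v = baseVolt * p[1];
        const double power = v * v * baseFreqGHz * p[0];
        std::printf("%3.0f%% clock -> ~%3.0f%% power\n",
                    p[0] * 100.0, 100.0 * power / basePower);
    }
    return 0;
}
```

With these made-up voltage steps, a 2% clock drop already saves roughly 8% of the dynamic power, which is in the same spirit as Cerny's point in the Road to PS5 talk (IIRC) that a couple of percent of frequency buys back around ten percent of power.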
 
The PS5 is more or less stable in the ~200W area, while the Xbox Series X sits in a 150-175W area but can go up to over 200W (I think Gears was one of those games).
...

The only place I found my PS5 hovering around ~200W is Control's photo mode. In most gameplay (Control and others), it's at ~140-175W power consumption.
 
There it is. We know (from a guy who actually ran that scene through his profiler) that Control's photo mode is very heavily GPU-limited, like almost all of the scenes in which the PS5 is known to reach its max power consumption (cutscenes).
 
The only place I found my PS5 hovering around ~200W is Control's photo mode. In most gameplay (Control and others), it's at ~140-175W power consumption.
OK, I was indeed not exact enough :). Sure, the PS5 can have lower power consumption if the CPU & GPU are limited, e.g. by a frame-rate cap; then those components sit idle and need less power. E.g. Control is a last-gen game: yes, it now has an RT implementation, but the actual rasterizer work didn't receive much of an update, so it is limited to 30fps (except for photo mode, where we can then see what the hardware is capable of in that game).
The optimal case for future console games would be an uncapped frame rate (above 60) + VRR (once VRR is available).
 
It won't change the results. AFAIK most of the max power-consumption measurements (~205W) have been done at a locked 60fps and 30fps (in Miles Morales cutscenes). It's actually not only about the frame rate output to the screen; in most cases that max power consumption comes from what is being (uselessly) rendered by the GPU.
 

Just tested Destruction AllStars, and in the arena-entry scene power spikes to 225W just before actual gameplay, so it's not limited to 200W.
 
I don't know why the PS5 would be limited to ~205W at the wall (~175W DC) if it has a 350W PSU; the cooling fan seems to have a lot of headroom without becoming loud, and it looks like the 16-phase VRM could support a lot more than that.

They do need to set aside some 20-30W DC for the VR headset, but the console could probably still pull 300W at the wall without any problems.
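For what it's worth, a quick sanity check on those wall vs DC numbers, assuming the gap between them is just PSU conversion efficiency:

```cpp
// Sanity check on the figures above: ~205W at the wall vs ~175W DC implies a
// PSU efficiency of roughly 85%, and the same efficiency maps a hypothetical
// 300W wall draw to the DC power actually available to the board.
#include <cstdio>

int main() {
    const double wallW = 205.0, dcW = 175.0;   // figures quoted above
    const double efficiency = dcW / wallW;
    std::printf("Implied PSU efficiency: %.1f%%\n", efficiency * 100.0);
    std::printf("300W at the wall -> ~%.0fW DC at the same efficiency\n",
                300.0 * efficiency);
    return 0;
}
```

That ~85% figure is plausible for a console PSU, so the quoted wall and DC numbers are at least self-consistent.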
 