Yet the talk made no mention of any separate modes. If they actually do have separate modes that will run all PS4/Pro games as they are, Cerny did a poor job of conveying it to the audience, instead giving the impression that they won't even have full backwards compatibility at launch.
There were a number of times where Cerny used the words "expect to see" or "can expect", which I suppose is part of his role as a spokesperson for Sony. At this early stage, it doesn't seem like they have the depth of data to make a material claim about product features like that, and hedging protects them if reality deviates from the expectations given to the public and investors.
Agreed.
Also, I seem to remember that GG were showing off their Decima Engine when they made Horizon, and they had a system where data was streamed in from the HDD based on the camera angles, so it would (try to) load more data as you turned the camera.
Wanna bet Cerny spent a lot more time with them than with other devs - for obvious reasons, that engine is Sony's shining moment this gen - and asked them what they believed would make the biggest impact next gen?
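Very roughly, the idea is something like this (a minimal sketch with made-up names like StreamRequest and priorityFor and an ad-hoc scoring heuristic; the actual Decima streaming system is obviously far more sophisticated):

```cpp
#include <algorithm>
#include <vector>

// Hypothetical illustration of camera-driven streaming priority:
// assets roughly in front of the camera get requested first,
// so turning the camera re-orders (and effectively triggers) loads.
struct Vec3 { float x, y, z; };

static float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

struct StreamRequest {
    int   assetId;
    Vec3  direction;  // unit vector from the camera toward the asset
    float distance;   // distance from the camera
};

// Lower score = load sooner. Being in front of the camera and being close both help.
static float priorityFor(const StreamRequest& r, const Vec3& cameraForward) {
    float facing = dot(r.direction, cameraForward);  // 1 = dead ahead, -1 = behind
    return r.distance * (2.0f - facing);             // ad-hoc weighting for the sketch
}

void sortStreamingQueue(std::vector<StreamRequest>& queue, const Vec3& cameraForward) {
    std::sort(queue.begin(), queue.end(),
              [&](const StreamRequest& a, const StreamRequest& b) {
                  return priorityFor(a, cameraForward) < priorityFor(b, cameraForward);
              });
}
```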
I recall various developers trying virtual texturing or similar methods. I thought the Trials series eventually adopted one variant, and the megatextures from Rage were another.
Partially resident textures as a concept came out around the time of the original GCN, though it seems like even now the vendors are still trying to make the idea stick.
HBCC attempts to migrate data from lower tiers of memory for similar purposes.
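For anyone who hasn't run into the concept, here's a bare-bones sketch of the bookkeeping behind partially resident / virtual textures; the tile size and class names are invented, and a real implementation would back this with GPU page tables or a sparse-texture API rather than a hash map:

```cpp
#include <functional>
#include <unordered_map>
#include <vector>

// Toy model of a partially resident texture: the full texture is divided
// into fixed-size tiles, and only the tiles actually sampled recently are
// kept in video memory. Everything here is illustrative, not a real API.
constexpr int kTileSize = 128;  // texels per tile edge (arbitrary for the sketch)

struct TileKey {
    int mip, tileX, tileY;
    bool operator==(const TileKey& o) const {
        return mip == o.mip && tileX == o.tileX && tileY == o.tileY;
    }
};

struct TileKeyHash {
    size_t operator()(const TileKey& k) const {
        std::hash<int> h;
        return (h(k.mip) * 31 + h(k.tileX)) * 31 + h(k.tileY);
    }
};

class VirtualTexture {
public:
    // Called when the renderer determines a texel region is needed.
    // Returns true if the tile is already resident, otherwise queues a load.
    bool touch(int mip, int texelX, int texelY) {
        TileKey key{mip, texelX / kTileSize, texelY / kTileSize};
        if (resident_.count(key)) return true;
        pendingLoads_.push_back(key);  // streamed in from storage later
        return false;                  // renderer falls back to a coarser mip
    }

    void markLoaded(const TileKey& key) { resident_.insert({key, true}); }

private:
    std::unordered_map<TileKey, bool, TileKeyHash> resident_;
    std::vector<TileKey> pendingLoads_;
};
```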
I guess Oberon will dip below 10TF during regular play with variable frequency introduced, no? Because I am pretty sure they would rather have locked 10TF across all consoles than have it variable.
Even a 2-3% drop will result in the chip running below 10TF, but this looks better on paper, and people will not exactly notice a difference in performance if it drops a few hundred GFLOPS lower.
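As a quick sanity check on those numbers, using the publicly stated 36 CUs and 2.23 GHz cap and the usual 2 FLOPs per lane per clock for FMA:

```cpp
#include <cstdio>

int main() {
    const double cus          = 36;    // PS5 compute units
    const double lanesPerCU   = 64;    // FP32 lanes per CU
    const double flopsPerLane = 2;     // FMA counts as 2 FLOPs per clock
    const double maxClockGHz  = 2.23;  // stated frequency cap

    // 36 * 64 * 2 = 4608 FLOPs per clock; at 2.23 GHz that's ~10.28 TFLOPS.
    double peakTF    = cus * lanesPerCU * flopsPerLane * maxClockGHz / 1000.0;
    double droppedTF = peakTF * 0.97;  // a 3% clock drop

    std::printf("Peak: %.2f TF, after a 3%% drop: %.2f TF\n", peakTF, droppedTF);
    // Prints: Peak: 10.28 TF, after a 3% drop: 9.97 TF
}
```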
This may depend on how representative the scenario Cerny gave was of how the prior generation set its cooling, clock, and power supply levels. If the idea is that the platform as a whole had to make a best guess at what the worst-case power draw might be, and it could leave performance on the table if too conservative or have loud cooling or reliability issues if too aggressive, then something like the PS5's method can arrive at a much better best guess. It could come down to how almost all games on the PS4 Pro don't go over 160W, but one or two have been confirmed to go over. If this were the PS5, the possibility of those one or two games wouldn't cause designers to second-guess their clock targets so much.
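A toy illustration of the contrast, with completely invented constants: a fixed-clock design has to provision cooling and power for the single worst-case title, whereas a power-budget design just walks the clock down when modeled power exceeds the budget:

```cpp
#include <cmath>
#include <cstdio>
#include <initializer_list>

// Hypothetical: dynamic power scales roughly with activity * f * V^2,
// and voltage tracks frequency, so treat power as ~ activity * f^3.
// Every constant here is invented for illustration.
double modeledPowerW(double activity, double clockGHz) {
    return 60.0 + 130.0 * activity * std::pow(clockGHz / 2.23, 3.0);
}

// Power-budget approach: find the highest clock that stays under the budget.
double clockForBudget(double activity, double budgetW, double maxClockGHz) {
    double freqGHz = maxClockGHz;
    while (freqGHz > 0.5 && modeledPowerW(activity, freqGHz) > budgetW)
        freqGHz -= 0.01;  // step down in 10 MHz increments for the sketch
    return freqGHz;
}

int main() {
    const double budgetW = 180.0;            // invented power budget
    for (double activity : {0.5, 0.9, 1.2})  // typical, heavy, pathological load
        std::printf("activity %.1f -> %.2f GHz\n",
                    activity, clockForBudget(activity, budgetW, 2.23));
    // With these made-up numbers only the pathological case drops, to ~2.04 GHz.
}
```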
Wasn't there the notion that temperature has no impact on the clock boosts? Yes, it sounds strange, but that's what was said.
I'd expect it's true within the bounds of whatever ambient temperature range and airflow requirements Sony decides are part of normal operation. A sufficiently capable cooler, with ambient temperature at or below the design boundary, should be able to pull heat out of on-die hotspots quickly enough that there's no need to modify clocks (or only for periods of time that are non-zero but arguably not detectable).
Anything beyond those limits and the system may consider itself to be in a thermal recovery mode, which is not a new concept. In that state, the console is no longer in the range of what it promised to developers, and will probably start logging warnings or warning the user, if it doesn't hit a thermal trip condition and immediately shut down.
It's possible that under the old system, games that were exceeding expected power were doing so in a way that threatened shutdowns or were potentially compromising the longevity of the system.
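The escalation ladder being described is the generic pattern sketched below; the thresholds and names are invented for illustration, and nothing here is specific to Sony's implementation:

```cpp
#include <cstdio>

// Generic thermal escalation ladder, as described informally above.
// All thresholds are invented for illustration.
enum class ThermalState { Normal, Recovery, Trip };

ThermalState evaluateThermals(double dieTempC) {
    if (dieTempC >= 105.0) return ThermalState::Trip;      // emergency shutdown
    if (dieTempC >= 95.0)  return ThermalState::Recovery;  // outside the promised envelope
    return ThermalState::Normal;                           // clocks untouched
}

void handleThermals(double dieTempC) {
    switch (evaluateThermals(dieTempC)) {
        case ThermalState::Normal:
            break;  // within the ambient/airflow envelope: no clock changes
        case ThermalState::Recovery:
            std::printf("warning: thermal recovery, reducing clocks, logging event\n");
            break;
        case ThermalState::Trip:
            std::printf("critical: thermal trip, shutting down immediately\n");
            break;
    }
}
```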
So if the PS5 is running at constant power delivery... then is the fan going to be at "max" the entire time?
I thought it was funny when Cerny mentioned that one of the most demanding cases he saw was the Horizon map screen, with its simpler geometry. It would have been nice if he had said they were going to give developers an easy way to put a lid on the demands of their map and menu screens, rather than just letting the console boost to run an exit prompt at an uncapped frame rate.
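On the developer side, capping those screens can be as simple as something like this (a hypothetical sleep-based limiter holding a map/menu loop to roughly 60 fps; the function names are mine):

```cpp
#include <chrono>
#include <thread>

// Hypothetical frame limiter for low-complexity screens (maps, menus,
// quit prompts) so they don't render at uncapped rates and burn power.
void runMenuLoop(bool& menuOpen) {
    using clock = std::chrono::steady_clock;
    constexpr auto frameBudget = std::chrono::microseconds(16667);  // ~60 fps

    while (menuOpen) {
        auto frameStart = clock::now();

        // renderMenuFrame();  // draw the (cheap) map or menu here

        auto elapsed = clock::now() - frameStart;
        if (elapsed < frameBudget)
            std::this_thread::sleep_for(frameBudget - elapsed);  // cap the rate
    }
}
```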
Some games may not use AVX because the instructions use too much power? That is one of the dumbest things I've heard. SIMD is essential for extracting maximum performance from a CPU.
Depends if you mean in general, or on the PS5 with a Zen 2 processor. In general, there's a lot of variability in terms of which CPUs support what set of extensions, and for some of the broader AVX types there are many CPUs that downclock very abruptly, for relatively long periods of time, just from encountering a single instruction of that type. It's gotten better with more recent cores, particularly with Intel. AMD's processors have generally lagged in adopting the wider widths until they could be handled mostly through reduced boost levels. The PS5's CPU seems to be capped below where such variability would become noticeable in terms of single and all-core boost.
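To make both halves of that concrete, a hedged sketch: the AVX2+FMA path below does eight single-precision multiply-adds per instruction, which is why SIMD matters for throughput, and also why sustained wide-vector work draws enough extra power that some CPUs apply a frequency offset when it runs. The function names are mine, not from any particular codebase.

```cpp
#include <immintrin.h>  // AVX2/FMA intrinsics (compile with -mavx2 -mfma)
#include <cstddef>

// Plain scalar version: one multiply-add per iteration.
void saxpy_scalar(float a, const float* x, float* y, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        y[i] = a * x[i] + y[i];
}

// AVX2+FMA version: eight multiply-adds per instruction. This throughput is
// the appeal of SIMD, and also why the wide vector units draw more power.
void saxpy_avx(float a, const float* x, float* y, std::size_t n) {
    const __m256 va = _mm256_set1_ps(a);
    std::size_t i = 0;
    for (; i + 8 <= n; i += 8) {
        __m256 vx = _mm256_loadu_ps(x + i);
        __m256 vy = _mm256_loadu_ps(y + i);
        _mm256_storeu_ps(y + i, _mm256_fmadd_ps(va, vx, vy));
    }
    for (; i < n; ++i)  // scalar tail for the remaining elements
        y[i] = a * x[i] + y[i];
}
```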
Cerny himself said that the PS5 will run at its boost clock all the time, except when a game hammers the chips too hard.
So I'm confused, are all of these "boost" on the PS5?
1. PS5 default clock (boost except when the power budget limit is hit) running PS5 games
2. PS5 default clock running PS4 games (even more boost than the PS4 Pro running PS4 games). This has only been tested on the top 100 games
3. PS4 games running under the PS4 Pro boost-clock backwards compatibility mode on the PS5. Does this mean the PS5 will run at a lower clock than its normal boost? Or will they cut the CU count to maintain compatibility? Both?
Cerny seems to have only addressed what AMD calls boost.
Running at these high speeds seems to be part of why he says games might experience issues, but he didn't say that was all there was to the situation.
I'm pretty sure he's aware of the Pro's unrelated boost mode, but it's possible he's not committing to it yet.