PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

I'm not sure I understand the need to fully reserve cores for the OS. On the PS3 it made sense for the Cell because of the LS, I guess, but here what's the reason for not giving access in a non-guaranteed way instead and giving the OS threads high priority? Most of the time the OS will be idle, so it feels like a waste of available resources.

Because if you limit the OS to run on specific cores, it behaves deterministically.
Console games generally start one thread per hardware core and use affinity to keep them there; if the OS can run on any core, it could push one of those threads over a frame even though that wasn't necessary.
It also doesn't pollute the L1 of any non-OS cores, and if the OS requires multiple cores and they are on the same module, it doesn't pollute the L2 of the other module.
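To make the pattern concrete, here's a minimal sketch using standard Linux pthreads (not any console SDK; the six game-visible cores are my assumption based on the "2 cores reserved" rumour): each worker thread gets its affinity fixed to one core, so the scheduler never migrates it and frame timing stays deterministic as long as the OS stays off those cores.

```c
/*
 * Minimal sketch of the "one worker per core, pinned with affinity" pattern,
 * using standard Linux pthreads. This is NOT a console SDK; the 6 game-visible
 * cores (0..5) are an assumption, not a confirmed figure.
 */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

#define GAME_CORES 6

static void *worker(void *arg)
{
    long core = (long)arg;
    /* ...per-core game work (job queue, rendering, audio mix, etc.)... */
    printf("worker running, pinned to core %ld\n", core);
    return NULL;
}

int main(void)
{
    pthread_t threads[GAME_CORES];

    for (long core = 0; core < GAME_CORES; ++core) {
        pthread_attr_t attr;
        cpu_set_t cpus;

        pthread_attr_init(&attr);
        CPU_ZERO(&cpus);
        CPU_SET((int)core, &cpus);   /* restrict this thread to a single core */
        pthread_attr_setaffinity_np(&attr, sizeof(cpus), &cpus);

        pthread_create(&threads[core], &attr, worker, (void *)core);
        pthread_attr_destroy(&attr);
    }

    for (int i = 0; i < GAME_CORES; ++i)
        pthread_join(threads[i], NULL);

    return 0;
}
```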
 
I don't mean letting the OS threads stray to other cores; they'd keep their affinity. I simply mean allowing a game to start a thread on the OS-reserved core. The devs would know it's a "special" thread whose time isn't guaranteed, while the other 7 threads are exclusive and predictable. I was under the impression that's what happened on the 360; wasn't it possible to have a game thread on the OS core?
 
I wouldn't put too much faith in that figure caption. It was not said by the devs. What are the two cores supposed to do during a game?

IMO, one for the OS and its services, and the other for 3rd-party programs that will be accessible during gaming [Netflix, etc.].
 
Interesting analysis of PS4 porting from PC!

It also confirms that the PS4's OS uses 2 cores!

Very nice article! First time I've seen a dev speak so openly about development on one of the next gen platforms.

As for the seeming return of split memory, now it's a matter of deciding if you want to use the L1 and L2 cache or not, which isn't quite the same ;) ...but considering the impact even that has, you can imagine Xbox One's DirectX 11 driver implementation being quite complex as well.
 
The allocation between Garlic and Onion seems to drive the hailed "uniform memory access" straight out of the window :D
You end up with 2 quite different pools, accessible from anywhere... you know, just like the PS3 (albeit the sizes are configurable).
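As a purely hypothetical sketch of what that split looks like from the game's side (the allocator and pool names below are invented for illustration, not the real SDK), the decision boils down to whether the CPU will read the data back (keep it cache-coherent via the Onion-style path) or only write it once for the GPU (push it through the high-bandwidth Garlic-style path with uncached/write-combined CPU access):

```c
/*
 * Hypothetical sketch only: hypothetical_alloc() and the pool names are made up
 * to illustrate the idea of two differently-behaving pools, not any real PS4 API.
 */
#include <stddef.h>
#include <stdlib.h>

typedef enum {
    POOL_ONION,   /* CPU-coherent path: snooped, goes through the CPU caches */
    POOL_GARLIC   /* high-bandwidth GPU path: CPU access is uncached/write-combined */
} mem_pool_t;

/* A real driver would pick a heap with the right page attributes;
 * here the pool choice is only a tag and the call falls back to malloc. */
static void *hypothetical_alloc(size_t size, mem_pool_t pool)
{
    (void)pool;   /* placeholder: pool choice has no effect in this sketch */
    return malloc(size);
}

int main(void)
{
    /* CPU-built data the CPU will also read back: keep it cacheable. */
    void *sim_state = hypothetical_alloc(4 * 1024 * 1024, POOL_ONION);

    /* Write-once GPU resources (vertex/texture data): the CPU streams them out
     * and never reads them, so uncached writes are fine and the GPU gets the
     * full-bandwidth path. */
    void *vertex_buffers = hypothetical_alloc(64 * 1024 * 1024, POOL_GARLIC);

    free(sim_state);
    free(vertex_buffers);
    return 0;
}
```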

Guess "UMA" is meant as " Unified Memory Architecture".. ie 1 pool of memory

Because this is actually better for performance than UMA
 
I don't mean letting the OS threads stray to other cores; they'd keep their affinity. I simply mean allowing a game to start a thread on the OS-reserved core. The devs would know it's a "special" thread whose time isn't guaranteed, while the other 7 threads are exclusive and predictable. I was under the impression that's what happened on the 360; wasn't it possible to have a game thread on the OS core?
There is no "OS Core" on the 360. It reserved 5% of cores 2 and 3, enforced through the DirectX present() call. So when the game was lagging, the OS lagged, that's why you get those horrible juddery experiences sometimes when bringing up the OS overlay.

The PS4 will be running apps, their version of GameDVR, streaming, all sorts of things in their reserved section.
 
There is no "OS Core" on the 360. It reserved 5% of cores 2 and 3, enforced through the DirectX present() call. So when the game was lagging, the OS lagged, that's why you get those horrible juddery experiences sometimes when bringing up the OS overlay.

The PS4 will be running apps, their version of GameDVR, streaming, all sorts of things in their reserved section.

Sounds like PS4 may suffer those "juddery" experiences that 360 suffered too!
 
Please refrain from bringing in versus material. This isn't the thread for it.
 
Interesting analysis of PS4 porting from PC!

It also confirms that the PS4's OS uses 2 cores!

Mod: Copy/pasting entire articles is bad form. A quote of the most pertinent part of an article, accompanied by a link to drive traffic, is the correct netiquette.
That's an excellent article, and it reflects what Cerny said about the PS4 being able to port an AAA PC game in two months.

Now I hope this benefits PC gaming as well, because ports will be based on a much higher common denominator rather than the lowest one.

It's going to be a very interesting new era where the PC and consoles will work in tandem, as my friend Sarah would say, or hand in hand.

Edit: Cjail, your explanation makes sense. Maybe for the most demanding tasks and future-proof projects they will reserve some extra memory in a way that can be dynamically allocated, so games that need more memory are free to claim it using some kind of "special mode". I don't think Sony is going to use just 512MB or 1GB; they are going to utilise more RAM if need be in particular scenarios. I mean 512MB-1GB as fixed memory used by the OS, plus some extra RAM used at the OS's discretion.
 
So now we have confirmation of 2 cores reserved for the OS; next is the RAM. Great article!

I'm not sure I would call that "confirmation". There are too many uncertainties. What did they base that caption on? The Killzone: Shadow Fall reveal demo from earlier this year? It was running on an earlier devkit, so that could've been the reason why it only used 6 cores. Who knows.
 
Maybe for the most demanding tasks and future-proof projects they will reserve some extra memory in a way that can be dynamically allocated, so games that need more memory are free to claim it using some kind of "special mode". I don't think Sony is going to use just 512MB or 1GB; they are going to utilise more RAM if need be in particular scenarios. I mean 512MB-1GB as fixed memory used by the OS, plus some extra RAM used at the OS's discretion.

Well, the Just Add Water CEO said that Sony has, I quote, "already ring-fenced the system memory away from the game memory" (source).

I would exclude dynamic memory allocation given this statement.
 
The services running in the background need some memory but very little CPU power. Nothing justifies reserving 2 cores and making them inaccessible to the game, because in theory those services are just slowly moving data around. The currently announced services use dedicated hardware for both audio and video codecs, and even a dedicated ARM CPU for moving data. So far that includes audio/video chat, gameplay recording, uploading, background installs and updates, using the zlib hardware for decompression, and using display planes to avoid graphics pipeline contention between the game and the OS.

A browser would be an exception, but it wouldn't be in the background; the game would be suspended or in constrained mode while a browser is open (and if not, why?).

On the PS3 side, it looks like the OS lagging was caused mostly by not having display planes (the OS must wait for the laggy game to flip) and by a severe lack of OS memory (the XMB thumbnail cache was dropped when in game). None of this will happen on the PS4.

So if the 2 cores rumor is true, I'm thinking they are reserving much more than they need, and might release some later on, or otherwise they have something planned which we haven't heard about yet.
 
Universal game chat will need some type of OS running in the background at all times, no? So it will require dedicated resources. Same thing with cross-game invites. Some type of OS is going to run constantly, and it's certainly going to need some CPU power at all times to handle the requests.
 
Considering the dedicated audio hardware, cross-game voice chat would probably require few resources.
Cross-game invites already work on the PS3.
 
One of the reasons you see such a dramatic jump in OS CPU usage is that on the XB360 or PS3 a feature was probably part of a library linked to the game, using the game's CPU budget, and now those features are part of the OS reserve.
Kinect is the obvious example on the 360/Xbox; I can't comment on the PS4.
 
So now we have confirmation of 2 cores reserved for the OS; next is the RAM. Great article!

We had that some time ago actually...

FWIW if Richard is stating 2 cores are reserved, he has heard that first hand from someone with direct access to the hardware and probably has confirmation of that from multiple sources.
 
Relating to the discussion:

https://twitter.com/digitalfoundry/status/359270732850659329

It's revealed that the last 7 minutes are kept instead of 15 for gameplay recording. According to another DF tweet (well, basic maths actually), 512MB would be enough to hold 7 minutes of video at 9.75 Mb/s.
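For reference, the back-of-the-envelope maths: 9.75 Mb/s × 7 × 60 s ≈ 4,095 Mb, and 512 MB = 4,096 Mb, so a 512 MB buffer holds almost exactly 7 minutes at that bitrate.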

Using RAM only would make sense because, with all the silent installs and streaming and whatnot, the hard disk would be super busy. However, I'm hoping and expecting a voluntary recording option with much longer caps (for commentary, play-throughs) that records to the hard disk but comes with a performance warning.
 
Relating to the discussion:

https://twitter.com/digitalfoundry/status/359270732850659329

It's revealed that the last 7 minutes are kept instead of 15 for gameplay recording. According to another DF tweet (well, basic maths actually), 512MB would be enough to hold 7 minutes of video at 9.75 Mb/s.

Using RAM only would make sense because, with all the silent installs and streaming and whatnot, the hard disk would be super busy. However, I'm hoping and expecting a voluntary recording option with much longer caps (for commentary, play-throughs) that records to the hard disk but comes with a performance warning.

So 512 MB for the OS and 512 MB for recording - that leaves 7 GB for games!
Couldn't flash storage solve the gameplay recording problem? 512 MB is a lot of GDDR5 RAM which could be used for games!
 
The Shadow Fall gameplay session recorded and uploaded at the February conference is 8 minutes and 30 seconds long.
So either they reduced the total "capture" time since February, or Neil Brown said the wrong number.
 