PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)


Ubisoft said, I quote, "At the lowest level there's an API called GNM. That gives you nearly full control of the GPU."
 
My indie dev friend said his devkit, which is an earlier model, was 10% slower than the latest versions. That's got to be on the hardware level. Improvements to the system software and such could improve things a little more, on top of the developers' experience with extracting better utilisation of new hardware.

This 10% fits with going from a 1.6 GHz CPU to a 1.75 GHz one.

Or 1.45 GHz on early silicon to 1.6 in the final HW. I only have a relative number.
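As a quick sanity check, both rumoured clock bumps mentioned above land close to the reported "about 10%". The figures below are the thread's own rumoured numbers, not confirmed specs:

```python
# Percentage gain of a newer CPU clock over an older one, using the
# two speculative pairs from the thread (1.6 -> 1.75 GHz, 1.45 -> 1.6 GHz).

def speedup_pct(old_ghz: float, new_ghz: float) -> float:
    """Percentage gain of the newer clock over the older one."""
    return (new_ghz / old_ghz - 1.0) * 100.0

print(f"1.60 -> 1.75 GHz: +{speedup_pct(1.60, 1.75):.1f}%")  # ~9.4%
print(f"1.45 -> 1.60 GHz: +{speedup_pct(1.45, 1.60):.1f}%")  # ~10.3%
```

Either jump is consistent with a devkit being "about ten percent slower", so the clock numbers alone can't distinguish the two scenarios.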

Was your dev friend talking about 10% overall, or just 10% for the CPU? Because if he was talking about overall, that small CPU clock boost wouldn't make a 10% difference.
 
LOL, how can you take the word of someone from Microsoft as confirmation of the PS4 specs when he had to go and ask questions about the Xbox One specs before he posted on NeoGAF?

I'm not "taking him at his word"; I'm just noting he is comparing to a 1.6/800 PS4, and thinking that, given his position, he may have some insight into this regardless of what he says about it. (The last thing he probably wants to be doing is giving out Sony's clock speeds for them, so even if he knows, he's going to say he doesn't no matter what.)

He may not.
 
I was also under the impression that PS4 would need some kind of GPU reserve, but my friends with technical knowledge (not 'insider' knowledge!) stated that this likely wasn't the case, unlike the Xbox One, as it isn't doing any kind of 'Snap' feature or whatnot.

They said that the PS4 devs should have 100% access to the whole GPU.

Of course, if anybody could clarify/correct this I would greatly appreciate it.

Personally, I don't agree. The 360 did not have Snap, yet it had a GPU reserve.

I assume all consoles will have that, for instant pull-up of the guide etc. Others without insider knowledge will say Sony consoles don't have this; I imagine the PS4 does.

Or 1.45 GHz on early silicon to 1.6 in the final HW. I only have a relative number.

1.45 at any point seems highly unlikely given the 1.6 in the old Killzone slides. However, I'd think 10% could easily be accounted for by SDK optimisation.
 
I'm not "taking him at his word"; I'm just noting he is comparing to a 1.6/800 PS4, and thinking that, given his position, he may have some insight into this regardless of what he says about it. (The last thing he probably wants to be doing is giving out Sony's clock speeds for them, so even if he knows, he's going to say he doesn't no matter what.)

He may not.

Considering the lack of insight he has into their own machine (he had to go and ask questions; nothing wrong with that, considering he is non-technical), I really, really doubt it.
 
The pretext for avoiding HTML in interfaces was performance on mobile ARM cores (and we're not talking about the high end so much as those horrid MediaTek CPUs).

On a desktop-class browser/toolkit it should do fine. One word of caution, though: Sony doesn't always do the best embedded WebKit browsers (PSP/PS3/Vita).

PSP doesn't have a WebKit browser.

PS Vita has a pretty good one now. At launch it had redraw issues, but they seem to have been fixed. Safari and Chrome are better, but the OS doesn't reserve resources exclusively for games there.
 
Was your dev friend talking about 10% overall, or just 10% for the CPU? Because if he was talking about overall, that small CPU clock boost wouldn't make a 10% difference.
He said, "it's an older devkit so it's not as fast as the current ones. It's about ten percent slower."
 
I was also under the impression that PS4 would need some kind of GPU reserve, but my friends with technical knowledge (not 'insider' knowledge!) stated that this likely wasn't the case, unlike the Xbox One, as it isn't doing any kind of 'Snap' feature or whatnot.

They said that the PS4 devs should have 100% access to the whole GPU.

Of course, if anybody could clarify/correct this I would greatly appreciate it.

According to VGLeaks, the GPU has a high-priority queue for VSHELL (supposedly the PS4 XMB equivalent). So the OS should be able to grab GPU cycles at any time.

If they reserve CU(s) for the OS, they should be running some on-screen or GPU compute tasks all the time. I can't think of anything off the top of my head at the moment.
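To illustrate how a high-priority queue differs from a hard CU reservation, here is a deliberately toy scheduler sketch. All names, job costs and budgets are invented for illustration; nothing here reflects the actual PS4 scheduler:

```python
import heapq

# Toy model: one shared pool of GPU cycles per frame. OS (VSHELL) work goes
# into a high-priority queue and drains first; whatever remains goes to the
# game. No CUs are permanently walled off -- when the OS submits nothing,
# the game gets 100% of the pool.

def run_frame(cycles_budget, os_jobs, game_jobs):
    """Return (os_cycles_used, game_cycles_used) for one frame."""
    queue = [(0, c) for c in os_jobs] + [(1, c) for c in game_jobs]
    heapq.heapify(queue)  # priority 0 (OS) drains before priority 1 (game)
    used = {0: 0, 1: 0}
    while queue and cycles_budget > 0:
        prio, cost = heapq.heappop(queue)
        spend = min(cost, cycles_budget)
        used[prio] += spend
        cycles_budget -= spend
    return used[0], used[1]

# With no OS work pending, the game sees the whole budget:
print(run_frame(100, [], [60, 60]))    # (0, 100)
# When the OS needs cycles, it pre-empts game work rather than
# drawing from a fixed reserved slice:
print(run_frame(100, [15], [60, 60]))  # (15, 85)
```

The point of the sketch: a priority queue only costs the game cycles when the OS actually submits work, whereas a reserved CU would be unavailable to the game even when idle.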
 
According to VGLeaks, the GPU has a high-priority queue for VSHELL (supposedly the PS4 XMB equivalent). So the OS should be able to grab GPU cycles at any time.

If they reserve CU(s) for the OS, they should be running some on-screen or GPU compute tasks all the time. I can't think of anything off the top of my head at the moment.

Right, so it's not like it's actually reserving hardware resources/cycles so much as it is being allowed to inject itself at any moment.

This is in line with what my indie dev buddy said.

Thanks, I was kind of fuzzy on the issue. Provided some clarity =D
 
Right, so it's not like it's actually reserving hardware resources/cycles so much as it is being allowed to inject itself at any moment.

This is in line with what my indie dev buddy said.

Thanks, I was kind of fuzzy on the issue. Provided some clarity =D

Need to see how their UI works first. I don't know of anything that's running at the OS level and needs constant GPU cycles.
 

Yep, they patched the OS to allow the browser to coexist with the game now. Background music playback or Skype is also possible. But quite often the OS will clear the browser cache, so you have to reload the pages when you switch back to the browser from a game.

When things get intensive, the OS will ask to unload other apps too. In a general-purpose OS like iOS, everything is treated equally.
 
Right, so it's not like it's actually reserving hardware resources/cycles so much as it is being allowed to inject itself at any moment.

This is in line with what my indie dev buddy said.

Thanks, I was kind of fuzzy on the issue. Provided some clarity =D
Injecting itself at high priority at any time could be a bad thing if there were no reservation.
So that doesn't actually mean there isn't an OS reservation; it could mean that it's done on a percentage basis (e.g. 10%) rather than by sectioning off CUs (e.g. 2 CUs).
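To make that distinction concrete, here is a quick back-of-the-envelope comparison. The 18-CU figure is the commonly rumoured PS4 GPU count, used purely as an assumption:

```python
TOTAL_CUS = 18  # rumoured PS4 CU count -- an assumption, not a confirmed spec

# A percentage-based reservation taxes all CUs a little...
pct_reserved = 0.10
cus_equivalent = TOTAL_CUS * pct_reserved       # 1.8 CUs' worth of throughput

# ...while sectioning off whole CUs removes them entirely.
whole_cu_reserve = 2
pct_equivalent = whole_cu_reserve / TOTAL_CUS   # ~11.1% of the GPU

print(f"10% time-slice reservation ~= {cus_equivalent:.1f} CUs of throughput")
print(f"Sectioning off 2 whole CUs ~= {pct_equivalent:.1%} of the GPU")
```

So "10%" and "2 CUs" are close but not identical claims, and a percentage-based slice can be spread across the whole chip rather than pinned to specific units.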

I can think of reasons why they would reserve GPU resources even if they're not doing Snap.
It's analogous to reserving RAM even though they may not need all that much right now;
what they may want to do 3+ years down the line has to be taken into consideration.

So we don't know if they have reserved any, but I personally choose to err on the safe side and believe they have reserved some until I'm told otherwise for sure.
 
Injecting itself at high priority at any time could be a bad thing if there were no reservation.
So that doesn't actually mean there isn't an OS reservation; it could mean that it's done on a percentage basis (e.g. 10%) rather than by sectioning off CUs (e.g. 2 CUs).

I can think of reasons why they would reserve GPU resources even if they're not doing Snap.
It's analogous to reserving RAM even though they may not need all that much right now;
what they may want to do 3+ years down the line has to be taken into consideration.

So we don't know if they have reserved any, but I personally choose to err on the safe side and believe they have reserved some until I'm told otherwise for sure.

I think we need to get more information about the 'Secondary Custom Chip', because I remember hearing someone say that it would be helping with the OS. Maybe it will also have a GPU in it.
 
According to VGLeaks, the GPU has a high-priority queue for VSHELL (supposedly the PS4 XMB equivalent). So the OS should be able to grab GPU cycles at any time.

If they reserve CU(s) for the OS, they should be running some on-screen or GPU compute tasks all the time. I can't think of anything off the top of my head at the moment.

I doubt that they will have CU reservation at the moment.

More probably, instead, they have a CPU reservation for the OS and other tasks (like the announced voice recognition functions).
The curious thing about the voice recognition reservation is that, in order to work, it will always be present whether you have the PS Eye or not. A curious choice.
From a technical point of view, it will be very interesting to know how the PS4 will manage the CPU with regard to audio, speech recognition, OS gestures and so on, because at the moment the PS4 seems to lack any dedicated hardware to offload the CPU. (Regarding audio, Cerny has only stated that the PS4 has compression/decompression functions, similar to those already present on the Xbox 360, nothing more, nothing less.)

In the future, for sure, some tasks will be moved to the CUs, but not so soon, because (as I have learned on this board) some tasks (audio tasks like speech recognition, among others) will be very difficult (but not impossible!) to handle via the CUs.

As it is, the PS4 seems to be a bit unbalanced: a great GPU with a (comparatively) weak CPU.
I only hope that they have some hidden dedicated hardware to offload CPU work; otherwise the CPU (loaded with many tasks) could end up being the bottleneck of the system.

And I have to add (though it is only my personal feeling, no info, no rumour) that we could have some surprise about the CPU clock speed... good, bad... who knows?
 