PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Status
Not open for further replies.
I doubt that they will have CU reservation at the moment.

More probably, instead, they have CPU reservation for the OS and other tasks (like the announced voice recognition functions).
The curious thing about the voice recognition reservation is that, in order to work, it will always be present whether you have the PSEye or not. A curious choice.
From a technical point of view, it will be very interesting to know how the PS4 will manage the CPU with regard to audio, speech recognition, OS gestures and so on, because at the moment the PS4 seems to lack any dedicated hardware to offload the CPU. (Regarding audio, Cerny has only said that the PS4 has compression/decompression functions similar to those already present on the Xbox 360, nothing more, nothing less.)

In the future, for sure, some tasks will be moved to the CUs, but not so soon, because (as I have learned on this board) some tasks (audio tasks, e.g. speech recognition, among others) will be very, very difficult (but not impossible!) to manage via the CUs.

As it is, the PS4 seems to be a bit unbalanced: a great GPU with a weak CPU.
I only hope that they have some hidden dedicated hardware to offload CPU work; otherwise the CPU, loaded with many tasks, could be the bottleneck of the system.

And I have to add (it is only my personal feeling, no info, no rumour) that we could have some surprises on CPU clock speed... Good, bad... Who knows?

Why does it seem that everyone has forgotten about the Secondary Custom Chip?

ps4_secondary_custom_chip_sony_new_cpu_proccessor.jpg
 

As Cerny said, it will manage background tasks like downloads and HDD access.
To me, it will also manage the low power / standby state.
Microsoft has handled this situation by customizing the CPU (I read somewhere, if I remember well, that they added gates that enable/disable CPU cores according to the state); Sony instead, and this is just my assumption, will use this secondary ARM chip to manage the low power / standby state.
 
More probably, instead, they have CPU reservation for the OS and other tasks (like the announced voice recognition functions).
The curious thing about the voice recognition reservation is that, in order to work, it will always be present whether you have the PSEye or not. A curious choice.

Why do you need to reserve CPU resources for voice recognition? Sony simply said the PS4 will support voice navigation. It may not be an "always-on" speech recognition system like Google Now or Kinect 2's.
 
More probably, instead, they have CPU reservation for the OS and other tasks (like the announced voice recognition functions).
The curious thing about the voice recognition reservation is that, in order to work, it will always be present whether you have the PSEye or not. A curious choice.

I don't think "voice recognition requires a microphone" would be seen as an extreme requirement.

As to whether voice recognition works "all the time", that's unclear. Whilst gaming there are 2 Jaguar cores doing (basically) nothing; on the other hand, voice recognition whilst gaming is of debatable usefulness on the PS4.

(With 'buttons-free' gaming [aka Kinect], you clearly need a separate navigation method, but if you have a "home" button in your hands, then being able to say 'home' isn't a great leap forward.)

As it is, the PS4 seems to be a bit unbalanced: a great GPU with a weak (in comparison) CPU.

Gaming machines of all shapes/sizes are usually a weak CPU with a great GPU. In some ways that's disappointing, but I don't see that changing any time soon.
 
Injecting itself at high priority at any time could be a bad thing if there were no reservation.
So that doesn't actually mean there isn't an OS reservation; it could mean that it's done on a percentage basis (e.g. 10%) rather than by sectioning off CUs (e.g. 2 CUs).
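For a sense of scale between the two schemes, here is a back-of-the-envelope comparison. The 18-CU total is the rumoured PS4 figure; the 10% and 2-CU reservation numbers are just the illustrative values from the post above, not confirmed specs:

```python
# Comparing the two GPU reservation schemes (illustrative numbers only):
total_cus = 18           # rumoured PS4 CU count
time_slice = 0.10        # e.g. OS reserves 10% of GPU time across all CUs
dedicated_cus = 2        # vs. fencing off 2 whole CUs

print(f"10% time-slice  ~= {time_slice * total_cus:.1f} CU-equivalents")
print(f"2 dedicated CUs ~= {dedicated_cus / total_cus:.0%} of the GPU")
```

So the two approaches cost about the same amount of raw throughput; the difference is whether games lose whole units or just a slice of time on every unit.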

I can think of reasons why they would reserve GPU resources even if they're not doing Snap.
It's analogous to reserving RAM even though they may not need all that much right now.
What they may want to do 3+ years down the line has to be taken into consideration.

So we don't know if they have reserved any, but I personally choose to err on the safe side and believe they have reserved some until I'm told otherwise for sure.

I have no idea. It depends on how they design their run-time. e.g., If they reserve enough RAM to keep transient states, they may not need to reserve any CU.
 
The curious thing about the voice recognition reservation is that, in order to work, it will always be present whether you have the PSEye or not. A curious choice.

I don't think "voice recognition requires a microphone" would be seen as an extreme requirement.

Sony is packing a mono headset with a mic in every PS4 box.
PlayStation-4-image-4.jpg


But they may force voice recognition to be active only with the PSEye. That move would drive better adoption of the camera.
 
LOL, how can you take the words of someone from Microsoft as confirmation of the PS4 specs when he had to go and ask questions about the Xbox One specs before he posted on NeoGAF?
I don't know how technical he is, but I think of someone asking for confirmation as being a good thing rather than a sign of ignorance.
 
Power struggle: the real differences between PS4 and Xbox One performance

Mod: Link is off topic. It isn't providing technical insight into what PS4 hardware is - we already know what it is! PS4 vs XB1 development has its own thread where this has been posted.


/ Ken
 
Gaming machines of all shapes/sizes are usually a weak CPU with a great GPU. In some ways that's disappointing, but I don't see that changing any time soon.

Well I am not so sure about this.

Cell was an incredibly powerful CPU, and I believe the Xbox 360, for its time, also had a good CPU.

People in my circle, much more tech-wise than me (not game-industry employed, anyway), keep telling me that the CPU will be the "Achilles' heel" of the next generation.

For a start, Jaguar is not a powerful CPU per se, nor by 2013 standards in general.

Online multiplayer, persistent open worlds, procedural content, etc. could be some of the main genres of the next generation, and they are quite CPU intensive.

Most probably, AI and physics engines will also be more demanding (let's hope so, anyway).

On top of this we could add that the PS4 CPU will most probably be loaded with other tasks:
sound tasks, mainly, and maybe some PSEye-related stuff, as the PSEye is most probably simply a camera with a microphone.
Apart from speech recognition (a feature I am willing to try), the sound area is very particular, and I hope for a next-gen boost in this department too.

Then we have to consider the memory factor. PS4 bandwidth to the CPU seems to be below 20 GB/s. And we must not forget that GDDR5 latency is not optimized for CPU tasks.
Speaking of which, has GDDR5 memory ever been used for a CPU before?

I don't know; with these considerations in hand, could the CPU be the real bottleneck of the PS4?
 
People in my circle, much more tech-wise than me (not game-industry employed, anyway), keep telling me that the CPU will be the "Achilles' heel" of the next generation.

That must be why the console makers went with such beefy Intel CPUs. ;)


Online multiplayer, persistent open worlds, procedural content, etc. could be some of the main genres of the next generation, and they are quite CPU intensive.

Your list does not even remotely back your argument.

Most probably, AI and physics engines will also be more demanding (let's hope so, anyway).

They say this every gen, but have no fear: the 8-core Jaguar can certainly beat the snot out of the Cell PPU, and the CUs can do some SPU-type heavy lifting.

Then we have to consider the memory factor. PS4 bandwidth to the CPU seems to be below 20 GB/s.

Yes, by design. The CPU does not need much bandwidth; go check the bandwidth of high-end i7 systems.
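To put that "does not need much bandwidth" claim in context, the peak bandwidth of a typical 2013 desktop setup can be worked out from the DDR3 spec. The ~20 GB/s PS4 CPU path is the rumoured figure from the post above, not a confirmed spec:

```python
# Ballpark: peak bandwidth of a dual-channel DDR3-1600 desktop (common
# for 2013-era i5/i7 systems) vs the rumoured ~20 GB/s PS4 CPU path.
channels = 2
bytes_per_transfer = 64 // 8       # each channel is 64 bits wide
transfers_per_sec = 1600e6         # DDR3-1600 = 1600 MT/s

ddr3_peak = channels * bytes_per_transfer * transfers_per_sec  # bytes/s
print(f"Dual-channel DDR3-1600 peak: {ddr3_peak / 1e9:.1f} GB/s")

ps4_cpu_path = 20e9                # rumoured PS4 CPU memory path
print(f"PS4 CPU path vs desktop peak: {ps4_cpu_path / ddr3_peak:.0%}")
```

In other words, ~20 GB/s is in the same ballpark as what contemporary desktop CPUs had available, which supports the "by design" argument.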

And we must not forget that GDDR5 latency is not optimized for CPU tasks.
Speaking of which, has GDDR5 memory ever been used for a CPU before?

Do you have evidence, or are you just repeating something you have read? Memory controllers for GPUs are not designed for latency; again, this is by design. APUs have so far always been used in low-end systems, and GDDR is too costly for such a use. Don't read a cost decision as a technical failure, though.

I don't know; with these considerations in hand, could the CPU be the real bottleneck of the PS4?

This seems to be the new console war talking point, but when has a CPU ever been a bottleneck for games? I can play BF3 on my i5 PC, but my PS3 also plays it the same, the difference being in rendering, not CPU chores. I would expect that gulf in CPU power to be cut by an order of magnitude for the PS4, at least until PC CPUs get more powerful in the future.

Both Sony and MS sat down with their engineers and designed systems to get rid of old bottlenecks and avoid new ones. They both decided on low-power x86 CPUs. I would expect they both made the right decision and did not bottleneck themselves. Sony is betting on some offloading to GPGPU and MS on DSPs. 3rd-party development will boil down to the lowest common denominator, and neither will be used much outside of middleware.
 
Online multiplayer, persistent open worlds, procedural content, etc. could be some of the main genres of the next generation, and they are quite CPU intensive.

Most probably, AI and physics engines will also be more demanding (let's hope so, anyway).
There's a lot of CPU power in ~100 GFLOPS of efficient processing power (using that as an overall performance metric that people understand, and taking the rest of the CPU to be balanced to similar 'performance' across integer and memory ops; the PS3 had poor processor utilisation in some workloads). Unlike Cell, it won't be needed for graphics work, so there's possibly an abundance of processing power, certainly compared to the norms of console design. And there's compute on both consoles to do some tasks.
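For reference, the ~100 GFLOPS figure usually quoted for the PS4 CPU falls out of simple peak arithmetic. This assumes the rumoured 8 cores at 1.6 GHz and that each Jaguar core's 128-bit SIMD unit can retire 8 single-precision FLOPs per cycle (4 multiplies + 4 adds), which is the commonly cited throughput for the core, not an officially confirmed PS4 number:

```python
# Peak single-precision throughput of the rumoured PS4 CPU configuration.
cores = 8
clock_hz = 1.6e9        # rumoured 1.6 GHz Jaguar clock
flops_per_cycle = 8     # 128-bit SIMD: 4 SP muls + 4 SP adds per cycle

peak_gflops = cores * clock_hz * flops_per_cycle / 1e9
print(f"Peak CPU throughput: {peak_gflops:.1f} GFLOPS")
```

That 102.4 GFLOPS is a theoretical peak; real workloads achieve some fraction of it, which is exactly the "efficient utilisation" point being made above.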

I don't know; with these considerations in hand, could the CPU be the real bottleneck of the PS4?
The CPU will no doubt be where most devs hit the ceiling first, but I wouldn't call it an Achilles' heel, which suggests a platform-crippling weakness. And given the options, it seems a capable and sensible solution. Hence why both MS and Sony went that route, I guess.
 
Well, also PS4 is shipping final hardware now.

And we've received zero indication or even rumors of increased clocks.

Well, you can't rule out a possible software update, though. If the hardware shipping today is qualified for, say, 1.8 GHz/900 MHz, then should the need arise, Sony could add boost modes quite easily (to be activated by developers only, of course) without increasing power consumption. For example, 1.8 GHz/850 MHz with only 6 CPU cores active sounds reasonable to me.
 
Well, you can't rule out a possible software update, though. If the hardware shipping today is qualified for, say, 1.8 GHz/900 MHz, then should the need arise, Sony could add boost modes quite easily (to be activated by developers only, of course) without increasing power consumption. For example, 1.8 GHz/850 MHz with only 6 CPU cores active sounds reasonable to me.

Seems to me they could only do this if they are screening out, in software, the chips that can't make those clocks. In other words, only if this is already planned. Otherwise some units might ship that can't reach 1.8 GHz or 850 MHz once they're upclocked.


Anyway, they've said nothing, so I assume it's 1.6 GHz/800 MHz until proven otherwise.
 
Seems to me they could only do this if they are screening out, in software, the chips that can't make those clocks. In other words, only if this is already planned. Otherwise some units might ship that can't reach 1.8 GHz or 850 MHz once they're upclocked.
Indeed, it'd be a little odd to include hardware with room to improve but a reluctance to use it. In a handheld like the PSP, the cap made some sense to conserve battery life. For a home console, the only issues are heat and noise. Given that a CPU upclock to 1.8 GHz should only add a few watts of heat, if the processors are capable of that then I'd expect Sony to go with it. It all comes down to what the product out of the factory can achieve in numbers. Too many failures at 1.8 GHz and Sony will want to pick the lower speed.
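The "few watts" claim can be sanity-checked with the usual first-order model: dynamic power scales roughly with C·V²·f, so at a fixed voltage an upclock adds power linearly with frequency. The 20 W CPU share of the APU power budget below is an assumption for illustration, not a known figure:

```python
# First-order estimate of the extra heat from a 1.6 -> 1.8 GHz CPU upclock.
# Dynamic power ~ C * V^2 * f; we assume voltage stays the same (the chips
# would already be qualified at it), so power scales linearly with f.
cpu_power_w = 20.0        # ASSUMED CPU share of the APU's power budget
f_old, f_new = 1.6, 1.8   # GHz

extra_w = cpu_power_w * (f_new / f_old - 1)
print(f"Extra heat from the upclock: ~{extra_w:.1f} W")
```

A 12.5% frequency bump on a ~20 W CPU block works out to roughly 2-3 W, consistent with "only a few watts". If the upclock required a voltage bump, the V² term would make the cost noticeably larger.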
 
The process is mature at this point. We have to assume that the CPU core macro AMD supplied is competently designed and intended to run at up to 2 GHz, I hear (possibly more). 1.6 GHz, and certainly 1.8 GHz, is a very conservative clock by today's standards; I doubt very many chips, if any, would actually fail because of a 200 MHz bump that is still well below the alleged ceiling of the design...

Still, none of that means it's going to happen. I wouldn't hold my breath, TBH; if Sony already has the lead performance-wise, why would they have to do anything...? :p (Certainly not what a geek like me wants to think, but we've gotta be realistic here.)
 