PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

Status
Not open for further replies.
Well, IMHO, if there are any problems with beta kit performance, it's the 1.6 GHz CPU. First parties like Guerrilla may be struggling to offload Cell workloads to a GPGPU config.
 
This is just another case of kids making up extraordinary shit for no reason other than to see the waves they just made spread over the internet. Case in point: xbone downclock rumor.

I award 1 :rolleyes: to this rumor, that's all it's worth.
 
Cerny announced the secondary processor and mentioned the audio processor briefly in the Feb announcement. Sony won't be able to remove them quietly.

Besides, they really need a hardware security mechanism. The secondary processor may come in handy in that regard.

Their executives mentioned that $399 has always been their target. It doesn't make sense to announce the price and then remove the parts.

I have no doubt security is in:

- DRAM scrambling using e.g. AES, with the key buried in on-chip EEPROM (no ROM or fuses; those are too easy to read out with "expert tools"). Keys might even be derived using PUFs.
- NOR flash scrambling/encryption (for the 2nd-stage bootloader; the 1st stage is in ROM, I would guess)
- Secure boot when loading the OS into DRAM
- Signed games/apps
- Software obfuscation techniques?
- Handshake via a secondary IC, a kind of watchdog. This IC probably also implements ARM TrustZone, so sensitive code runs in a protected mode.
- Hardware random number generator for proper crypto

etc..
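A minimal sketch of the address-dependent DRAM scrambling idea from the first bullet, assuming a per-device secret and a keyed keystream XORed over each line. All names and values here are hypothetical; real hardware would use a dedicated AES engine in the memory controller, not software, and the key would never be visible to code at all.

```python
import hmac, hashlib

# Hypothetical on-chip secret (in hardware: EEPROM, fuses, or a PUF).
DEVICE_KEY = bytes.fromhex("00112233445566778899aabbccddeeff")

def keystream(address: int, length: int) -> bytes:
    """Derive an address-dependent keystream, so identical plaintext
    scrambles differently at different DRAM addresses."""
    out = b""
    counter = 0
    while len(out) < length:
        msg = address.to_bytes(8, "big") + counter.to_bytes(4, "big")
        out += hmac.new(DEVICE_KEY, msg, hashlib.sha256).digest()
        counter += 1
    return out[:length]

def scramble(address: int, data: bytes) -> bytes:
    """XOR data with the keystream for this address.
    XOR is its own inverse, so the same call descrambles."""
    ks = keystream(address, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))

line = b"secret game code"
stored = scramble(0x1000, line)
assert scramble(0x1000, stored) == line   # round-trips at the right address
assert scramble(0x2000, stored) != line   # wrong address yields garbage
```

The point of tying the keystream to the address is that an attacker probing the DRAM bus sees only ciphertext, and cannot even spot identical data repeated at different locations.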

Guess they learned from PS3, where at some point people found the root keys; you could just read them out. What puzzles me is that these hacks started with buggy games, which let hackers use an exploit to inject code. They should just run games in a sandbox, or as a process in a less privileged mode.
 
Didn't the widespread PS3 hack originate from a stolen USB service dongle?
It's a bit like PSP hacking, which got going with a service battery (jig battery).
 
This is just another case of kids making up extraordinary shit for no reason other than to see the waves they just made spread over the internet. Case in point: xbone downclock rumor.

I award 1 :rolleyes: to this rumor, that's all it's worth.

Rumor is a euphemism.
 
Didn't the widespread PS3 hack originate from a stolen USB service dongle?
It's a bit like PSP hacking, which got going with a service battery (jig battery).
That, and Sony's failure to implement the security algorithm correctly. Had Sony not screwed that up, breaking the PS3's security would have been far harder. I dare say a modern console, taking everything learnt from the successes and failures of last gen, should be robust enough to withstand 10 years of hacking (at least at the consumer level, such that complex hardware attacks would be needed to circumvent it).
 
I believe Sony improved security greatly after the hack, though.
On PS4 they will most likely be better prepared than they were on PS3.
 
What puzzles me is that these hacks started with buggy games, which let hackers use an exploit to inject code. They should just run games in a sandbox, or as a process in a less privileged mode.

The TLB/page walking was so awful in the PPC chips they used that anyone who wanted top performance had to directly map all RAM, with a single-instruction TLB miss handler. Proper memory protection was simply too expensive last time.

Modern x86 chips have great TLBs and hardware page walkers. I'd expect that this will be fixed this time around.
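The trade-off described above can be put in a toy model (the level count and hit rates are illustrative, not measurements of any real chip): every TLB miss costs a full table walk, one extra memory access per page-table level, which is exactly what a flat direct mapping avoids.

```python
# Toy model: expected memory accesses per load under two translation schemes.
PAGE_WALK_LEVELS = 3  # hypothetical 3-level page table

def accesses_per_load(tlb_hit_rate: float, levels: int = PAGE_WALK_LEVELS) -> float:
    """One access for the data itself, plus a full page-table walk
    (one access per level) on every TLB miss."""
    return 1.0 + (1.0 - tlb_hit_rate) * levels

# A poor TLB (the PPC-era complaint) pays for walks constantly:
poor = accesses_per_load(tlb_hit_rate=0.90)   # 1 + 0.1 * 3 = 1.3
# A flat direct mapping (or a perfect TLB) never walks tables:
flat = accesses_per_load(tlb_hit_rate=1.00)   # exactly 1.0
```

With a hardware page walker and large TLBs, the miss penalty shrinks and the hit rate climbs, which is why full memory protection becomes affordable on modern x86.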
 
The consoles a revision or two down the road may also be taking advantage of even greater physical integration, with more of the system on-die, on-interposer, or in the same package.

If the security apparatus and even the RAM itself isn't already rolled in physically, it can be eventually.
Some of last gen's exploits, like physically glitching a bus that spans the motherboard or hijacking the pinouts of discrete chips, are going to be an order of magnitude (or two or three) harder, and would require resources an ad-hoc hacking group wouldn't have without a very wealthy and technically well-stocked backer.

That sort of hacking is going to be very hard to accomplish without compromising very fine microbumps or vias, and potentially wrecking their electrical behavior to the point that the captive system dies before spilling its secrets.
 
Why on Earth would Sony remove the DSP chips from the PS4? Remote Play for all games and gameplay recording wouldn't even work then... and what happens to the audio without the audio chip... lol.

That "rumour" is more FUD and smearing than anything else.
 
Didn't the widespread PS3 hack originate from a stolen USB service dongle?
It's a bit like PSP hacking, which got going with a service battery (jig battery).

Yes, that was very interesting. A USB service dongle is, in itself, a ridiculous concept to me: far too accessible a way of interacting with the PS3 in service mode. Why would you even need such a thing on consumer hardware? For refurbishing broken units?

They could have had signed diagnostic software inside the firmware or something, if they needed it at all.

Luckily for Sony, or "suspiciously" coincidentally, this didn't happen until around the time Sony broke even on their hardware costs.
 
...A lot of PS4 games seem to be struggling to hit 30 FPS, enough to make me wonder "what's wrong?". After all, this was supposed to be the gazelle hardware, not vice versa...


Speaking of this, can any developers shed light on the impact of GDDR5 latency vs something like DDR3? Or, to bring it back to last gen, GDDR3 or RAMBUS?

The bandwidth of GDDR5 is obvious, but for smaller bits when the system isn't moving large chunks of data like textures, how much can the latency hold the system up? Would these smaller bits not fit well in the 2MB L2 cache?
 
Speaking of this, can any developers shed light on the impact of GDDR5 latency vs something like DDR3? Or, to bring it back to last gen, GDDR3 or RAMBUS?

The bandwidth of GDDR5 is obvious, but for smaller bits when the system isn't moving large chunks of data like textures, how much can the latency hold the system up? Would these smaller bits not fit well in the 2MB L2 cache?

It won't be a problem; at least no more so than the SPEs' high-latency memory access this gen.
 
Phrased another way:

What kind of performance bump could be expected by cutting the GDDR5 latency in half?

Considering we are using a CU only for compute jobs and no graphics tasks:

If you only had a single compute job that needs to retrieve data from GDDR5, performance for that job would almost double. If you have many jobs, switching from one to another while waiting on memory helps hide the latency.

If we are talking about processing on the CPU, as has been said here, its being out-of-order and low-clocked helps hide latency, although whenever it has to fetch data from the GDDR5, cutting the latency in half would in fact have a big impact.
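The latency-hiding argument above can be sketched with a toy throughput model (all cycle counts are illustrative, not real GPU numbers): with enough jobs in flight, one job's memory wait overlaps the others' compute, so halving latency only matters when there is too little concurrency to hide it.

```python
def effective_time(compute_cycles: float, latency_cycles: float,
                   jobs_in_flight: int) -> float:
    """Per-job effective time when jobs overlap each other's memory waits.
    The other (jobs_in_flight - 1) jobs' compute covers part of the
    latency; whatever is left over is exposed as a stall."""
    hidden = compute_cycles * (jobs_in_flight - 1)
    exposed = max(0.0, latency_cycles - hidden)
    return compute_cycles + exposed

# One job, 100-cycle compute, 400-cycle memory latency:
single = effective_time(100, 400, 1)        # 500: latency fully exposed
# Five jobs: the other four's compute (400 cycles) covers the wait:
many = effective_time(100, 400, 5)          # 100: latency fully hidden
# Halving latency mostly helps the under-occupied case:
single_fast = effective_time(100, 200, 1)   # 300
```

This is the same reason GPUs keep many wavefronts resident per CU: occupancy is the cheap substitute for low-latency memory.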
 
Does it really work that way? You would presume that you stream data from GDDR5 to the CU for processing - the latency is only for getting that stream to start and reach the GDDR5, but once that stream is going, it will go full speed and latency is not relevant?

So it would only matter in the unlikely case where you would randomly access individual bits of memory from the GDDR5 pool, doing no optimisation and making no use of any cache lines, etc.?
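The streaming intuition above fits a back-of-the-envelope model: each transfer pays the access latency once, then streams at the peak rate, so long bursts amortize the latency away while small scattered reads are dominated by it. The 176 GB/s figure is the PS4's commonly cited GDDR5 peak; the 100 ns latency is a hypothetical round number.

```python
def effective_bandwidth(peak_gbps: float, latency_ns: float,
                        burst_bytes: int) -> float:
    """Achieved bandwidth (GB/s) when a transfer pays latency once, then
    streams burst_bytes at the peak rate. Note 1 GB/s == 1 byte/ns."""
    transfer_ns = burst_bytes / peak_gbps
    return burst_bytes / (latency_ns + transfer_ns)

PEAK = 176.0  # GB/s, PS4 GDDR5 peak

# A 1 MiB stream barely notices the latency (~173 GB/s achieved):
stream = effective_bandwidth(PEAK, latency_ns=100.0, burst_bytes=1 << 20)
# Scattered 64-byte (cache-line) reads are crushed (~0.6 GB/s):
scattered = effective_bandwidth(PEAK, latency_ns=100.0, burst_bytes=64)
```

So yes: for texture-sized streams latency is a one-time startup cost, and it only dominates for pointer-chasing access patterns that defeat the caches.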
 
Does it really work that way? You would presume that you stream data from GDDR5 to the CU for processing - the latency is only for getting that stream to start and reach the GDDR5, but once that stream is going, it will go full speed and latency is not relevant?

So it would only matter in the unlikely case where you would randomly access individual bits of memory from the GDDR5 pool, doing no optimisation and making no use of any cache lines, etc.?

I'm thinking the GPU portion of the performance equation isn't an issue. The issue I'm concerned with is CPU performance (which the GPU is reliant upon to feed it data).
 