PlayStation 4 (codename Orbis) technical hardware investigation (news and rumours)

It's been discussed that such a processor may just be used for the compression/decompression of the shared videos, but that'd mean its use is only for a tiny percentage of the time and would be a terrible use of resource.
It's actually quite clear this function is handled by the VCE (video compression engine) and UVD (universal video decoder) blocks of the SoC. It would be kind of awkward to put it in some external chip with no direct connection to the framebuffer holding the data that needs to be encoded. And basically all of AMD's recent GPUs contain these blocks (save for some ultra low end models).
 
If it's not for the OS or compression/decompression, is it only used for DRM and to help prevent hacking of the system? If so, it seems strange that they'd use it as a selling point during the reveal. I agree it almost certainly isn't accessible to devs, but in some way it may free up the SoC to concentrate more on games.

It remains a bit of an unknown quantity. I'm fairly sure that Sony are keeping it a secret for good reason. They've actually been MORE quiet on it since the initial reveal.
 
The background processor was disclosed as being able to manage downloads and updates. There is imprecise language as to its involvement in network download and upload traffic during gaming.

Fundamentally, it seems to be capable of arbitrating traffic between mass storage, the network, and potentially the main SoC during gameplay. When the console is not active, this processor may be capable of at least initiating downloads and updates.

Without further public disclosure about the details of its involvement, I don't see a way to eliminate a number of very different possibilities for how this arrangement is implemented. Many of the possibilities do not require the game or application to know much, if anything, about the processor.
A bunch of those functions aren't glamorous, and there could be platform-specific security details, since this processor does appear to interact with some potentially sensitive system paths.

This may give Sony little reason to go into too much detail concerning a device that is rather unexciting and anonymous from a gamer's point of view, and a potential vector for accessing the OS or system functions for those not interested in buying games.
 
You're probably right.

I'd have thought a tiny proportion of a single Jaguar core could be devoted to downloads and updates, even security. I'm not saying the chip is powerful in any way, I'm certain it isn't. It just seems there's more to it, so that the system gains a little competitive advantage. Especially if it had its own pool of slower memory.

Do we not think the GDDR5 pool is excessively fast for OS functionality? If you're going to consider as much as 2-3GB for the system, that's a big expense for something that'd be covered by much cheaper RAM.
 
The ARM can run at very low power when the console is 'off'. I don't think the OS Processor has anything to do with competitive advantage; it's more about user experience, with consistent performance at various power levels. And it can handle security too, which is a reason not to talk about it, as 3dilettante says.
 
I'm thinking more along the lines of the reverse of what a current gaming PC would do. Normally a PC has a large amount of system RAM with comparatively little video RAM. What if the PS4 has a tiny system RAM and a large video RAM? Perfect for a gaming machine. The devs would never need to know.

I'm probably in a dreamworld, but it's fun to speculate.
 
I'd have thought a tiny proportion of a single Jaguar core could be devoted to downloads and updates, even security.
Sony's original bullet point was energy efficiency, particularly for a form of networked idle that allowed the background processor to download updates when the console was "off".

Depending on their energy ceiling, the power cost of keeping the SoC idling for the same purpose could have been too high. Power gating for the CPUs is at module granularity, with an uncertain amount of system RAM that has to stay at least minimally active. AMD chips with 100W or more TDPs can drop down to the single-digit watt range, but even that could be too high.

There's also a potential yield benefit, if the idle power range is a limiter. Leaky APUs that don't idle as low as the norm can still be considered good if the tiny background processor is all that matters at those power modes.
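As a rough, purely hypothetical illustration of that energy argument (the wattage figures below are my assumptions, not anything Sony has disclosed), even a well-idling SoC with its GDDR5 kept minimally active could cost an order of magnitude more energy over an overnight download than a tiny dedicated chip:

```python
# Hypothetical figures (assumptions, not Sony disclosures) to illustrate
# why a tiny background processor can beat an idling SoC for downloads.

soc_idle_watts = 8.0         # assumed: SoC + minimally active GDDR5 kept awake
background_chip_watts = 0.5  # assumed: small ARM core with its own low-power RAM
download_hours = 6.0         # an overnight firmware or game download

soc_energy_wh = soc_idle_watts * download_hours
bg_energy_wh = background_chip_watts * download_hours

print(f"SoC kept idling:      {soc_energy_wh:.1f} Wh")
print(f"Background processor: {bg_energy_wh:.1f} Wh")
print(f"Ratio:                {soc_energy_wh / bg_energy_wh:.0f}x")
```

The exact numbers don't matter much; the point is that the gap stays large for any plausible pair of values, which is why the idle power range could be the limiter.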


Do we not think the GDDR5 pool is excessively fast for OS functionality? If you're going to consider as much as 2-3GB for the system, that's a big expense for something that'd be covered by much cheaper RAM.
The downside to unifying anything is that you eventually wind up using the single resource for things that don't benefit from its full capabilities. On the plus side, since it's fast, the OS shouldn't waste too much time on it.

Having a separate RAM pool in addition to the GDDR5 just for the OS doesn't sound like it saves money, however. Synchronization updates and system structures like page tables having to hop between pools can lead to performance regressions.
 
Why would the secondary SoC need to run the full OS or even access the GDDR5? All it needs is a couple hundred MB of stacked RAM to run a minimal *NIX instance and write access to the hard drive.
 
If the secondary chip is only running downloads, updates and security, I doubt it'll need even 10MB of RAM. Maybe just a tiny cache.

Regardless, it'd be interesting to know more, even if just a confirmation that it's used for security measures.
 
Just for the record: the chip got a complete slide to itself during the reveal.

"Secondary Custom Chip
Background Processing"

I completely agree that it's most likely for the items described above by 3dilettante, but interesting nonetheless.
 
Why would the secondary SoC need to run the full OS or even access the GDDR5? All it needs is a couple hundred MB of stacked RAM to run a minimal *NIX instance and write access to the hard drive.

I think Cerny said the secondary chip can access the GDDR5 memory.

No more information after that. :-/
 
Secondary devices, like disk and IO controllers, use DMA to work with main memory in PCs.

Since the secondary processor sits in the middle of things like the network and storage, both of which in another system could write to main memory, it makes sense that the background processor can.
That could be the weakest level of interaction through memory between the background processor and the CPU cores. It doesn't rule out tighter synchronization and sharing of the memory space, but the looser the coupling, the less the low level details on either side can affect one another.
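Just to make that loosest level of coupling concrete, here's a purely illustrative sketch (nothing to do with Sony's actual implementation): the background processor DMAs data into main memory and then publishes completion descriptors into a ring that the CPU side polls. Neither side needs to know anything else about the other.

```python
# Illustrative model only: a one-way descriptor ring, the kind of loose
# memory-based coupling a DMA-capable background processor could use to
# tell the main CPU "this buffer in main memory now holds downloaded data".

from dataclasses import dataclass
from typing import Optional

@dataclass
class Descriptor:
    buffer_addr: int   # where the background processor DMA'd the data
    length: int        # how many bytes are valid
    status: int        # e.g. 0 = ok, nonzero = error

class DescriptorRing:
    def __init__(self, size: int):
        self.slots = [None] * size
        self.head = 0   # written by the producer (background processor)
        self.tail = 0   # advanced by the consumer (main CPU)

    def push(self, desc: Descriptor) -> bool:
        nxt = (self.head + 1) % len(self.slots)
        if nxt == self.tail:        # ring full, producer must wait
            return False
        self.slots[self.head] = desc
        self.head = nxt             # publishing the slot is the only "signal"
        return True

    def pop(self) -> Optional[Descriptor]:
        if self.tail == self.head:  # nothing new
            return None
        desc = self.slots[self.tail]
        self.tail = (self.tail + 1) % len(self.slots)
        return desc

# The game/OS side only ever sees completed buffers appear; it needs no
# knowledge of the chip that produced them.
ring = DescriptorRing(8)
ring.push(Descriptor(buffer_addr=0x2000_0000, length=4096, status=0))
print(ring.pop())
```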
 
No more of that 14+4 and unbalanced system talk...

http://www.guardian.co.uk/technology/gamesblog/2013/jul/15/ps4-develop-2013-playstation-sony

6. Designed to be a balanced system
In his talk on developing for PS4, Neil Brown promised a system with no bottlenecks. "The Jaguar CPU is a state-of-the-art 64bit X86 architecture which will make it easy to port PC code. It has eight cores, and each core has 32KiB of D and I cache, and each four-core group has 2MiB of L2 cache". More importantly, he says, it's a modern general purpose CPU, which means unlike the in-order processors of current consoles it uses out-of-order execution, executing instructions depending on when they are ready rather than in the order governed by the original programming. Meanwhile, the GPU has 1.84 TFLOPS of processing power and a greatly expanded shader pipeline compared to PS3 to remove bottlenecks. There's also much better utilisation of the low-level hardware, via the GNM low-level API, for coders who really want to control how data is accessed by the GPU. Finally, the 8GB 256 bit GDDR5 memory reduces the bottlenecks associated with generating lots of large textures, while Brown says the PS4 has enough render back end units to ensure consistent pixel fill rates. There was also something about vertex shaders and dense meshes but he lost me there.

7. This is the age of 'asynchronous compute'
"One of the big new features of this generation is Compute, which allows you to use the GPU as a general purpose processor," said Brown. And on PS4, asynchronous compute means that general tasks such as physics and AI calculations can be executed in parallel with graphics processing. Effectively then, within a single frame of the game's runtime, each of the GPU's 18 compute units can seamlessly switch between general computing and graphics tasks depending on what is most pressing.
 
What chips do Sony use in their mobile chipsets?

PSP uses a custom Sony chip based on a 32-bit MIPS architecture with a custom GPU, plus another custom Sony chip based on a 64-bit MIPS architecture as the "media engine"
PS Vita uses a 2GHz quad-core ARM Cortex-A9 with a PowerVR SGX543MP4+
Their phones, with a couple of exceptions, use Qualcomm Snapdragons
 

Yes, I vaguely remember they said the GDDR5 memory will remain powered in low power mode. So the secondary chip should be able to access the main memory during that time too.

article said:
There was also something about vertex shaders and dense meshes but he lost me there.

Haha, at least he's honest. They should reveal the info to people like DF instead.

Low-level access for tweaking performance and utilisation should be nice, but what exactly isn't well-rounded here? :runaway:
 
http://www.engadget.com/2013/07/16/sony-ps4-development-kit-fcc/

Sony proudly showed off its PlayStation 4 hardware for the first time at E3, and now we're getting a peek at what developers are working with this generation thanks to the FCC. A DUH-D1000AA Development Kit for PS4 prototype is listed in these documents, thanks to its support for Bluetooth and 802.11 b/g/n WiFi. As one would expect, the diagrams show it eschews the sleekness of the consumer model for extra cooling, a shape made for rack mounts plus extra indicator lights and ports (USB and Ethernet.) Also of note is a "max clock speed" listing of 2.75GHz, and although we don't know what the game system will normally run it's interesting to hear what that 8-core AMD Jaguar silicon may be capable of, all while maintaining a temperature between 5 and 35 degrees celsius. Hit the link below to check out the documents for yourself, after this and the system's controller all we're left waiting for is Mark Cerny's baby.

[Attached images: the PS4 dev kit photo from the FCC filing (ps4-dev-kit-fcc.jpg) and the dev kit spec listing (sony-ps4-dev-kit-specs.jpg)]
 
The 2.75GHz "maximum clock frequency in the system" is clearly the WCK (write clock) of the GDDR5 running at the announced 5.5 Gbps. GDDR5 needs two clock signals to work: the command clock at 1.375GHz and the write clock at 2.75GHz, with data transferred on both edges of WCK to reach 5.5Gbps. That 2.75GHz is likely the highest clock in the system, not the CPU clock.
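A quick back-of-the-envelope check of those relationships (the 256-bit bus width comes from the Develop talk quoted earlier; the rest follows from how GDDR5 clocking works): data toggles on both edges of WCK and the command clock runs at half of WCK, which lines up with the announced 5.5 Gbps per pin and the often-quoted 176 GB/s peak bandwidth:

```python
# GDDR5 clock relationships: data is transferred on both edges of the
# write clock (WCK), and the command clock (CK) runs at WCK / 2.
data_rate_gbps_per_pin = 5.5

wck_ghz = data_rate_gbps_per_pin / 2      # 2.75 GHz, the FCC "max clock"
ck_ghz = wck_ghz / 2                      # 1.375 GHz

bus_width_bits = 256                      # from the Develop talk quoted above
bandwidth_gb_s = data_rate_gbps_per_pin * bus_width_bits / 8

print(f"WCK: {wck_ghz} GHz, CK: {ck_ghz} GHz")
print(f"Peak bandwidth: {bandwidth_gb_s} GB/s")   # 176 GB/s
```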
 