Pondering The Top Technical Innovations of the PlayStation 4

Ah yes, some official "specs". There's still a lot we do not know officially.

So what speed does the CPU run at?
What are the cache sizes?
Etc... Etc... Etc...

I agree. We only have some of the specs. I hope that the rest of the specs will be leaked out or figured out.

Maybe we will find out if there is any other "special sauce" in this system other than the eight gigs of GDDR5 RAM.

We do know for a fact that the GPU has had modifications. I hope these are extensive and make the GPU act like a more powerful one.
 
I'm interested in the companion CPU. It seems to handle some of the background processes. Perhaps the PS4 OS runs on it, leaving the full Jaguar CPU for games. Given rumours of two cores being reserved on Durango, this could be a significant advantage.
 
From the specs sheet:

The Graphics Processing Unit (GPU) has been enhanced in a number of ways, principally to allow for easier use of the GPU for general purpose computing (GPGPU) such as physics simulation. The GPU contains a unified array of 18 compute units, which collectively generate 1.84 Teraflops of processing power that can freely be applied to graphics, simulation tasks, or some mixture of the two.
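
For what it's worth, here's a back-of-the-envelope sketch of where that 1.84 Teraflops figure could come from, assuming the standard GCN layout of 64 shader lanes per compute unit and an implied (not officially confirmed) clock of roughly 800 MHz:

```python
# Back-of-the-envelope check on the 1.84 TFLOPS figure.
# Only "18 compute units" and "1.84 Teraflops" come from the spec sheet;
# 64 lanes per CU is the standard GCN layout, and the ~800 MHz clock is
# implied by the maths rather than officially stated.
compute_units = 18
lanes_per_cu = 64              # standard GCN assumption
flops_per_lane_per_cycle = 2   # a fused multiply-add counts as 2 FLOPs
clock_ghz = 0.8                # implied clock, not confirmed

gflops = compute_units * lanes_per_cu * flops_per_lane_per_cycle * clock_ghz
print(f"{gflops / 1000:.2f} TFLOPS")  # -> 1.84
```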
 
I'm pleased to see less innovation, i.e. the innovation already happened in the open platform of PCs.

It's much better for the world: no need for special dev-kits or workstations to use this level of tech yourself; the tools are widely available.
 

Wouldn't you still need a devkit? It's likely that the GPU, though based on GCN, is not directly comparable with any similar PC part. Even if Sony provided OpenGL as an alternative to LibGCM for developers, you'd still want to profile your games on actual hardware and not some PC that's a fair bit different.
 

Of course. But what I mean is that anyone who wants to start coding or doing 3D art gets access to tools of comparable power.

I know engines need reworking for direct APIs, but people write cross-platform engines with D3D, GL and GCM back-ends. The logic and design of what's going on with the data and shaders is the same; there's just a bit of streamlining in the command-list builder, that's all.
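
A minimal sketch of the kind of back-end split being described; all class and method names here are hypothetical, purely for illustration, not any real engine's API:

```python
# A minimal sketch of a cross-platform renderer split into back-ends.
# The engine-side logic stays identical; only the command-list builder
# changes per platform. Everything here is hypothetical and illustrative.
from abc import ABC, abstractmethod

class RenderBackend(ABC):
    @abstractmethod
    def draw_indexed(self, vertex_buffer, index_buffer, shader) -> None:
        """Append one draw call to this platform's command list."""

class GLBackend(RenderBackend):
    def draw_indexed(self, vertex_buffer, index_buffer, shader) -> None:
        print("GL back-end: issue a glDrawElements-style call")

class GCMBackend(RenderBackend):
    def draw_indexed(self, vertex_buffer, index_buffer, shader) -> None:
        print("GCM back-end: write the draw into the command buffer")

def render_scene(backend: RenderBackend, scene) -> None:
    # The data flow and shader setup are the same whichever back-end runs.
    for mesh in scene:
        backend.draw_indexed(mesh["vb"], mesh["ib"], mesh["shader"])

render_scene(GLBackend(), [{"vb": 0, "ib": 1, "shader": "basic"}])
```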
 
8GB RAM
176 GB/s bandwidth
= 5.87 GB of RAM addressable per frame at 30 fps, 2.93 GB per frame at 60 fps (not that you'll ever get anywhere near 100% bandwidth utilisation)


4GB of GDDR5 would probably have done; the rest will have to act as some sort of cache for the HDD
 
You see, we do have some official specs.
Is the GPU GCN or GCN2? Are there 4 reserved CUs, or is the whole GPU customised for better GPGPU? What are the audio DSP's capabilities, and will it enable 3D sound? What's the ? Is there stereoscopic body tracking or not? We don't know the top technical innovations because we don't know the inner workings.

One of Vita's top achievements is stacked RAM, but that wasn't revealed in the Vita press release. We have some simplified specs. And there's no point going on a rant demanding insider knowledge like you did with RSX, because it's very unlikely to happen. By all means discuss what has been accomplished, but believing that a lack of public discussion of the internal workings means there's no innovation is naive.
 
what kind of math is this? :LOL:

You divide 176 GB/s by 60 frames/s and you get 2.93 GB/frame accessible by the system to produce each frame.

Do the same maths with 30 frames/s and you get 5.87 GB/frame accessible by the system to produce each frame.

Which points to the importance of bandwidth that many on these forums have stressed.
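
If you want to poke at the numbers yourself, here's a minimal sketch of that per-frame budget; the 176 GB/s peak is from the spec sheet, while the 70% utilisation factor is just an illustrative assumption:

```python
# Per-frame bandwidth budget: divide peak bandwidth by the frame rate.
# 176 GB/s is the quoted peak; the 70% utilisation factor below is purely
# an illustrative assumption, not an official figure.
peak_bandwidth_gb_s = 176.0

for fps in (30, 60):
    per_frame = peak_bandwidth_gb_s / fps
    realistic = per_frame * 0.70   # hypothetical achievable fraction
    print(f"{fps} fps: {per_frame:.2f} GB/frame peak, "
          f"~{realistic:.2f} GB/frame at 70% utilisation")
```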
 

Great, now tell me which Blu-ray or HDD feeds RAM at 5.87 GB/s, because, you know, this has to happen continuously.
You're also assuming that every frame needs all of its data to be new.

Interesting, please elaborate.
 


I think that has been established in this post.

The HDD and optical discs don't read or write at anything near RAM speeds, so I don't think they're in the picture.
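
To put some ballpark numbers on that gap (the storage read rates below are assumptions for illustration, not PS4 specs):

```python
# Rough comparison of storage streaming rates against one 30 fps frame's
# worth of RAM traffic. The read rates are ballpark assumptions for
# illustration only, not official PS4 figures.
per_frame_gb = 176.0 / 30        # ~5.87 GB of RAM traffic per 30 fps frame
blu_ray_mb_s = 25.0              # assumed optical drive read rate
hdd_mb_s = 100.0                 # assumed sustained HDD read rate

for name, rate_mb_s in (("Blu-ray", blu_ray_mb_s), ("HDD", hdd_mb_s)):
    seconds = per_frame_gb * 1024 / rate_mb_s
    print(f"{name}: ~{seconds:.0f} s to stream {per_frame_gb:.2f} GB, "
          f"versus the 1/30 s frame budget")
```

In other words, the data touched each frame has to already be resident in RAM and reused across frames; storage can only trickle in changes.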
 
I doubt he will be allowed to elaborate beyond what we already know. There's a reason they didn't tell us everything, and that is that they still don't want us to know everything.
 
It's not absurd maths. Have you read the post?
I have read it; have you?

Maybe going from a 200 MB working set to 6 GB is trivial for you, but how much cache do you need according to the 10x figure from the link? 60 GB of cache, really?
And even if you take the cache ratio down to 3-5x, please explain how you make a working set of 5.87 GB/frame work on the PS4. I'm very curious to hear it.
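
Just to spell out the arithmetic behind that objection, taking the multipliers above as given:

```python
# Cache sizes implied by the multipliers being argued about: cache as a
# multiple of a ~6 GB working set. The 10x and 3-5x ratios come from the
# post above; they are the terms of the argument, not measurements.
working_set_gb = 6.0

for multiplier in (10, 5, 3):
    print(f"{multiplier}x working set -> {working_set_gb * multiplier:.0f} GB of cache")
# 60, 30 and 18 GB respectively, all well beyond the 8 GB in the machine.
```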
 