Pondering The Top Technical Innovations of the PlayStation 4

I have read it, and you?

Maybe going from a 200 MB working set to 6 GB is trivial for you, but how much cache do you need according to the 10x figure from the link? 60 GB of cache? Now really?
And even if you take the cache multiplier down to 3-5x, please explain how you make a 5.87 GB/frame working set work on a PS4. I'm very curious to hear your answer.

10x is a very, very high estimate. The link did mention that 3x is good enough for most purposes.

If we do similar math for 60 fps v-synced, then we shave a bit off the 2.93 GB/frame to about 2.5 GB/frame maximum.
3 times that ends up at 7.5 GB, which is interestingly close to the PS4's RAM.
Sony might have hit a pretty good balance there.

The point being that every frame requires all assets to be re-read, and this is where the bandwidth-per-frame budget comes into play in building each and every frame.
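A quick sanity check of those figures, assuming the rumoured 176 GB/s peak GDDR5 bandwidth (the bandwidth number is the rumour at the time, not a confirmed spec):

```python
# Per-frame bandwidth budget, assuming the rumoured 176 GB/s peak
# GDDR5 bandwidth for PS4 (a rumour, not a confirmed spec).
peak_bw = 176.0  # GB/s

for fps in (30, 60):
    print(f"{fps} fps: {peak_bw / fps:.2f} GB/frame peak")

# 30 fps: 5.87 GB/frame  -> the figure quoted above
# 60 fps: 2.93 GB/frame  -> ~2.5 GB/frame after real-world overheads,
#                           and 2.5 GB x 3 (low-end cache ratio) = 7.5 GB
```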
 
I have read it, and you?

Maybe going from a 200 MB working set to 6 GB is trivial for you, but how much cache do you need according to the 10x figure from the link? 60 GB of cache? Now really?
And even if you take the cache multiplier down to 3-5x, please explain how you make a 5.87 GB/frame working set work on a PS4. I'm very curious to hear your answer.
I honestly don't understand your question. A top metric of 5.87 GB/frame is straightforward arithmetic. That doesn't mean it's the whole picture or anything of the sort. It's just valid maths with one set of numbers deriving one perspective.

If your point was that the conclusion is wrong (PS4 has too much RAM), that's a different argument to "absurd maths".
 
I wish someone would ask Mark Rein to compare the performance of the PS4 to a high end desktop with a GTX 680 while running the same demo.

I'm trying to watch the video now. If possible, I will transcribe it and post it here.

I'm hoping the GPU has had some major customizations.

Surely we can still feasibly do comparisons with a GFX card that by itself costs more than the whole PS4 is likely to cost?

I understand that the GTX 680 is at the top of the GFX cards at the moment and will be superseded in due course, but as long as the PS4 is comparable with a high-end setup when it's released and can hold its own for the next few years, we should be happy?
 
Shuhei reiterated in EG that they're allowing devs to go 'really deep onto the metal', so I'm not so worried about what competent devs will be able to do compared to, say, a 7950 on PC.
 
Of course, if you don't have enough bandwidth to address all of the RAM in a frame, one imaginative use for the spare RAM is storing video of your session that you could rewind, upload, etc.

We probably needn't worry if the system reservation for this video is large, because we couldn't use that RAM as anything but a cache anyway.

Fair enough, 8 GB of GDDR5 is perhaps overkill; 4 GB of GDDR5 and 4 GB of something slower might have been cheaper for consumers (hard to be certain on this point, however).
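As a rough illustration of how cheap such a video buffer would be, here's the arithmetic with my own assumed numbers (1080p H.264 at 15 Mbps and a 15-minute rolling buffer; neither is an announced spec):

```python
# RAM needed for a rolling gameplay-video buffer. The bitrate and
# buffer length below are my assumptions, not announced specs.
bitrate_mbps = 15   # plausible 1080p H.264 quality level
minutes      = 15   # length of the rewindable window

buffer_gb = bitrate_mbps / 8 * 60 * minutes / 1000
print(f"~{buffer_gb:.2f} GB for {minutes} min at {bitrate_mbps} Mbps")
# ~1.69 GB -- easily carved out of a multi-GB system reservation
```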
 
From the spec sheet:

To improve GPGPU, we have already talked about the caches. PS4 must have a 2 MB L2 cache in the GPU instead of 512 KB, or something like that. I wouldn't rule out an intermediate fully coherent cache (2 MB?) between CPU and GPU either, to make them talk the same language and speed up GPGPU by decreasing latencies to GDDR5.
One thing is for sure, if Mark Cerny has architected all of this it will be good.

I would like to have seen Barbarian's and ERP's faces when they were told: "8GB, it is".
 
Did you see those one million things fall? That used a fraction of GPU compute, they said.
Nvidia did that 3 years ago in the rocket sled demo. Much of the PC gaming world hated on them for using GPU physics. Now that Sony goes with it, it'll probably be the best thing since sliced bread.
 
So what speed does the CPU run at?
What's the cache sizes?
Etc... Etc... Etc...
Rumors have said 1.6GHz CPU clock. Cache is presumably around 256k/core at most, considering there's 8 of them and sharing die with the GPU - so no going overboard with SRAM... Maybe 256k shared between pairs of cores. *shrug* Jaguar is an el cheapo laptop CPU IIRC, so no need to throw pearls before swine; it'll be a fairly small L2 for sure.
 
A noob question: Are there any scenarios where low latency is better than higher BW? I don't know, physics? AI?

Hard to say, I think. The most obvious ones are situations with lots of cache misses, or many new, small jobs. Most of the time, they should be able to hide the latency.
 
Nvidia did that 3 years ago in the rocket sled demo. Much of the PC gaming world hated on them for using GPU physics. Now that Sony goes with it, it'll probably be the best thing since sliced bread.

Much of the gaming world hated Nvidia for entirely different reasons, among them holding physics back in games by attempting to make it an exclusive feature, which was made worse by consoles not being able to keep up at the time. Now there will be a much better baseline and more incentive to do cool things. I expect big improvements and much more exciting graphics and gameplay as a result. Some of the tasters that Uncharted 3 gave us in this area were very exciting too, I thought (fights in storms, on moving platforms and cruisers, etc.), as were the destructible environments in the Battlefield games on PC.
 
Nvidia did that 3 years ago in the rocket sled demo. Much of the PC gaming world hated on them for using GPU physics. Now that Sony goes with it, it'll probably be the best thing since sliced bread.

What did they complain about? On PC the architecture is very different, and the programming model is API-based. Most PC gamers don't have the latest and greatest GPU to run extra physics jobs, and they may want to reserve their GPUs for eye candy. Here, everyone has paid for the same hardware and would want all devs to use it to the max.
 
Rumors have said 1.6GHz CPU clock. Cache is presumably around 256k/core at most, considering there's 8 of them and sharing die with the GPU - so no going overboard with SRAM... Maybe 256k shared between pairs of cores. *shrug* Jaguar is an el cheapo laptop CPU IIRC, so no need to throw pearls before swine; it'll be a fairly small L2 for sure.

I thought it was a tablet CPU.
 
I wonder if ERP would have anything to add at this stage; are we still missing any secret sauce, etc.?
I think PS4 is going to have a dedicated chip for audio.

With regard to the PS4, I am curious about the new technologies such hardware can enable, and what innovative ways developers will find to create new techniques.

There is no need to reinvent the wheel, sure, but there is so much room left for improvement now that they know what the hardware is going to be.
 
A single Jaguar core could composite hundreds of sound sources and HRTF-filter them, and it probably still wouldn't exceed 10% of its processing time ... what's the use of a dedicated audio processor?
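A back-of-envelope check on that claim, using my own assumed filter parameters (a 128-tap time-domain FIR per ear at 48 kHz is roughly the worst case; none of these numbers are from the thread):

```python
# Rough cost of HRTF-filtering many sound sources on one Jaguar core.
# All parameters are assumptions: 48 kHz audio, a 128-tap time-domain
# FIR per ear, and 8 SP FLOPs/cycle peak per core at the rumoured 1.6 GHz.
sample_rate = 48_000    # samples per second
taps        = 128       # FIR length per ear
ears        = 2
ops_per_tap = 2         # one multiply + one add
sources     = 100

gflops_needed = sample_rate * taps * ears * ops_per_tap * sources / 1e9
core_peak     = 1.6 * 8  # GFLOPS peak for one core

print(f"~{gflops_needed:.2f} GFLOPS needed")                   # ~2.46
print(f"~{gflops_needed / core_peak:.0%} of one core's peak")  # ~19%
```

Even that worst-case time-domain estimate is only about a fifth of a core, and FFT-based convolution typically cuts it by an order of magnitude, so the 10% ballpark looks plausible.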
 
A single Jaguar core could composite hundreds of sound sources and HRTF-filter them, and it probably still wouldn't exceed 10% of its processing time ... what's the use of a dedicated audio processor?
There is a rumour saying the audio chip in Durango needs 100 GFLOPS of equivalent CPU power to emulate, which is a lot of processing power. So I think a good audio chip makes sense for the next generation (fellow forum members discussed this in a different thread, btw).
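For scale, assuming the rumoured 1.6 GHz clock and a peak of 8 single-precision FLOPs per cycle per Jaguar core (my assumptions, not confirmed specs), 100 GFLOPS is roughly the entire 8-core CPU:

```python
# How big is "100 GFLOPS equivalent"? Compare it with the whole CPU,
# assuming 8 Jaguar cores at the rumoured 1.6 GHz with a peak of
# 8 SP FLOPs/cycle per core (assumptions, not confirmed specs).
cores, clock_ghz, flops_per_cycle = 8, 1.6, 8

cpu_peak = cores * clock_ghz * flops_per_cycle  # in GFLOPS
print(f"8-core Jaguar peak: ~{cpu_peak:.1f} GFLOPS")  # ~102.4
# Emulating a ~100 GFLOPS audio block in software would consume
# essentially the whole CPU, which is why a dedicated DSP makes sense
# if the rumour is accurate.
```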

According to this diagram, PS4 is going to feature an audio DSP.

[diagram: PS4 spec sheet showing the audio DSP]
 