PS4 Pro Speculation (PS4K NEO Kaio-Ken-Kutaragi-Kaz Neo-san)

Status
Not open for further replies.
Link is blocked for me (work). Anyone care to explain or show more?

My layman interpretation

They render like a chessboard: every black square on frame 1, every white square on frame 2.

Using the previous frame's data and the nearest neighbours' MSAA data, they reconstruct the image, then filter out the sawtooth artifacts, similar to other AA clean-up passes.

Less rendering, less memory space and bandwidth used, a good result, and mostly through trial and error; they're looking to use it more often with more scientific approaches.
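A minimal sketch of the checkerboard idea described above, assuming a toy greyscale image stored as a list of rows, and filling each missing pixel from the average of its rendered neighbours (a real implementation would also reproject data from the previous frame):

```python
# Toy checkerboard reconstruction: frame 1 renders the "black" squares,
# frame 2 the "white" squares; missing pixels are filled by averaging the
# horizontal/vertical neighbours, which always belong to the rendered set.
def reconstruct(frame, rendered_parity):
    """frame: 2D list where only pixels with (x + y) % 2 == rendered_parity
    hold valid samples; the rest are filled by neighbour averaging."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(h):
        for x in range(w):
            if (x + y) % 2 == rendered_parity:
                continue  # this pixel was actually rendered this frame
            neighbours = [frame[ny][nx]
                          for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                          if 0 <= ny < h and 0 <= nx < w]
            out[y][x] = sum(neighbours) / len(neighbours)
    return out

# A flat 4x4 field reconstructs exactly, since all neighbours agree.
flat = [[1.0] * 4 for _ in range(4)]
print(reconstruct(flat, 0))  # every pixel comes back as 1.0
```

Note that adjacent pixels always have opposite parity, so the fill step only ever reads pixels that were rendered in the current half-frame.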
 
Think of the beauty of each pixel if we were still SD!
HD clearly is a step backward! ;p
I'd love to see a demo targeting SD, just to see how close to photorealism we could get when not pushing pixels. SD still isn't good for gaming, though. Just not sharp enough!
 
Isn't this kinda what Quantum Break does?
 
What was the rendering technique Quantum Break uses? I know it's not rendering at the output res. Maybe it's combining two frames.

It uses the four previous 720p 4xMSAA frames to reconstruct a 1080p frame, but it breaks down whenever the camera is moved. Not sure if they've presented exactly how it works.
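As a back-of-envelope check on those figures (the 720p, 4xMSAA, and 1080p numbers are from the post above; the per-pixel ratio is my own arithmetic):

```python
# Sample budget for the reconstruction described above: each rendered frame
# is 720p with 4xMSAA, and four such frames feed a 1080p output.
samples_per_720p_4x = 1280 * 720 * 4      # MSAA samples rendered per frame
pixels_1080p = 1920 * 1080                # output pixels per frame

print(samples_per_720p_4x)                 # 3686400
print(samples_per_720p_4x / pixels_1080p)  # ~1.78 samples rendered per output pixel
```

So each new frame carries more samples than a 1080p image has pixels; the reconstruction's job is distributing samples from the four-frame history into the right output positions, which is why camera motion is the hard case.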
 
@3dilettante

So now that we're looking at a 33% shrink for Xbox One S, any new thoughts on shrinking PS4 similarly? The power savings there seems to be about 30% for gaming, but I wonder if it wouldn't be that much for PS4 considering the GDDR5 bus wouldn't scale like that, and maybe it just wouldn't be worth it?
 
Uhm, how do we know the breakdown of where the power savings come from? Perhaps a good portion of the efficiency gain comes from replacing the external brick with a more efficient internal power supply? If so, the savings from the 16FF process update could be less than that 30%?
 
We just have the power readings from the DF article.

And yes, I know it's hard to isolate, but it would be bizarre not to see the majority of the power savings come from 16nmFF. The overhead of the rest of the system makes the APU somewhat harder to observe in isolation, but much of the rest of the system hasn't changed as drastically (same number of DDR3 chips, low-power HDD, etc.)

edit:

Ok, I get where you're coming from (not just that place that rocks or something). Would be difficult to quantify the power supply efficiency, and there's no guarantee about anything there. :(
 

Size-wise, Polaris at 232mm2 shows how close a 36 CU GPU gets to having its perimeter taken up by IO. By way of comparison, I did a quick MS Paint check to see how much area would be removed if I took the Jaguar modules and the silicon between them out of the Orbis die shot. That's about 25% of the die, and it leaves behind ~250 mm2 as GPU, sundry controllers, and IO. Get a double-density GPU with twice the unit count and it seems like a decent fit for what Polaris is sized at.
If Sony opted for a GF/Samsung node for a shrink, it seems possible to shrink the existing Orbis die so far that it would be smaller than Polaris 10--perhaps too small for the bus.
That's on a GF process, however. There's an apparently modest density benefit (with maybe a power penalty) that might be how it gets a good density boost.
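As a quick check of that MS Paint estimate (the ~348 mm2 launch Orbis die size is a commonly cited figure, not from this thread; the 25% removal is from the post above):

```python
# Rough die-area arithmetic for the estimate above. The ~348 mm2 launch
# Orbis (PS4) APU size is a commonly cited figure, not from this thread.
orbis_die = 348.0                # mm2, launch PS4 APU (approximate)
removed = 0.25 * orbis_die       # Jaguar modules plus the silicon between them
remaining = orbis_die - removed  # GPU, sundry controllers, and IO

print(round(removed))    # mm2 taken out
print(round(remaining))  # mm2 left, close to the ~250 mm2 eyeballed above
print(232.0)             # Polaris 10 die size, for comparison
```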

I'm uncertain about the apparent area scaling for the Xbox One S. In theory it would be well placed to benefit from a shrink, with a more compact memory bus and a lot of ESRAM rather than complex logic. Shrinking to less than 50% of the original area wouldn't be out of the question, so on the face of it a 30% reduction suggests something didn't scale as well--though a comparison with Polaris isn't apples to apples due to the foundry difference.

Power-wise:
There were closer reference points for the PS4 since AMD disclosed comparisons between its GDDR5 subsystem and HBM.
Using the earlier speculation for Neo and Polaris: https://forum.beyond3d.com/posts/1926535/

There, I assumed 110-120W for the Polaris GPU alone at its desktop base clocks, and handwaved some numbers like 27W for the GDDR5 bus and then possibly 10-15W margin to get around the peak draw of a launch PS4 running Killzone SF. The ASIC-only power draw appears to have been borne out in the end.

I'm not on firm ground as to what AMD's DDR3 bus would draw. I don't know if it would be twice as efficient as the Orbis GDDR5 bus, but if it were, that would be ~14W, plus maybe 10-15W for everything else non-SOC.
That does assume a more Polaris-like Neo implementation, which has some conflicting rumors about it.
For the Xbox One, if there's a ~30W baseline of non-SOC power consumption (memory, VRM, and drive), the APU portion's power scaling would look better than the 30% the whole-system measurements indicate.
A purely shrunk PS4 seems like it might drop to close to the original Xbox One (maybe slightly lower?), if no further architectural changes occur. My other estimates were using something more Polaris-like.
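That baseline argument can be sketched numerically. The ~30W non-SOC figure is from the post above; the whole-system wattages below are hypothetical placeholders, not DF's measurements:

```python
# Why a fixed non-SOC baseline makes the APU's scaling look better than
# whole-system numbers suggest. The ~30 W baseline is from the post above;
# the system wattages are hypothetical, not DF's measurements.
baseline = 30.0                        # W: memory, VRM, drive (roughly constant)
system_old, system_new = 120.0, 84.0   # hypothetical whole-system draws, 30% apart

whole_system_saving = 1 - system_new / system_old
apu_saving = 1 - (system_new - baseline) / (system_old - baseline)

print(f"{whole_system_saving:.0%}")  # 30% at the wall
print(f"{apu_saving:.0%}")           # 40% for the APU portion alone
```

The fixed baseline dilutes the measured improvement, so the silicon itself scales better than the wall-socket figure implies.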
 
Did you people watch?

Interesting info IMO. I'll take 1080p at 30/60 fps with impressive visuals over anything else. Why not, if they can do it? Games are still far from looking anything like CGI; we're not at the point where graphics and effects are comparable to movies (in very few games, like The Order, maybe), but we'll eventually get there.
 
I would guess the DDR3 on XB1S stays at 1.5V.

For a PS4 Slim, they have the additional option of using 1.35V GDDR5 to save 25% on both ends. At 5.5Gbps, the 1.35V option is still at a reasonable binning.

One problem is that 1.35V parts have a few timing changes; tRC is raised to 48ns from 40ns.
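For a rough feel of what that tRC change costs (the 40/48 ns and 5.5Gbps figures are from the post above; the quarter-rate command clock is standard GDDR5 clocking):

```python
import math

# Cost of the tRC change mentioned above, in command-clock cycles.
# GDDR5's command clock (CK) runs at a quarter of the per-pin data rate,
# so 5.5 Gbps gives a 1.375 GHz CK.
ck_ghz = 5.5 / 4  # command clock in GHz

def cycles(t_ns):
    """Timing parameter in ns, rounded up to whole CK cycles."""
    return math.ceil(t_ns * ck_ghz)

print(cycles(40))  # cycles per row cycle at 1.5 V
print(cycles(48))  # cycles per row cycle at 1.35 V
```

So the relaxed tRC costs roughly a fifth more command clocks per row cycle, which only matters for access patterns that keep reopening rows in the same bank.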
 

I honestly think 4K is far too soon for gaming. To notice the difference you'd have to be sitting far too close to the TV, or have a stupidly large one... so very few people will truly benefit, whereas 1080p with a better framerate or more effects is a win for everyone.
 
In the meantime, the 8K train has started moving.
Before Scorpio or Neo become popular, 4K HDR will be considered surpassed and blurry.
 