PS4 Pro Official Specifications (Codename NEO)

Status
Not open for further replies.
It could also suggest that the Xbox One platform has a better isolation layer between software and hardware, which means games and applications are less likely to break when the underlying hardware changes.
 
It's not just "two PS4 GPUs, literally!", as the half of the Eurogamer article that points out additional features on the new GPU makes quite clear.

That an Xbox One game "just works and takes advantage of extra power available" on the Xbox One S probably has a lot more to do with the hardware differences being minimal than with the system's APIs.

It's the Pro's inability to run old games at a higher clock, rather than its inability to use new hardware features in old games (understandable), that's interesting IMO.

Shortly before launch MS did a last minute bump of both CPU and GPU clocks, and I'm not aware of that causing any issues. Then with the S they bumped the GPU clock again ... and everything just works.

With the Pro there is obviously a hardware configuration that allows the Pro to run PS4 games perfectly. Why does this configuration negate the clock speed enhancements that the hardware would appear to be completely capable of?
 
There's also an extra 1GB of DDR3 on the Pro, which seems to be used as swap/cache for less demanding applications like Netflix to be kept alive:

I'm really curious as to whether this will affect performance when it comes to switching between apps and games. Granted, my curiosity is entirely for selfish reasons, as on my ultra lazy days I make extensive use of switching between games and video apps (some video apps handling this more gracefully than others), all from the controller with minimal disruption or fuss. I really should be switching to a low-power device and turning the PS4 off when streaming media, but doing it all on the PS4 with the same controller, with the transition being quite fast, is really convenient. :yep2:

Secret Sauce Confirmed! :runaway:

If the secret is revealed, does the sauce stay Secret Sauce, or is it then considered regular sauce? Perhaps Special Sauce (with or without capitalization)? :-? These are the hard-hitting questions we need answered.
 
Wasn't it also the Cell processor whose SPEs could do 4 16bit float ops in parallel? There are limitations to its use though - I could imagine that for 4K rendering you often need the precision of 32bit? But it is good to have the choice to use half precision where possible, I suppose, and from the PS3 era I got the impression that 16bit is enough in many cases.

Certainly for HDR color operations I'd imagine using a 16bit float internally for every part of RGBA calculations could be very efficient, which is perhaps why it is also supported in Shaders?
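To make the precision trade-off above concrete, here is a minimal sketch (using NumPy's IEEE 754 binary16 type as a stand-in for GPU half-precision; the values and thresholds are illustrative, not from the article):

```python
import numpy as np

# fp16 has a 10-bit mantissa (~3 decimal digits of precision).
# Around typical LDR color values (0..1) the representable spacing is tiny,
# far below what an 8/10-bit display output can resolve:
step_near_half = np.spacing(np.float16(0.5))   # ~0.00049

# But at large HDR magnitudes the spacing grows: between 2048 and 4096,
# adjacent fp16 values are 2.0 apart, so small contributions vanish.
big = np.float16(2048.0)
lost = big + np.float16(1.0)   # 2049 is not representable, rounds back

# This is why per-pixel color math is often fine in half precision,
# while long accumulations (histograms, averages) are safer in fp32.
```

So 16bit works well for the RGBA arithmetic itself, and the danger zone is mostly repeated accumulation into very large values.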

I look forward to reading more comments from people unlike me that actually know what they are talking about. ;)
IIRC, Sebbbi mentioned that developers who did PS3 work would be comfortable in this type of environment.
 
I suppose it's possible that Sony wants to keep a more discernible difference between the PS4 and the Pro so they don't want the PS4 to get a bump.
 
Seems like the ID buffer is the game changer in the Pro.
I wonder how much additional processing it costs to add this triangle ID buffer and use it the same way without the hardware modifications (on the standard PS4, for example). That should give a good idea of how much this helps the PS4 Pro punch above its weight.

This buffer should be highly compressible, since the IDs would be grouped together?
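The intuition about compressibility can be sketched with a toy example (the scanline contents and run-length scheme here are hypothetical, just to show why clustered IDs compress well):

```python
# A triangle that is visible across a scanline covers a contiguous run of
# pixels, so an ID buffer scanline decomposes into long runs of equal IDs
# and run-length encoding shrinks it dramatically.
def rle(ids):
    runs = []
    for v in ids:
        if runs and runs[-1][0] == v:
            runs[-1][1] += 1
        else:
            runs.append([v, 1])
    return runs

# Hypothetical 1920-pixel scanline crossed by a handful of triangles:
scanline = [7] * 300 + [12] * 500 + [7] * 120 + [3] * 1000
runs = rle(scanline)
# 4 (id, length) pairs now stand in for 1920 raw values.
```

Real delta-color-compression-style hardware schemes are block-based rather than per-scanline, but the underlying redundancy is the same.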
 
From Insomniac Games: http://www.neogaf.com/forum/showpost.php?p=220958487&postcount=301


"Provides AA" - I take it that using the PS4 Pro's checkerboard rendering also means that you automatically get TAA as well (due to how the PS4 Pro works)?

It seems there is some sort of AA provided by the PS4 Pro when using the ID buffer (TAA?), but Insomniac Games don't use checkerboard rendering; they use temporal injection with a four-million-pixel jittered framebuffer, building on work done by Brian Karis and Timothy Lottes. For Honor uses a similar method. Similar overhead to checkerboard rendering, but seemingly better results.


Already we are seeing developers adopt their own take on 4K presentations. Both the upcoming Spider-Man and For Honor use four million jittered samples to produce what the developers believe to be a superior technique compared to checkerboarding, with a similar computational cost. But the unique ID buffer Sony provides could still prove instrumental to the success of these techniques.

"The ID buffer will integrate with any of these techniques, even native rendering," Cerny continues. "You'd use the ID buffer then to get temporal and spatial anti-aliasing. It layers with everything."

edit:
Insomniac Games is drawing upon the work of Epic's Brian Karis and Timothy Lottes to create a 4K image from a four-million pixel jittered framebuffer in its new Spider-Man title. It believes the quality could exceed the results seen in checkerboarding with a similar overhead.
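Neither article gives implementation details of the jitter itself, but a common choice for this family of techniques (e.g. in Karis's UE4 temporal AA work) is a low-discrepancy sequence of sub-pixel offsets; the following sketch assumes a Halton (2,3) sequence with an 8-frame cycle, which is an assumption on my part, not Insomniac's published method:

```python
# Per-frame sub-pixel jitter for temporal AA/upsampling: each frame the
# projection is offset by a fraction of a pixel so successive frames
# sample different sub-pixel positions, which the resolve then combines.
def halton(index, base):
    f, result = 1.0, 0.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def jitter(frame, period=8):
    i = (frame % period) + 1
    # Offsets in [-0.5, 0.5) pixels, typically added to the projection matrix.
    return halton(i, 2) - 0.5, halton(i, 3) - 0.5

offsets = [jitter(f) for f in range(8)]
# Each of the 8 frames lands on a distinct sub-pixel position.
```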
 
I hope Ratchet and Clank will have a PS4 Pro version. It was the biggest visual annoyance of the game, shimmering on thin objects...
 
I wonder how much additional processing it costs to add this triangle ID buffer and use it the same way without the hardware modifications

They lightly touch upon this topic in the DF interview
It's all hardware based, written at the same time as the Z buffer, with no pixel shader invocation required and it operates at the same resolution as the Z buffer. For the first time, objects and their coordinates in world-space can be tracked, even individual triangles can be identified. Modern GPUs don't have this access to the triangle count without a huge impact on performance.

I hope Ratchet and Clank will have a PS4 Pro version. It was the biggest visual annoyance of the game, shimmering on thin objects...

Yup, AA was a letdown.
 
Doesn't the Xbone use virtual machines for everything? That would pretty much explain the lack of concerns that Cerny mentioned.
Yea I was thinking along the lines of this as well.
I'm not sure if this is directly the reasoning, but it would appear that MS, as a general statement, does not allow as much low-level access to the hardware. Which in turn may give MS an opportunity to run the software on a variety of hardware without penalty (much like PC does).

I'm willing to bet some titles are running some gnarly optimizations on PS4.
 
It could also suggest that the Xbox One platform has a better isolation layer between software and hardware, which means games and applications are less likely to break when the underlying hardware changes.

It probably also suggests that there was enough abstraction in the Xbox 360 software to make it workable on Xbox One hardware. The CPU difference between a 3-core PowerPC and an 8-core AMD Jaguar seems greater than the difference between AMD CPUs.

Which means getting Xbox One games to run on a newer AMD CPU architecture would be fairly straightforward for them?

This just exposes the software "know-how" gap that exists between Microsoft and Sony ... Sony might not be able to ensure BC in the future even if they wanted to.

I could see Sony go completely mad with PS5 specs to make a clean break from PS4/PS4 Pro and attempt to diminish any reason for BC ;)
 
I could see Sony go completely mad with PS5 specs to make a clean break from PS4/PS4 Pro and attempt to diminish any reason for BC ;)

Re-Remasters!

Maybe they could give discounts for prior digital owners, or if you have the disc, you need to keep the disc in the console to be able to purchase digital re-remaster & to continue playing.
 
The ID BUFFER.

Would this completely eliminate the need for neighbourhood clipping, without TAA trailing artefacts?
Not that simple. IDs are not filterable. If you simply point sample an ID buffer (or depth buffer) to check whether an edge pixel belongs to the same object, you get alternating values, as the jitter makes IDs and depth values alternate between the foreground and background objects. Thus if you use this data to reject history, you end up losing your edge antialiasing as well (only the insides of triangles get antialiased). Not exactly what you want. You could use dilation tricks to improve the situation, but it's not straightforward.

Neighborhood clipping is simple and works well because an antialiased edge is a linear combination of (a blend between) the colors on both sides of the edge. If you clamp the color inside the RGB (/YCoCg) convex hull (/bounding box), properly antialiased colors stay intact, while occluded colors are clamped near the new value (removing most of the ghosting).

ID buffers however have many other uses. Hardware ID buffer output is a great addition.
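The clamping described above can be sketched in a few lines. This is a minimal toy version (using a min/max bounding box over a 3x3 neighborhood, with made-up colors), not anyone's shipping resolve shader:

```python
import numpy as np

# Neighborhood clamping for TAA: the history color is clamped to the
# per-channel min/max box of the current frame's 3x3 neighborhood.
# A genuine edge color (a blend of the two sides) lies inside the box
# and survives; a stale occluded color is snapped toward the new value.
def clamp_history(current, history, x, y):
    ys = slice(max(y - 1, 0), y + 2)
    xs = slice(max(x - 1, 0), x + 2)
    nb_min = current[ys, xs].min(axis=(0, 1))
    nb_max = current[ys, xs].max(axis=(0, 1))
    return np.clip(history[y, x], nb_min, nb_max)

# Edge pixel whose neighborhood spans a red side and a blue side:
cur = np.zeros((3, 3, 3), np.float32)
cur[:, 0] = [1, 0, 0]                 # red side of the edge
cur[:, 2] = [0, 0, 1]                 # blue side of the edge
cur[:, 1] = [0.5, 0, 0.5]             # blended edge column

blend = np.zeros((3, 3, 3), np.float32)
blend[1, 1] = [0.5, 0.0, 0.5]         # antialiased history: inside the box
kept = clamp_history(cur, blend, 1, 1)    # -> unchanged [0.5, 0, 0.5]

ghost = np.zeros((3, 3, 3), np.float32)
ghost[1, 1] = [0.0, 1.0, 0.0]         # stale green from a moved object
rejected = clamp_history(cur, ghost, 1, 1)  # -> clamped to [0, 0, 0]
```

Production variants do the clamp in YCoCg and use tighter variance-based boxes, but the mechanism is the same.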
 
Re-Remasters!

Maybe they could give discounts for prior digital owners, or if you have the disc, you need to keep the disc in the console to be able to purchase digital re-remaster & to continue playing.

It's interesting to see how publishers handle remasters vs allowing a 360 title on Xbox One. I mean, Skyrim is still like what, $20-$30 on the 360 digital store? Allowing it as BC on Xbox One would be instant profit with virtually no work on Bethesda's part. However, a remaster would probably sell more than the 360 version, but how much did development cost?
 
There is no intermediate resolution with checkerboard rendering. You are sampling at half your targeted framebuffer, but you never "upsample" anything to create the final image. In the case of titles that target 1800p instead of the full 2160p and use checkerboarding, that would scale directly down to 1080p on an HDTV set rather than up to 2160p on UHD sets.

I was considering the 1800p to be the intermediate resolution, since the statement being discussed was how games were choosing between 1080p and 2160p support. The quote was structured in such a way that it seemed to indicate games that did checkerboard rendering for the purposes of 4K were then downsampling to 1080, but that is not congruent with other descriptions of what that set of games is doing.

In the hypothetical case that a game rendered to 1800p natively with checkerboard rendering to 2160p, my question was whether downsampling from that to 1080p lost more than the 2x resolution and additional reprojected pixels provided.
The base resolution would be natively higher than the final 1080p output, but would the reprojected pixels be that damaging to the downsampled results?
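A quick back-of-envelope count for the hypothetical case above (checkerboard at 1800p, then downsampled to a 1080p display; the resolutions are from the discussion, the comparison itself is just arithmetic):

```python
# Shaded-sample counts for the scenario discussed above.
native_1080 = 1920 * 1080            # 2,073,600 shaded pixels
cb_1800 = (3200 * 1800) // 2         # 2,880,000 shaded (half the 1800p grid)
full_1800 = 3200 * 1800              # 5,760,000 after reprojection fills the rest

ratio = cb_1800 / native_1080        # ~1.39x genuinely shaded samples vs 1080p
```

So even counting only the genuinely shaded half, the checkerboarded 1800p frame carries roughly 1.39x the samples of native 1080p; the open question raised above is how much the reprojected half degrades that advantage in the downsample.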
 
The PS4 could use SSDs, or is that some kind of hint that TRIM is enabled?
The lack of that particular feature can lead to degradation in performance and consistency over time.
 
The PS4 could use SSDs, or is that some kind of hint that TRIM is enabled?
The lack of that particular feature can lead to degradation in performance and consistency over time.

http://kb.sandisk.com/app/answers/detail/a_id/8142/~/difference-between-sata-i,-sata-ii-and-sata-iii

"Example: SanDisk Extreme SSD, which supports the SATA 6Gb/s interface, when connected to a SATA 6Gb/s port can reach up to 550/520 MB/s sequential read and sequential write speeds respectively. However, when the drive is connected to a SATA 3Gb/s port, it can reach up to 285/275 MB/s sequential read and sequential write speeds respectively."
 