PS4 Pro Official Specifications (Codename NEO)


Clukos

Specs

[Image: official PS4 Pro spec sheet]


The leaks were the leaks :)

Edit: http://www.neogaf.com/forum/showthread.php?t=1275157

Also, a USB port on the back for those who asked for it.
 
Also, from the other thread, it officially does NOT support UHD 4K Blu-ray playback.
 
I think it's safe to assume the 4.2 TF leaks now; if it were more, I'm sure Sony would be shouting figures from the rooftops!
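For what it's worth, a quick sanity check on where the 4.2 TF figure comes from, assuming the leaked configuration of 36 GCN CUs at 911 MHz (those numbers are from the leaks, not the official sheet):

```python
# Back-of-envelope FP32 throughput for the leaked Neo GPU configuration.
# All inputs are assumptions from the leaks, not from the official spec sheet.
cus = 36                 # leaked compute unit count
lanes_per_cu = 64        # shader ALUs per GCN CU
flops_per_lane = 2       # one fused multiply-add per clock = 2 FLOPs
clock_hz = 911e6         # leaked GPU clock

tflops = cus * lanes_per_cu * flops_per_lane * clock_hz / 1e12
print(f"{tflops:.2f} TFLOPS")  # ~4.20, which matches the 4.2 TF figure
```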
 
Kinda surprised. Was it that hard to implement?

I think it's more a matter of needing an updated optical drive mechanism. I think @MrFox knows the details and posted them in the other thread. Perhaps he can re-share his knowledge here?
 
Where the extra 60W over the launch PS4 comes from is an item of curiosity for me.
A little over is something I could understand, but what would take it that much over?

My earlier assumptions were premised on something comparable to the 14nm Polaris 10 GPU, which does have certain assumptions built in. With the Pro described as an APU that has elements taken from Polaris, there are potential assumption-breakers ranging from process, the GPU architecture, APUs not always achieving optimal implementations for both the CPU and GPU regions, possibly restricted turbo/downclocking for consistency reasons, and the yield tolerances when one can be binned more so than the other.

That is a lot of assumptions, granted, but it seemed plausible to fit that, plus the other power consumers besides the GPU+GDDR5, without another 60W. Some items in the system were bumped, like the number of potentially powered USB ports, but that contribution seems too modest.

I'm still not sure how much Neo was able to derive from Orbis. Some discussion of it being Polaris "and beyond" actually might be the persistence of the Tensilica block that Polaris dumped. Possibly, items like compression and front end changes could move into a revision of Orbis without overly complicating matters.
I'm not sure about the GCN CUs themselves, or the ISA since those do not appear to mesh well with the PS4 IP level.
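Just to put the question in concrete terms, here is a purely illustrative budget sketch; every number below is a guess for the sake of argument, not something from the spec sheet or a measurement:

```python
# Hypothetical breakdown of where an extra ~60 W could go, relative to the launch PS4.
# Every figure is an assumption for illustration only.
assumed_deltas_w = {
    "GPU (more CUs, higher clock)": 35,
    "GDDR5 (higher clock)": 5,
    "CPU (Jaguar clock bump)": 10,
    "extra powered USB port": 5,   # ~5 V x 0.9 A per USB 3.x port, a few watts at most
    "misc (I/O, fan, conversion losses)": 5,
}
total = sum(assumed_deltas_w.values())
print(f"assumed total increase: {total} W")
# Only sums to ~60 W if the GPU takes the lion's share, which is the sticking point above.
```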
 
No. IMHO it is Polaris 10. Remember the 911 MHz in the RX 480 being the first selectable power profile in the clock tool. The culprit for the TDP being so high must be Jaguar. They must have clocked it way higher than 2.1 GHz, taking it too far from its efficient zone for fear of Scorpio using Zen. In fact, the Jaguar clock is the only item in the spec sheet given generically, without any figures. Besides, Sony is selling the clock bump (presumably of the CPU) as improving frame rates; 2.1 GHz was too low to get us to 60 fps in too many 30 fps games. Something very hot is inside for it to need a box way bigger than the OG PS4 (and so a bigger cooling solution).
That or GF 14nm is even worse for making APUs than standalone GPUs.
 

It wouldn't be purely Polaris 10, unless part of the reason Polaris 10 is less efficient than it should be is that it also has binary backwards-compatibility with an older GCN ISA.
Also, I noticed in the graphics forum that Polaris broke compatibility with TrueAudio (not that many would care), which has a roughly equivalent block in the PS4 that cannot be discarded.

I'm not sure anything is to be gained in pushing >60W through a pair of Cat core clusters. A few hundred MHz more before running into instability isn't going to change how far beyond them Zen would be.
 
How much was Jaguar at 28 nm @ 1.6 GHz? 15 watts?
Now suppose they have gone to 14 nm @ 2.8 GHz in a chip designed for 2.1 GHz max at 28 nm (near 25 watts then?). Wouldn't it be possible that the CPU now has a TDP of near 50 watts?
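As a rough first-order check on that, taking the classic dynamic-power scaling P ∝ f·V² and treating the 15 W figure and the voltage bump as assumptions (neither is a published value):

```python
# First-order CPU dynamic power scaling: P ~ f * V^2.
# All inputs are assumptions from the post above, not published figures.
p_base_w = 15.0     # guessed Jaguar cluster power at 28 nm, 1.6 GHz
f_base_hz = 1.6e9
f_new_hz = 2.8e9
v_ratio = 1.15      # assumed voltage increase to reach 2.8 GHz on a 2.1 GHz-max design

p_new_w = p_base_w * (f_new_hz / f_base_hz) * v_ratio ** 2
print(f"~{p_new_w:.0f} W")  # ~35 W; a larger voltage bump pushes it toward 50 W
```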
 
Where the extra 60W over the launch PS4 comes from is an item of curiosity for me.
A little over is something I could understand, but what would take it that much over?
PS4's nominal power is 250 W, so we may see a ~30 W increase for the PS4 Pro.

I guess they taped out at Samsung 14 nm, not TSMC 16 nm, and they may have increased the CPU clock further.
 
The extra power may be whatever is driving PSVR or the USB 3.1 ports. The original PS4 would never have been designed to power a display, if that's what they are doing now.
 
Is the ROP count doubled?
As Neo has been described, Neo mode versus standard mode activates additional shader engines, so things could become strange if it isn't.

How much was Jaguar at 28 nm @ 1.6 GHz? 15 watts?
Now suppose they have gone to 14 nm @ 2.8 GHz in a chip designed for 2.1 GHz max at 28 nm (near 25 watts then?). Wouldn't it be possible that the CPU now has a TDP of near 50 watts?
Puma at 28 nm (GF?) got to that range in the same power envelope. Jaguar's characterization and DVFS were essentially broken in the consoles and in AMD's own products. AMD's pattern is to get that wrong with every new architecture (Jaguar/Puma, Trinity/Richland, Kaveri/Godavari, Carrizo/Bristol), although in this case it would be getting things right and then re-breaking them.
 
Could be the power supply being chosen based on its most efficient point. Maybe a bigger margin, so it's used closer to 50% load, which I think the fat PS3 was?
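For illustration, assuming the Pro carries a roughly 310 W rated supply and draws somewhere around 150 W in games (both numbers are assumptions here, not from the spec sheet), that would indeed put typical load near the 50% sweet spot:

```python
# Hypothetical numbers to illustrate the "sized for the efficiency sweet spot" idea.
psu_rating_w = 310.0    # assumed label rating for the Pro
typical_draw_w = 150.0  # assumed in-game draw, not a measurement

load_fraction = typical_draw_w / psu_rating_w
print(f"typical load: {load_fraction:.0%} of rating")  # ~48%, near the usual efficiency peak
```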
 
So, to expand on that topic a bit: for the HDR firmware update to the existing PS4, I was under the assumption that HDMI 2.0 was the requirement, and I'm pretty sure the PS4 does not have one of those. So are they basically sending a 1080p HDR signal to get around bandwidth constraints? Or can we discuss the technical side of that?


 
I find it really funny that MS, who doesn't really have much to gain from having a UHD drive in their system, has one, but Sony, who is a patent holder on UHD Blu-ray, doesn't.

Anyway, the specs seem to be the same as the leaks, so MS shouldn't have much trouble outpacing this release, and we might see a good old-fashioned price war next year, which is good for all of us.
 
So, to expand on that topic a bit: for the HDR firmware update to the existing PS4, I was under the assumption that HDMI 2.0 was the requirement, and I'm pretty sure the PS4 does not have one of those. So are they basically sending a 1080p HDR signal to get around bandwidth constraints? Or can we discuss the technical side of that?
10 bits at 1080p is perfectly possible over HDMI 1.4. The problem is having the correct signalling for HDR (brought in by the 2.0 specs), which I thought was the HDMI driver chip's responsibility, and that would be a chip predating those specs. So they might have something flexible enough to add whatever metadata is needed in the frame header for a correct HDR signal that the TV would recognize.

Everything else is simply software: games already have frame buffers with enough precision, so it's not a big leap to convert them to 30-bit output and add luminance metadata.
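To put numbers on the bandwidth side of that, a rough check of what 10-bit 1080p60 needs versus the HDMI 1.4 TMDS ceiling, assuming the standard CEA 1080p60 timing:

```python
# Rough link-bandwidth check for 10-bit 1080p60 over HDMI 1.4.
pixel_clock_hz = 148.5e6       # CEA 1080p60 timing: 2200 x 1125 x 60 Hz, including blanking
deep_colour_factor = 10 / 8    # TMDS clock scales with bits per channel
tmds_clock_hz = pixel_clock_hz * deep_colour_factor

hdmi14_max_tmds_hz = 340e6     # HDMI 1.4 TMDS clock ceiling
print(f"needed: {tmds_clock_hz/1e6:.1f} MHz, limit: {hdmi14_max_tmds_hz/1e6:.0f} MHz")
# ~185.6 MHz needed vs 340 MHz available, so raw bandwidth isn't the issue;
# the missing piece is the HDR metadata signalling, as described above.
```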
 
Interesting, so the HDR is HDR gaming, and not necessarily 4K HDR Netflix? Unless Netflix started sending out 1080p HDR content...
Hmm, this is all really hmm-worthy.
 
The way I took their HDR announcement is that they will support an HDR 1080p signal on the PS4 and PS4 Slim. Only the PS4 Pro will support 4K HDR signals.
 