PS4 Pro Official Specifications (Codename NEO)

If Scorpio runs unmodified Xbox One games without the hardware going into some kind of emulation mode, that, to me, will be a compelling case for the effectiveness of the Xbox One's hardware abstraction.

I'm expecting some kind of BC (rather than emulation) mode, but what I'd like to see is any advances in CPU (IPC and clock) and GPU (clock and compression) translate to higher performance for that BC mode.

The XBone has done this with the GPU bump, MS's 360 emulator has had a ball improving performance on some games (despite no access to source code), and I'd expect nothing less from Scorpio. This is in contrast to the PS4 Pro dropping clocks down for PS4 games.

Whatever MS's approach to this is ... it's a good one.
 
If Sony weren't focused on BC for PS4 before the Scorpio announcement, they will be now. Odds of PS5 having BC shot up after E3, IMHO.
Those comments from Cerny which talk about incompatibility issues moving between cpu/gpu architectures tell me that Sony doesn't have the software expertise - at least right now - to solve BC without hardware tricks.

Obviously Microsoft has a significant edge in this area.
 

Sony has some very smart people though, and smart people can generally work out which smart people to hire (even if only so they can hire more smart people). Offer them some good money and a challenge and ....

... I wouldn't rule Sony out from being able to respond to MS's approach to software continuity and enhancement.
 
I wonder when checkerboard rendering was invented and, now that the PS3 and X360 era is gone, why it hasn't been used before. Is it only suited to or useful for 4K consoles?

In a way it goes all the way back to the paper "Progressive coding scheme for multilevel images". The HINT algorithm specifies how you can iteratively (or hierarchically) halve the resolution of an image down to 1x1. The idea is that it gives finer downsampling steps than mip-map style 1:4. The interesting property is that the pixel distance drops by sqrt(2) instead of 2 at each step. You can read the explanation here, page 17:
ftp://ftp.informatik.uni-stuttgart.de/pub/library/medoc.ustuttgart_fi/MSTR-3359/MSTR-3359.pdf
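
To make the sqrt(2) property a bit more concrete, here is a small Python sketch of my reading of that hierarchical halving (illustrative only, not the paper's exact algorithm): each level keeps half of the previous level's pixels, alternating a checkerboard (quincunx) selection with an axis-aligned one, so the sample count halves and the nearest-sample spacing grows by sqrt(2) per step.

```python
import numpy as np

def quincunx_levels(h, w, steps):
    """Illustrative HINT-style hierarchical halving (a sketch, not the paper's algorithm).

    Level 0 keeps every pixel. Each further level keeps half of the remaining
    pixels, alternating a checkerboard (quincunx) selection with an axis-aligned
    one, so the spacing between kept samples grows by sqrt(2) per step instead
    of by 2 as with mip-map 1:4 steps.
    """
    y, x = np.mgrid[0:h, 0:w]
    masks = []
    for s in range(steps + 1):
        k = s // 2                                    # completed axis-aligned halvings
        kept = ((y % (1 << k)) == 0) & ((x % (1 << k)) == 0)
        if s % 2:                                     # odd step: checkerboard half of that grid
            kept &= (((y >> k) + (x >> k)) % 2 == 0)
        masks.append(kept)
    return masks

if __name__ == "__main__":
    for s, m in enumerate(quincunx_levels(8, 8, 4)):
        print(f"step {s}: {int(m.sum())} of 64 pixels kept")   # 64, 32, 16, 8, 4
```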

As this was later used as a compression scheme (http://dlia.ir/Scientific/IEEE/iel1/42/568/00014516.pdf), follow-up papers describe predictive means of reconstructing the image over lossy channels, like this one:
http://www.intuac.com/userport/john/apt/

Now, you can think of a checkerboard rendered image as a "compressed

Addendum (submission failed, and I only submitted the saved draft without realizing):

Now, you can think of a checkerboard rendered image with missing "blacks" as a lossy compressed version of the image, where the "blacks" have zero information transferred through the channel. Then you can use any decompressor invented for HINT to reconstruct the "blacks", for example trained predictor banks or alternatives. It's way better than linear interpolation.
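
As a toy stand-in for such a decompressor (just averaging the four rendered neighbours of each "black"; the trained predictor banks would replace that average), single-frame reconstruction could look roughly like this:

```python
import numpy as np

def fill_blacks(frame, rendered_mask):
    """Toy single-frame reconstruction of the missing checkerboard pixels.

    frame:         2D float array, with zeros at the un-rendered positions
    rendered_mask: 2D bool array, True where the pixel was actually shaded

    The average of the four rendered neighbours (up/down/left/right) stands in
    for a real predictor; borders are handled crudely via edge replication.
    """
    pad = np.pad(frame, 1, mode="edge")
    up, down = pad[:-2, 1:-1], pad[2:, 1:-1]
    left, right = pad[1:-1, :-2], pad[1:-1, 2:]
    est = (up + down + left + right) / 4.0
    out = frame.copy()
    out[~rendered_mask] = est[~rendered_mask]
    return out

# Example: a 4x4 frame shaded on a checkerboard (the zeros are the "blacks").
y, x = np.mgrid[0:4, 0:4]
shaded = ((y + x) % 2 == 0)
frame = np.where(shaded, 1.0, 0.0)
print(fill_blacks(frame, shaded))
```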

I'm not aware that this has been extended with temporal information in the compression community, but it's straightforward: you simply use a predictor whose context extends to the previous frame. Each "black" would have 9+4 anchor pixels. I extended APT to encode animated GIF-like image sequences this way.
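
A minimal sketch of that 9+4 context, assuming the previous frame is used as-is with no reprojection and with uniform placeholder weights instead of a trained predictor:

```python
import numpy as np

def predict_black(cur, prev, y, x, weights=None):
    """Toy 9+4 anchor predictor for one missing ("black") pixel at interior (y, x).

    9 anchors: the 3x3 neighbourhood around the same position in the previous
               frame (taken as-is here, purely for illustration).
    4 anchors: the up/down/left/right neighbours rendered in the current frame.
    The uniform weights are only a placeholder for a trained predictor bank.
    """
    prev_ctx = prev[y - 1:y + 2, x - 1:x + 2].ravel()              # 9 temporal anchors
    cur_ctx = np.array([cur[y - 1, x], cur[y + 1, x],
                        cur[y, x - 1], cur[y, x + 1]])             # 4 spatial anchors
    ctx = np.concatenate([prev_ctx, cur_ctx])                      # 13 values total
    if weights is None:
        weights = np.full(13, 1.0 / 13.0)
    return float(weights @ ctx)
```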

Having a depth buffer and an object-id buffer, which lets you create good masks of the valid relationships between the previous and current frame, is unique to rendered imagery, but it is also a straightforward extension of the above algorithms: it improves classification and pattern matching, resulting in extremely low bit-rates and very high error resilience. (Lossy compression theory and lossy channel theory are very much two sides of the same coin: the former is about controlled loss on the encoder side, the latter about uncontrolled loss in the information channel, which you would like to reconstruct in the best possible way. That's why trellis quantization is very good in both cases.)
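
And a sketch of how the depth and object-id buffers could gate those temporal anchors; the relative depth tolerance is my assumption, not something from the thread. Where the mask is false, the predictor would fall back to the purely spatial, current-frame context:

```python
import numpy as np

def temporal_validity_mask(obj_id_cur, obj_id_prev, depth_cur, depth_prev, rel_tol=0.01):
    """Toy per-pixel mask: only trust previous-frame anchors where the object id
    matches and the depth is consistent; elsewhere fall back to spatial anchors.
    The relative tolerance value is an assumption.
    """
    same_object = (obj_id_cur == obj_id_prev)
    similar_depth = np.abs(depth_cur - depth_prev) <= rel_tol * np.maximum(depth_cur, 1e-6)
    return same_object & similar_depth
```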
 
I am surprised that it's used in medical research papers. Is there anything else you wanted to write? There is a part of your text left unfinished.

Gotta check the first PDF, but APT's approach ( http://www.intuac.com/userport/john/apt/ ) might be a matter of taste.
 

Ah, yes, sorry, B3D timed out on me when submitting.
The first PDF is a new(er) paper describing HINT partially. The original paper is behind an IEEE paywall. The original paper for APT, describing BTPC in 1994, is also behind an IEEE paywall. I suppose it's possible to find them in physical form (yikes, not papyrus though) at some universities. :)
 
Sony has some very smart people though, and smart people can generally work out which smart people to hire (even if only so they can hire more smart people). Offer them some good money and a challenge and ....

... I wouldn't rule Sony out from being able to respond to MS's approach to software continuity and enhancement.

Sure, they're smart people, but Sony's software is also often weird. Weird in its capabilities/compatibility, in its bugs, in its UX.

And it's everywhere: from Xperia phones, to VAIO laptops (and VAIO software), to PlayStation.

Since the PS4 era it's been much better, though.
 
I think Sony will try for BC, but we've seen that they are far more fastidious about how that software runs on the new hardware than MS has been, for example. Xbox has been quite laissez-faire about releasing titles despite severe performance issues. It's to their credit that they've gone back and improved that in certain cases, but that's a strong "fix it in hardware later" mentality compared to Sony's "work right in hardware day one" approach.

Even with PS1 compatibility on PS2 and PS3, or PSP compatibility on PS Vita: they've always been very conservative with the kind of enhancements that they could technically provide. They don't mess with the internal resolution or offer more than the simplest, toggleable texture filtering. Now that they have created a pure software emulator for PS2 games on PS3/PS4 they are being very slow about rolling out titles for sale so they can be individually, fully QA'd. Given the opportunity MS would not hesitate to just make that an app with a "your mileage may vary" compatibility warning, but for whatever reason Sony doesn't seem to think that's acceptable.

There have to be people inside Sony who know what a PR win it would be to make universal PS1/PS2 and PSP emulators available on PS4. I don't believe the reason they don't is entirely a business decision.
 
Now that they have created a pure software emulator for PS2 games on PS3/PS4 they are being very slow about rolling out titles for sale so they can be individually, fully QA'd.

Isn't MS doing the same thing with 360 games on the bone?
 
I wonder why this is the case. PS1 games are simple fare, and it doesn't look like enhancements would be a problem. They scrapped even the basic emulation in the console, which I find unfortunate.
I still cannot understand how MS managed to pull off the 360 emulation, though. We were all expecting it to be almost impossible, and it seems to be a relatively simple solution.
 
The previous consoles all had pretty exotic hardware. This time, getting software to work on a next-gen x64 system is certainly going to be much easier. And they will be motivated to get regular PC versions of a lot of their titles for PlayStation Now as well.

With PS4 Pro however they wanted more than easier - they went for effortless.

Clearly though Microsoft has been even more careful in that area.
 
Likely a question of profitability: if it's profitable enough, they'll do it. Right now MS is doing everything they can; any advantage is profitable for them. Whereas with Sony, where you can pick and choose where your budget and funding go, even if they wanted to do BC it's likely being prioritized lower than other areas (e.g. PSVR and the 4Pro variants).
 
Some possibilities for Sony's spoofing of counters might come from its emphasis on lower-level access, or from quirks in its compute emphasis. If it's not for titles, it might be related to some peculiarities of the system functions they might be using.
If the abstraction is a bit thinner and allows some percentage of extra performance, the downside would be that the performance was paid for by leaving more of the stack vulnerable to low-level details such as that.

Microsoft could have different thresholds for "close-enough", or it might have other mechanisms in place that provide similar results that Sony chose not to use or would infringe on something if it did.

If one were super-paranoid about keeping the additional support and development costs of the Pro over the standard PS4 as low as possible, enforcing that level of equivalence becomes more necessary when there is a very large hardware gulf and a noticeable feature gulf.
The Xbox One S is comparatively closer to the original, and we don't know what Scorpio might do along these lines.

As a compromise, the PS5 might still have backwards compatibility.
What it presumably would not have is Sony requiring that games exist on both, with performance and feature calculations that have to worry about both the PS4 and the PS5.
 

I was a bit surprised by the remarks that Sony didn't want to touch the 8-core Jaguar CPU, because it might harm titles developed for PS4 "vanilla".

Are developers really squeezing the last bit of performance out of these CPUs? I would expect that with the move to a more standard x86 APU there would be fewer machine-specific optimizations than on PS2 and PS3. Moving data around on an 8-core x86 CPU + GPU is not that exotic compared to a PPC plus, or am I missing something? I would expect that the PS4's basic architecture scales: you add faster CPUs, more memory and better GPUs in the PS5, but the bus architecture and memory architecture scale. Sony is already making tweaks, like the SDRAM for the UI and apps.

About BC... personally I don't care about PS1, PS2 or even PS3 games on PS4. Remastered and at 1080p, yes, but not 720p graphics upscaled to 1080p or even 4K.
 
http://www.gamasutra.com/view/news/283611/Inside_the_PlayStation_4_Pro_with_Mark_Cerny.php

Interesting that taking the Jaguar cores to 2.1 GHz is what made them increase the size of the cooling solution. It must be a little beyond the efficiency sweet spot at that clock. By the way, we still don't know if the Pro's APU is 14nm GF or 16nm TSMC...

Really curious to know how loud it is... maybe if they are using a larger fan it will actually end up quieter than the original PS4?
 