Why the PS3 framebuffer only scales horizontally

The label HD-ready has a different meaning in the US and Europe.
Note Silent_Buddha said '"HDTV" ready' and not 'HD Ready'. ;) TV sets labelled 'HD Ready' in the EU have no problem. Those marketed as 'HDTV' or somesuch could be bunkum. You have sets incapable of rendering 720p or 1080i sold as HD sets because they could receive a signal and display it on their under-resolutioned displays.
 
I also noted US and Japan and possibly other NTSC territories. Doesn't appear to be a problem for PAL territories. But I think PAL made their requirements after the US and Japan had already been selling HDTV ready sets for a few years.

Regards,
SB
 
Note Silent_Buddha said '"HDTV" ready' and not 'HD Ready'. ;) TV sets labelled 'HD Ready' in the EU have no problem. Those marketed as 'HDTV' or somesuch could be bunkum. You have sets incapable of rendering 720p or 1080i sold as HD sets because they could receive a signal and display it on their under-resolutioned displays.
Of course, being "sold as HD sets" doesn't actually make them HD sets. ;)
 
No, the programmers told me in person that Burnout Paradise is simply scaled 720p. You'd want to be talking about games like Killzone 2, Uncharted etc.

Uncharted uses 960x1080, upscaled to 1080p?

Sorry if this was talked about before (to death, again...), but what happens with MGS4, native at 1024x768 and with much better IQ in 1080p?

Edit: I now see Arwin's post about Uncharted.
 
So the consequences of PS3's gimped scaler exist mostly for old TV sets, whereas the 360 can upscale on any TV set accordingly for the best possible image quality, right?

No, most HDTVs have better scaler hardware than is available in a cheap console.

So if the 360 upscales the image, the TV with the über-scaler will receive the signal in 1080p, and the TV's scaler will leave the 1080p signal alone, thinking it's receiving native resolution, while it's actually an image upscaled by a worse scaler.

This problem is only an issue for people with old TVs, which were built before standards came into place, or people using computer monitors.
 
So if the 360 upscales the image, the TV with the über-scaler will receive the signal in 1080p, and the TV's scaler will leave the 1080p signal alone, thinking it's receiving native resolution, while it's actually an image upscaled by a worse scaler.
Except XB360's scaler is pretty good from what I hear. Is there anywhere a legitimate comparison of upscalers across different consoles, Media PCs and TVs?
 
This problem is only an issue for people with old TVs, which were built before standards came into place, or people using computer monitors.
Supporting only 720p output for most games means the PS3 isn't compliant with the "standards in place" either. If anything, 1080p output support for all software (upscaled or otherwise) would be compliant with current HD standards.

And that's just the plain transmission loss, before you even consider how scaling on the PS3 always costs you space in memory and more cycles. Case in point: Burnout Paradise/PS3. If you force it into 1080i mode, it scales from a 1280x720 render to a 960x1080 output buffer, i.e. it throws away horizontal resolution as a first step, because that's how it can stay within given memory constraints.

Games like Burnout, Uncharted, and Killzone don't use the lower horizontal resolution in order to stay within memory constraints. They use the strange 960x1080 resolution because that's just how the PS3's built-in "hardware scaler" works. It can only scale images horizontally, so it keeps the 1080 lines of vertical resolution the same, while stretching whatever the horizontal res is to 1920. It just so happens that 960x1080 is one of the resolutions supported by this hardware scaler, and it is close enough to 1280x720 in terms of resolution/performance.

Also, it only takes ~500kb of extra memory in order to use the PS3's 960x1080 mode as opposed to the regular 1280x720 mode, purely due to the extra ~10% of overall resolution increase. There's also no performance overhead or cycles for scaling, except again for the slight increase in resolution. This is because this is a hardware scaling solution, not a software solution.
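The ~500kb figure is easy to sanity-check. A back-of-the-envelope sketch, assuming a single 4-bytes-per-pixel (RGBA8) colour buffer and nothing else:

```python
# Rough check of the framebuffer sizes discussed above,
# assuming 4 bytes per pixel (RGBA8) and no extra buffers.
def fb_bytes(w, h, bpp=4):
    return w * h * bpp

base  = fb_bytes(1280, 720)   # regular 720p buffer
tall  = fb_bytes(960, 1080)   # PS3 horizontal-scaler input mode
extra = tall - base
print(base, tall, extra)      # extra is ~460 KB, i.e. the "~500kb" above
```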
 
Games like Burnout, Uncharted, and Killzone don't use the lower horizontal resolution in order to stay within memory constraints. They use the strange 960x1080 resolution because that's just how the PS3's built-in "hardware scaler" works. It can only scale images horizontally, so it keeps the 1080 lines of vertical resolution the same, while stretching whatever the horizontal res is to 1920. It just so happens that 960x1080 is one of the resolutions supported by this hardware scaler, and it is close enough to 1280x720 in terms of resolution/performance.

Also, it only takes ~500kb of extra memory in order to use the PS3's 960x1080 mode as opposed to the regular 1280x720 mode, purely due to the extra ~10% of overall resolution increase. There's also no performance overhead or cycles for scaling, except again for the slight increase in resolution. This is because this is a hardware scaling solution, not a software solution.
Your second paragraph explains what's wrong with the first. The choice of resolution before and after scaling is not dictated by "just how it works" but by memory concerns.

960 pixels wide is one valid choice that can be fed into the (working!) horizontal-only output scaler, but there are others, among them 1024, 1280 and 1600. If memory wasn't a concern why would you scale from 1280x720 to 960x1080? You'd keep the width constant and scale to 1280x1080, or do it all in one step and scale directly to 1920x1080. It "only" takes more space in memory, but that's the whole problem.
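To put numbers on "it only takes more space in memory", here's a rough sketch of each candidate 1080-line buffer's size, again assuming 4 bytes per pixel (the widths are just the ones under discussion):

```python
# Memory cost of each candidate 1080-line output buffer,
# assuming 4 bytes per pixel; widths are the ones under discussion.
def fb_mb(w, h, bpp=4):
    return w * h * bpp / 1e6  # decimal megabytes

for w in (960, 1280, 1920):
    print(f"{w}x1080: {fb_mb(w, 1080):.1f} MB")
# 960x1080 ~4.1 MB, 1280x1080 ~5.5 MB, full 1920x1080 ~8.3 MB:
# every wider choice "only" costs more memory.
```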
 
Your second paragraph explains what's wrong with the first. The choice of resolution before and after scaling is not dictated by "just how it works" but by memory concerns.

960 pixels wide is one valid choice that can be fed into the (working!) horizontal-only output scaler, but there are others, among them 1024, 1280 and 1600. If memory wasn't a concern why would you scale from 1280x720 to 960x1080? You'd keep the width constant and scale to 1280x1080, or do it all in one step and scale directly to 1920x1080. It "only" takes more space in memory, but that's the whole problem.

From what I understood from these, the game renders at 960x1080 instead of 1280x720; it doesn't render at 720p, scale somehow to 960x1080, and then use the hardware scaler.
 
Shifty Geezer said:
Is there anywhere a legitimate comparison of upscalers across different consoles, Media PCs and TVs?
From my understanding (which may be horribly outdated), the 360's scaler can run a couple of different filters, with selectable kernel size, so any benchmark comparison would be impossible without having access to that.
That said, the filters are relatively standard fare found in most image-manipulation software, so it would be possible to simulate the test if we knew the available kernel sizes.
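To illustrate the point about simulating such a test, a rough pure-Python sketch of how kernel size changes the tap count of a simple tent filter; the filter and the sizes here are illustrative assumptions, not the 360's actual ones:

```python
# Illustrative only: tent-filter taps for one output pixel, with
# selectable support (kernel size). Not the 360's actual filters.
def kernel_weights(center, support):
    """Normalized tent-filter weights over source pixels near `center`."""
    lo = int(center - support) + 1
    hi = int(center + support)
    taps = [(j, max(0.0, 1.0 - abs(j - center) / support))
            for j in range(lo, hi + 1)]
    total = sum(w for _, w in taps)
    return [(j, w / total) for j, w in taps if w > 0.0]

# Wider support means more taps and a smoother (blurrier) result.
print(kernel_weights(2.25, 1.0))  # bilinear: 2 taps
print(kernel_weights(2.25, 2.0))  # wider tent: 4 taps
```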
 
No, most HDTVs have better scaler hardware than is available in a cheap console.

So if the 360 upscales the image, the TV with the über-scaler will receive the signal in 1080p, and the TV's scaler will leave the 1080p signal alone, thinking it's receiving native resolution, while it's actually an image upscaled by a worse scaler.

This problem is only an issue for people with old TVs, which were built before standards came into place, or people using computer monitors.

Actually the 360's scaler is better than all but the top TV models. MS didn't skimp on that bit since the console was designed with the US HDTV market in mind; they already knew they had to deal with 1080i HDTV-ready sets which had no 720p resolution option.

I still think it's a bit bizarre that Sony missed out on that, since Japan also has 1080i HDTV-ready sets with no support for 720p, some of which were made by Sony themselves.

Regards,
SB
 
Actually the 360's scaler is better than all but the top TV models. MS didn't skimp on that bit since the console was designed with the US HDTV market in mind; they already knew they had to deal with 1080i HDTV-ready sets which had no 720p resolution option.

I still think it's a bit bizarre that Sony missed out on that, since Japan also has 1080i HDTV-ready sets with no support for 720p, some of which were made by Sony themselves.

Regards,
SB

And they also knew they'd be around for a long time, so they'd need a good scaler for 480p/576i-to-720p, 720p-to-1080p and 480p/576i-to-1080p media; not everybody buys a top HDTV in the 3000€/$ range to get a good scaler…
Actually the 360 is the most accurate media center at a reasonable price if you have a lot of DVDs and SD video, which is all that 90% of consumers need.
 
And they also knew they'd be around for a long time, so they'd need a good scaler for 480p/576i-to-720p, 720p-to-1080p and 480p/576i-to-1080p media; not everybody buys a top HDTV in the 3000€/$ range to get a good scaler…
Actually the 360 is the most accurate media center at a reasonable price if you have a lot of DVDs and SD video, which is all that 90% of consumers need.

Perhaps we should keep the scaler issue discussion to games output only as that is the topic.

I find the DVD scaler of the PS3 to be pretty good and have not heard otherwise; in fact I've read that it is top-notch.
 
Your second paragraph explains what's wrong with the first. The choice of resolution before and after scaling is not dictated by "just how it works" but by memory concerns.

960 pixels wide is one valid choice that can be fed into the (working!) horizontal-only output scaler, but there are others, among them 1024, 1280 and 1600. If memory wasn't a concern why would you scale from 1280x720 to 960x1080? You'd keep the width constant and scale to 1280x1080...
They don't use 1280x1080 because that's WAY too many pixels above the usual 1280x720 mode. Performance would drop like a rock if developers went this route just for the sake of compatibility with 1080i HDTVs.

960x1080 is the closest rendering mode to 1280x720 in terms of total pixels rendered; that's the main reason most devs opt for this resolution in 1080i/p upscaled output mode. The memory difference between rendering a game in 1280x1080 versus 960x1080 is pretty minuscule (less than 500kb total).

...or do it all in one step and scale directly to 1920x1080. It "only" takes more space in memory, but that's the whole problem.
No, because the PS3 doesn't HAVE a hardware scaler capable of "scaling directly to 1920x1080." If you want to scale directly like that on the PS3, you'd have to resort to implementing a software scaling solution to the game, which takes up way more memory than hardware scaling.

Again, you're confusing hardware-based scaling methods (which have zero memory and performance overhead) with software-based scaling methods which have considerable overhead in terms of memory required and extra performance for the scaling process done by the GPU.
 
In the recent Develop magazine, they say that MS dropped the mandatory native-720p TRC for smaller developers who had no leverage (surely Bungie with Halo and Infinity Ward with CoD had more than enough) to push for a scaled image.
 
They don't use 1280x1080 because that's WAY too many pixels above the usual 1280x720 mode. Performance would drop like a rock if developers went this route just for the sake of compatibility with 1080i HDTVs.

960x1080 is the closest rendering mode to 1280x720 in terms of total pixels rendered; that's the main reason most devs opt for this resolution in 1080i/p upscaled output mode. The memory difference between rendering a game in 1280x1080 versus 960x1080 is pretty minuscule (less than 500kb total).


No, because the PS3 doesn't HAVE a hardware scaler capable of "scaling directly to 1920x1080." If you want to scale directly like that on the PS3, you'd have to resort to implementing a software scaling solution to the game, which takes up way more memory than hardware scaling.

Again, you're confusing hardware-based scaling methods (which have zero memory and performance overhead) with software-based scaling methods which have considerable overhead in terms of memory required and extra performance for the scaling process done by the GPU.
I get the feeling we're talking about different steps of the pipeline.

#1: render; done at full 720p, i.e. 1280x720
#2: software scaling to 960x1080
#3: hardware scaling to 1920x1080

Step #3 is free for all intents and purposes, but on PS3 limited to horizontal expansion only, and on top of that requires one of only a handful of horizontal input widths to work (one of 960, [strike]1024, [/strike]1280, 1440, 1600 [strike]AFAIK[/strike] according to this).

So the programmer choice is down to how to adapt render resolution into something that can feed step #3. If you want to output 1080 lines after a render with 720 lines, in absence of a vertical hardware scaler you have to do some software scaling. If you can't adjust rendering resolution to something fitting (with 1080 lines), #2 is a mandatory step. But you can still pick the horizontal target resolution produced by that step.
The easiest choice from the perspective of software scaler complexity is to keep the render width and just interpolate lines, i.e. produce 1280x1080 in this case. That's still a valid input to step #3.

So why not 1280x1080?
Is it because of compute throughput? No, because going from 1280x720 to 1280x1080 is a 1D filter and actually takes less compute resources than the 2D filter you'd need to also shrink the lines to 960 width.
(bilinear interpolation: 3 vector MACs per output pixel; linear interpolation: 1 vector MAC per output pixel)
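For illustration, a minimal sketch of that 1D filter, assuming plain linear interpolation with clamped edges (one lerp per output sample, matching the cost noted above):

```python
# Minimal sketch of step #2's 1D filter: stretch one column of samples
# from 720 to 1080 lines with linear interpolation (clamped edges).
# One lerp (1 MAC) per output sample, as noted above.
def linear_stretch(samples, out_len):
    n = len(samples)
    out = []
    for i in range(out_len):
        x = (i + 0.5) * n / out_len - 0.5  # output center in source space
        j = int(x) if x >= 0 else -1       # floor (x >= -0.5 when upscaling)
        t = x - j
        a = samples[max(j, 0)]             # clamp at the top edge
        b = samples[min(j + 1, n - 1)]     # clamp at the bottom edge
        out.append(a * (1 - t) + b * t)
    return out

column = list(range(720))                  # one column of a 1280x720 render
print(len(linear_stretch(column, 1080)))   # 1080
```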

Is it because of memory bandwidth? Possibly, maybe ... not really. Writing 60 times 960x1080 pixels, assuming 4 bytes per pixel (which might be off by one), consumes 250MB/s of memory bandwidth. Do it in 1280x1080 and it takes 33% more, 332MB/s. Both are drops in the bucket of 19.2GB/s of GDDR bandwidth. You have to be teetering very close to the edge of acceptable performance to get pushed over by this incremental 82MB/s use of bandwidth, and even if it happens, all you get is a torn frame every once in a while, which may not be desirable, but won't make you fail QA.

So again, why not 1280x1080? Because it uses more memory.

Memory budgets are very hard lines. If all memory is already spent and accounted for, you can't push it another 1%. There's nothing left to use. There's no "we'll do it a frame later" if you require more space, as opposed to a search for more computation which is always a possibility and can be made to work, if a frame later.
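The bandwidth arithmetic above can be reproduced directly (decimal MB, 4 bytes per pixel, 60 Hz, as assumed in the post):

```python
# Reproducing the bandwidth arithmetic above: bytes written per second
# for a 60 Hz output buffer at 4 bytes per pixel, in decimal MB/s.
def mb_per_s(w, h, fps=60, bpp=4):
    return w * h * bpp * fps / 1e6

narrow = mb_per_s(960, 1080)    # ~249 MB/s
wide   = mb_per_s(1280, 1080)   # ~332 MB/s
print(round(narrow), round(wide), round(wide - narrow))
# Either way, a small fraction of the 19.2 GB/s of GDDR bandwidth.
```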
 
I understood that the render is made at 960x1080, which I think is simpler than rendering at 720p and then software-scaling to that resolution (assuming some resolution-independent HUD rendering).
 
Yes, this is my understanding also.

If your PS3 is set to forced 1080i or 1080p, some titles will render at 960x1080 and then hardware-scale it to 1920x1080.

If your PS3 is set to 720p, 1080i or 1080p, it's up to the developer to choose the output resolution.
 
I get the feeling we're talking about different steps of the pipeline.

#1: render; done at full 720p, i.e. 1280x720
#2: software scaling to 960x1080
#3: hardware scaling to 1920x1080
There are no steps #1 and #2, because the PS3 games render at 960x1080 natively.

So there is only step #3: "hardware scaling 960x1080 to 1920x1080."
 
There are no steps #1 and #2, because the PS3 games render at 960x1080 natively.

So there is only step #3: "hardware scaling 960x1080 to 1920x1080."

So what you're saying is that Ghostbusters was rendered at 960x1080 and then downsampled to its ridiculously low resolution? :p

I'm with Rolf N here that devs will render to whatever suits their game most. And that there are some cases where a game will probably be scaled from 720p.

Regards,
SB
 