[B3D Article] "Ripping off the veil: The mysterious PS3 hardware scaler exposed"

Huuh? Didn't Sony tell their internal PS2/PS1 backwards-compatibility crew about this hardware? Or was it disabled until now?
Will be interesting to see if the PS3 gets new capabilities unlocked for the Euro launch, then. Latest news is Sony tightened the devs' NDA, so we non-devs simply won't hear anything at the moment, or close to it.
 
Why is the front buffer 32-bit, not 24-bit? Or, in other words, what can PS3 do with an alpha channel in the front buffer, when theoretically all image composition has been completed?

Is there more image compositing to be done?

Is the front buffer 8-8-8-8 or 10-10-10-2 or ... ?
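
For clarity, here's roughly how one 32-bit pixel would be packed in each of those two layouts - the field order (alpha in the high bits) is only an assumption for illustration:

```python
# Hypothetical packing of one 32-bit front-buffer pixel in the two
# candidate layouts; the field order (alpha in the high bits) is an
# assumption, not confirmed PS3 behaviour.

def pack_8888(a, r, g, b):
    # 8 bits per channel, including a full 8-bit alpha
    return (a << 24) | (r << 16) | (g << 8) | b

def pack_2_10_10_10(a, r, g, b):
    # 10 bits per colour channel leaves only 2 bits for alpha
    return (a << 30) | (r << 20) | (g << 10) | b

print(hex(pack_8888(255, 255, 0, 0)))       # opaque red -> 0xffff0000
print(hex(pack_2_10_10_10(3, 1023, 0, 0)))  # opaque red -> 0xfff00000
```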

Jawed
 
And as LB said - why would this even be an issue when Sony have been using scalers in their TVs for years? :???:

Exactly, it's not a hardware issue. I think it's a strategy to give the devs a slight nudge to consider rendering above 1280x720. The step up to full 1920x1080 might seem too steep at first, but going to 960x1080 is much easier. Then you have the next step, 1440x1080, and so on. More choices for the devs is a good thing, no?
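
To put rough numbers on those steps (counting framebuffer pixels only, ignoring bandwidth and fillrate), a quick sketch:

```python
# Framebuffer pixel counts for the candidate render resolutions,
# relative to a plain 1280x720 buffer (output scaled to 1920x1080).
base = 1280 * 720
for w, h in [(1280, 720), (960, 1080), (1440, 1080), (1920, 1080)]:
    print(f"{w}x{h}: {w * h:>9,} px ({w * h / base:.3f}x of 720p)")

# 1280x720:    921,600 px (1.000x of 720p)
# 960x1080:  1,036,800 px (1.125x of 720p)
# 1440x1080: 1,555,200 px (1.688x of 720p)
# 1920x1080: 2,073,600 px (2.250x of 720p)
```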
 
Exactly, it's not a hardware issue. I think it's a strategy to give the devs a slight nudge to consider rendering above 1280x720. The step up to full 1920x1080 might seem too steep at first, but going to 960x1080 is much easier. Then you have the next step, 1440x1080, and so on. More choices for the devs is a good thing, no?

If it is indeed Sony wanting devs to focus on true 720 or true 1080 resolution, then I'm for it, as it is misleading to put 720 or 1080 on the back of the box if the game isn't rendering at those resolutions. TV manufacturers do the same thing every day at retail, and so does MS on the back of some XB360 games. It's not important to Joe Consumer, but those who buy HD machines for their resolution advantage over previous consoles (that's the main point, isn't it?) could feel somewhat cheated. I know I do for the titles on 360 that don't render 720 native and scale the image out as such (PGR3). Doing so is fine, but putting 720 (or 1080) on the back of the box while internally rendering something other than those resolutions is misleading and borderline false advertising.

What I don't get is why it was not an option for the user to scale 720p (or 480i/p PS2 games) to 1080i/p to match their set.
 
TheChefO: Always trying to put a negative spin on things, right?

More choices is always better.

Not at all - like I said, I agreed with what Sony was doing by forcing a resolution that is true as advertised.

edit - even these resolutions are "ok" but I would like some notification on the box as to what the actual rendered resolution is.
 
Old arguments here. The idea of getting "cheated" sounds kinda childish, especially on a launch title. If no games rendered at the stated resolution then maybe you'd have a point, but gamers do not buy games because they render at high resolutions; they buy games that look good on their high-definition televisions.
 
Not at all - like I said, I agreed with what Sony was doing by forcing a resolution that is true as advertised.

You're fighting windmills here. If they put "720p" or "1080i/p" on the back of a game box, they're merely stating possible video output formats and nothing more. If the video output then indeed is 720p or 1080i/p, it's not false advertising. That goes for MS as well.
 
diggity diggity digg-it-y. :D Great article. Thanks for the explanations. :)

Q: How does a 960x1080 image compare to 1920x1080? i.e. Would it really be noticeable?
 
So am I correct in assuming this is going to essentially force PS3 devs to develop at a slightly higher resolution than 720p?

Well, for Sony fans, that's nothing to be applauded - it will amount to another small performance hit, correct? And it's not like the PS3 hasn't had enough of those.

Though I suppose, if 1080p is doable in some cases, then this small rez increase over 720p could be almost irrelevant.
+1 it's 12.5% more pixels to be processed
 
Old arguments here. The idea of getting "cheated" sounds kinda childish, especially on a launch title. If no games rendered at the stated resolution then maybe you'd have a point, but gamers do not buy games because they render at high resolutions; they buy games that look good on their high-definition televisions.

How important this is varies from person to person, as it is a matter of personal preference. I'm glad I have the ability to find out the rendered resolution of available titles, but it is inexcusable to me that products are advertised as something they are not.

Not a big deal though so please excuse the derail.

Regarding the issue of a scaler using memory: do all scalers use/need enough RAM for the full frame buffer in order to function?
 
+1 it's 12.5% more pixels to be processed

Developers can insist on a certain level of performance in all modes, or target a certain level of performance in one mode (e.g. 720p) and support another, letting the user eat any small performance hit that mode incurs - if there is any, which there may not be (it would depend on the game's bottleneck).

Sony might mandate policy in that area, but we don't know if they do or do not.

I think for 1080i-only owners, the option is better than just having 480i, regardless.
 
Developers can insist on a certain level of performance in all modes, or target a certain level of performance in one mode (e.g. 720p) and support another, letting the user eat any small performance hit that mode incurs - if there is any, which there may not be (it would depend on the game's bottleneck).

Sony might mandate policy in that area, but we don't know if they do or do not.

I think for 1080i-only owners, the option is better than just having 480i, regardless.
OK, that's better than 480i, but devs will have to choose between an upscaled 1080i/p (960x1080) with worse performance or graphics, or the standard 720p resolution running slightly smoother.
Still far from great in either situation, whether for all consumers or just some.
 
OK, that's better than 480i, but devs will have to choose between an upscaled 1080i/p (960x1080) with worse performance or graphics, or the standard 720p resolution running slightly smoother.
Still far from great in either situation, whether for all consumers or just some.

Consumers make that choice. Devs can support both modes depending on your choice of output.

So my point was that a developer could target a certain level of performance at 720p, which 720p users would get, and overlook any small drop in performance in the 960x1080 mode for 1080i-only owners (if Sony does not mandate a certain minimum performance level... which I'm sure they don't, since games with dodgy framerates are hardly unheard of).
 
Why is the front buffer 32-bit, not 24-bit? Or, in other words, what can PS3 do with an alpha channel in the front buffer, when theoretically all image composition has been completed?
Is there more image compositing to be done?
Quite possibly, if the game FB is separate from the PS2 output buffer and you want compositing with other components like BRD playback. No, I don't know why you'd want alpha blending of the game over TV, but if you want to future-proof... ;)
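
For what it's worth, if the OS really did composite using that front-buffer alpha, per-pixel it would presumably be the standard "over" blend - a sketch of the formula, not confirmed PS3 behaviour:

```python
# Hypothetical per-pixel "over" blend of the game front buffer on top
# of a video layer, using the front buffer's alpha channel. Nothing
# here is confirmed PS3 behaviour; it's just the standard formula.

def blend_over(game, video, alpha):
    # alpha in [0, 1]: 1.0 = game fully opaque, 0.0 = video shows through
    return tuple(g * alpha + v * (1.0 - alpha) for g, v in zip(game, video))

print(blend_over((255, 0, 0), (0, 0, 255), 0.75))  # (191.25, 0.0, 63.75)
```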

Is there any clue whether the 'hardware scaler' is the OS SPE or the controller chip? Knowing that would help explain why it was neither mentioned as a hardware component nor used from day one.
 
Consumers make that choice. Devs can support both modes depending on your choice of output.

So my point was that a developer could target a certain level of performance at 720p, which 720p users would get, and overlook any small drop in performance in the 960x1080 mode for 1080i-only owners (if Sony does not mandate a certain minimum performance level... which I'm sure they don't, since games with dodgy framerates are hardly unheard of).
I understand your point ;)
But in the future there will be a lot of owners of 1080i/p sets who would not be happy to rely on the possibly poor scaler in their TV (IQ and latency depend on the TV's scaler).
So they will have to choose between the full-quality game (720p) scaled by the TV, with possible IQ and latency issues, or setting the game to upscaled 1080i/p but with the game already downgraded as far as IQ or fps are concerned.

That's why I think it's better, but far from great. Does anybody know if the scaler will be able to work with zero performance hit in the near future?
 
Q: How does a 960x1080 image compare to 1920x1080? i.e. Would it really be noticeable?
Well, it's roughly equivalent to a 1280x720 image upscaled to 1920x1080. A 960x1080 framebuffer has slightly more pixels than a 1280x720 one. But both are still a far cry from a 1920x1080 frame, which pushes twice as many pixels.
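
As a rough sketch of what the horizontal-only 2x upscale does to a scanline (assuming a simple linear filter, since the real scaler's filter is unknown):

```python
# Toy horizontal 2x upscale of one 960-sample scanline to 1920 samples,
# assuming simple linear interpolation. The PS3 scaler's actual filter
# is unknown; the point is only that no new horizontal detail appears.

def upscale_2x(row):
    out = []
    for i, px in enumerate(row):
        out.append(px)                       # keep the original sample
        nxt = row[min(i + 1, len(row) - 1)]  # clamp at the right edge
        out.append((px + nxt) / 2)           # insert an interpolated one
    return out

row_1920 = upscale_2x(list(range(960)))  # stand-in luminance values
print(len(row_1920))                     # 1920
```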
 