Why the PS3 framebuffer only scales horizontally

Actually the 360's scaler is better than all but the top TV models. MS didn't skimp on that bit since it was designed with the US HDTV market in mind; they already knew they had to deal with 1080i HDTV-ready sets which had no 720p resolution option.

I still think it's a bit bizarre that Sony missed out on that, since Japan also has 1080i HDTV-ready sets with no support for 720p, some of which were made by Sony themselves.

Regards,
SB

I've heard the 360's scaler is better than most TVs' scalers too, but I have my doubts. I've owned an entry-level Bravia V5100 and a high-end XBR6, and they looked identical when scaling 720p to 1080p.

Also, TVs are updated every year. I would think the scaler found in any TV sold today would outperform the scaler in a 360 which is based on technology from 4 years ago.
 
Also, TVs are updated every year. I would think the scaler found in any TV sold today would outperform the scaler in a 360 which is based on technology from 4 years ago.

Well of course... but the point is that there are a lot of older HDTVs out there, and it made a lot of sense in 2005.
 
I'm just an EE, with no specialized knowledge of TVs or scalers, but a strong background in signal processing, so take this for whatever you think it's worth: I just don't think that scaling an image is that difficult. I mean, it's certainly not black magic; it's done all the time. The sinc function is the mathematically ideal interpolator, and the Lanczos filter is the practical (finite) realization of that filter. Nearest-neighbor and bilinear filtering look poor, but bicubic looks pretty good, and Lanczos is very close to ideal. I think that a lot of the reason people think that scaling is hard is because on video and film sources, the scaler is often trying to "fix" analog noise, motion and compression artifacts, etc. When you have access to the idealized, digital source image (from the front buffer or from a TV's HDMI input) the scaling should be very simple--bicubic or Lanczos. Anything else, and the scaler is trying to pretty up the image with the equivalent of a post-processing stage that the image creator never intended.
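To make the point concrete, here's a minimal sketch of Lanczos resampling in NumPy. This is a hypothetical illustration of the math, not what any particular TV or console scaler actually implements: the kernel is just a sinc windowed by a wider sinc, and each output sample is a weighted sum over a handful of neighboring input samples.

```python
import numpy as np

def lanczos_kernel(x, a=3):
    """Lanczos window: sinc(x) * sinc(x/a) for |x| < a, else 0."""
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) < a, np.sinc(x) * np.sinc(x / a), 0.0)

def lanczos_resample_1d(samples, new_len, a=3):
    """Resample a 1-D signal to new_len points via Lanczos interpolation."""
    old_len = len(samples)
    scale = old_len / new_len
    result = np.empty(new_len)
    for i in range(new_len):
        # Map the output coordinate back onto the input sample grid.
        src = (i + 0.5) * scale - 0.5
        lo = int(np.floor(src)) - a + 1
        idx = np.arange(lo, lo + 2 * a)          # the 2a nearest taps
        weights = lanczos_kernel(src - idx, a)
        idx = np.clip(idx, 0, old_len - 1)       # clamp at the edges
        # Normalize so the weights sum to 1 (handles edge truncation).
        result[i] = np.dot(weights, samples[idx]) / weights.sum()
    return result

# Upscale a 720-sample "scanline" to 1080 samples (the 720p -> 1080p ratio).
line = np.linspace(0.0, 1.0, 720)
scaled = lanczos_resample_1d(line, 1080)
print(len(scaled))  # 1080
```

A real 2-D scaler would run this separably, once across rows and once down columns; the per-pixel cost is just 2a multiply-adds per axis, which is why it's cheap to do in hardware.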
 
Well of course... but the point is that there are a lot of older HDTVs out there, and it made a lot of sense in 2005.

Considering that 1080p was rare in 2005 (Sony didn't even sell a 1080p LCD in '05), I guess it made sense. 1080p didn't go mainstream until '07.
 
There are a hell of a lot of the older CRT-based 1080i sets in the USA. Sure they may well become obsolete in the "ten year life cycle" as patsu reckons but the fact is that most people buy TVs to last them years. It's the one component that will probably out-last the console. And such conjecture is of course cold comfort to the people who own these sets.

Who wants a big giant lump of a TV standing around when everyone else is getting nice flat TVs to hang on the wall?
CRTs are dying so fast it even surprised me, a HiDef optimist.
 
Who wants a big giant lump of a TV standing around when everyone else is getting nice flat TVs to hang on the wall?
CRTs are dying so fast it even surprised me, a HiDef optimist.

Perhaps where you are. But out here, I'd say CRTs still outnumber LCDs 4:1 in people's homes. A friend of mine who is an avid movie watcher only recently got an LCD, and he only did that because he wanted a larger screen. Otherwise, he'd have probably kept his 7-year-old CRT. And while he likes the sharpness of the display, he doesn't much like the color balance (especially black levels in a dark room) compared to his CRT.

Overall he semi-regrets getting an LCD. I told him he should just return it and give plasmas a try, but it's difficult convincing him that plasma burn-in is over-hyped.

Regards,
SB
 
I'm just an EE, with no specialized knowledge of TVs or scalers, but a strong background in signal processing, so take this for whatever you think it's worth: I just don't think that scaling an image is that difficult. I mean, it's certainly not black magic; it's done all the time. The sinc function is the mathematically ideal interpolator, and the Lanczos filter is the practical (finite) realization of that filter. Nearest-neighbor and bilinear filtering look poor, but bicubic looks pretty good, and Lanczos is very close to ideal. I think that a lot of the reason people think that scaling is hard is because on video and film sources, the scaler is often trying to "fix" analog noise, motion and compression artifacts, etc. When you have access to the idealized, digital source image (from the front buffer or from a TV's HDMI input) the scaling should be very simple--bicubic or Lanczos. Anything else, and the scaler is trying to pretty up the image with the equivalent of a post-processing stage that the image creator never intended.

You would imagine so, but the scaler in my Gateway 30" versus the Dell 30" is absolutely worlds apart in how well it scales images. And my two-year-old Aquos certainly scales SD to HD significantly better than my friend's brand-new Vizio.

I would agree that for top-end TVs I don't see scalers being improved much. I'm seeing more of a slow migration of good scaling from high-end TVs down to mid-range sets, rather than scaling improving at the top end.

Perhaps by the time the next generation of consoles is around (2015 or so) there may be no need for a good scaler in a console, but I still think it's pretty essential currently.

Regards,
SB
 
Yeah, sadly my TV has a pretty piss-poor scaler, for SD content at least. PS2 games look pretty awful, and due to the lack of backwards compatibility in the new PS3s, even when I get one around this Christmas my PS2 games will still look like crap.
 
You would imagine so, but the scaler in my Gateway 30" versus the Dell 30" is absolutely worlds apart in how well it scales images. And my two-year-old Aquos certainly scales SD to HD significantly better than my friend's brand-new Vizio.

I would agree that for top-end TVs I don't see scalers being improved much. I'm seeing more of a slow migration of good scaling from high-end TVs down to mid-range sets, rather than scaling improving at the top end.

Perhaps by the time the next generation of consoles is around (2015 or so) there may be no need for a good scaler in a console, but I still think it's pretty essential currently.

Regards,
SB

Oh, I didn't mean to suggest that every TV set or upscaling DVD player has a good scaler. I was just arguing that scaling itself is well understood and easily doable if you have the budget for it. It's not surprising to me that the budget/value-oriented products save a few pennies on the scaling, but, as you say, hopefully Lanczos scaling will propagate to lower- and lower-priced equipment as time goes on.
 
You would imagine so, but the scaler in my Gateway 30" versus the Dell 30" is absolutely worlds apart in how well it scales images. And my two-year-old Aquos certainly scales SD to HD significantly better than my friend's brand-new Vizio.

Scalers in a monitor aren't expected to be that great because the video adapter is supposed to output the correct resolution. Modern consoles output mostly 720p, and scaling 720p is trivial for any 1080p HDTV, even a crappy Vizio. There is quite a bit of variation in scaling 480p to 1080p, but for all intents and purposes it doesn't really matter. Why would anyone with an HDTV pick 480p?
 
Scalers in a monitor aren't expected to be that great because the video adapter is supposed to output the correct resolution. Modern consoles output mostly 720p, and scaling 720p is trivial for any 1080p HDTV, even a crappy Vizio. There is quite a bit of variation in scaling 480p to 1080p, but for all intents and purposes it doesn't really matter. Why would anyone with an HDTV pick 480p?

There's still a fairly noticeable difference between the scaling of my two-year-old Aquos and my friend's months-old Vizio, even with 720p. It's certainly less noticeable than with SD sources, however; I'll grant you that.

And as I mentioned, the quality of scalers is slowly migrating down from high-end sets to more consumer-friendly mid-range sets.

As for monitors, that all depends on whether there's a need (or desire) to run legacy applications. Likewise when you get to something the size of a 30" monitor: it's nice to be able to run a game at 1680x1050 and have it look indistinguishable from a 2560x1600 source, other than the jaggies if you don't have AA enabled.

Regards,
SB
 
No, you won't find any real info. What you'll find is that at some point people believed (H)ANA was the dedicated scaler.

Now that we know Xenos is doing the scaling, we still don't know how much of Xenos's hardware resources are shared there. I'd consider it bad engineering if Xenos had a completely dedicated scaler (one that shares nothing with the software side of things), so I find it unlikely.
 
No, you won't find any real info. What you'll find is that at some point people believed (H)ANA was the dedicated scaler.

Now that we know Xenos is doing the scaling, we still don't know how much of Xenos's hardware resources are shared there. I'd consider it bad engineering if Xenos had a completely dedicated scaler (one that shares nothing with the software side of things), so I find it unlikely.

What, like UVD on current Radeons, which does nothing except accelerate video?

If MS put in provisions for hardware scaling, then chances are Xenos has hardware scaling.

What is odd is that G70/G71 has the ability to scale video and RSX is based off those, yet RSX isn't capable of full scaling for some reason. Did Nvidia remove it from RSX in order to reduce size and cost?

And it isn't as if ATI is unfamiliar with this sort of thing, considering in the past they made consumer electronics chips capable of HW scaling, and in fact included it in consumer video processing chips.

Regards,
SB
 
What, like UVD on current Radeons, which does nothing except accelerate video?

If MS put in provisions for hardware scaling, then chances are Xenos has hardware scaling.

What is odd is that G70/G71 has the ability to scale video and RSX is based off those, yet RSX isn't capable of full scaling for some reason. Did Nvidia remove it from RSX in order to reduce size and cost?

And it isn't as if ATI is unfamiliar with this sort of thing, considering in the past they made consumer electronics chips capable of HW scaling, and in fact included it in consumer video processing chips.

Regards,
SB

Software scaling is very cheap computationally, though, unlike decoding (which still does not make it good engineering, mind you). In fact, the main reason PS3 games don't do software scaling is the memory cost, which would be irrelevant if your hardware had a few (probably 3 or 4) lines of buffer before the final output.

There is also the case of Capcom's Framework, which implies Xenos's 1080p scaling is not completely transparent to native rendering. Of course there may be other reasons, but evidence is evidence. ;)
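The line-buffer point is easy to demonstrate. Here's a hypothetical software model (a sketch, not how any console or TV actually implements it) of a streaming bilinear upscaler: because the vertical filter only looks at two adjacent source rows, you never need to hold more than two scanlines in memory, rather than a whole intermediate framebuffer. A wider filter like bicubic or Lanczos-3 would simply need 4 or 6 buffered lines instead of 2.

```python
from collections import deque

def stream_upscale(rows, src_w, src_h, dst_w, dst_h):
    """Bilinear upscale that streams one output line at a time while
    holding only two source scanlines in memory (the deque below),
    instead of a full intermediate framebuffer. Assumes dst_w, dst_h >= 2."""

    def hscale(line):
        # Horizontal linear interpolation of one scanline to dst_w samples.
        out = []
        for x in range(dst_w):
            fx = x * (src_w - 1) / (dst_w - 1)
            x0 = int(fx)
            x1 = min(x0 + 1, src_w - 1)
            t = fx - x0
            out.append(line[x0] * (1 - t) + line[x1] * t)
        return out

    src = iter(rows)
    buf = deque(maxlen=2)            # the "line buffer": at most 2 scanlines
    buf.append(hscale(next(src)))
    pulled = 1                       # number of source rows consumed so far

    result = []
    for y in range(dst_h):
        fy = y * (src_h - 1) / (dst_h - 1)
        y0 = int(fy)
        y1 = min(y0 + 1, src_h - 1)
        while pulled <= y1:          # pull rows until y0 and y1 are buffered
            buf.append(hscale(next(src)))
            pulled += 1
        t = fy - y0
        row_a = buf[-2] if y1 > y0 else buf[-1]
        row_b = buf[-1]
        result.append([a * (1 - t) + b * t for a, b in zip(row_a, row_b)])
    return result

# Upscale a 2x2 gradient to 3x3.
out = stream_upscale([[0, 3], [6, 9]], 2, 2, 3, 3)
print(out)  # [[0.0, 1.5, 3.0], [3.0, 4.5, 6.0], [6.0, 7.5, 9.0]]
```

Since source rows only ever arrive once and are discarded as soon as the filter window moves past them, this is exactly the pattern a few-line hardware buffer before the video output can support.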
 