[B3D Article] "Ripping off the veil: The mysterious PS3 hardware scaler exposed"

Anyway, the big thing to take away is that 12.5% more pixels can degrade image quality ...
Yes, and in some cases it will improve the image as seen by Thowllly's AA images. (edit: not that obvious after that last post)
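(If my arithmetic is right, the 12.5% is just the pixel-count ratio: 960 × 1080 = 1,036,800 pixels versus 1280 × 720 = 921,600 pixels, and 1,036,800 / 921,600 = 1.125.)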

It would be interesting to see how the images would compare if you only added AA in the horizontal direction. Could these scaling requirements of Sony's be a means to reduce the need for AA in the vertical direction, and thereby reduce the need for bandwidth and memory somewhat?
 
Now the "1280x720" image is clearly better on the majority of the edges, as it should be :)
That's actually a bit surprising, because I figured 4xAA would help to level the playing field.

Thanks for your work! BTW, you don't have to put those resolutions in quotes, because every comparison pic here is a crop anyway. So long as you did the renders in the right ratios, which you did.
 
Yes, and in some cases it will improve the image as seen by Thowllly's AA images.
In his squished pics, yes. In his normal ones, no. The reason is that even near 45 deg angles look better with a square pixel.

It would be interesting to see how the images would compare if you only added AA in the horizontal direction. Could these scaling requirements of Sony's be a means to reduce the need for AA in the vertical direction, and thereby reduce the need for bandwidth somewhat?
I think all you could do is change the AA sample locations, and AFAIK rotated grid is pretty close to optimal regardless of pixel aspect ratio. A near vertical edge will look worse at 960x1080 w/ 2xAA than at 960x540 with 4xAA, so I don't think you can reduce the AA level without consequences.
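(If I've counted right, the total sample budget is actually identical in those two cases: 960 × 1080 × 2 = 960 × 540 × 4 = 2,073,600 samples per frame, so it's purely a question of how the samples are distributed, not how many you spend.)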

Anyway, like I said before this isn't a terrible solution for Sony and does what it has to. It gives 1080i-only owners a setting that will look a heck of a lot better than 480p, and devs don't need to do a manual framebuffer scale to implement it.
 
Yup, I did, but my own analysis of the original pics tells me the reduced detail is from a reduced mipmap.
Yeah, that might be so, but that's in addition to the unnecessary vertical blur.
I also see a much bigger difference between your third and fourth images than you do.
I don't know what you see in those pictures, but yeah, there is a big difference between image 3 and 4. Image 4 has even more blur!
If there is any vertical blurring, it is extremely subtle (much less than you did in the third pic) and not enough to explain the reduced texture quality.
Much less!? There is much more blur! Let me demonstrate, I have taken the image from #168, but isolated out one column from every picture, just to only focus on the vertical blur.
(attached image: vblurvs4.png)

Column 1: From unscaled picture, no vertical blur.
Column 2: From horizontally scaled picture, no vertical blur.
Column 3: From horizontally scaled picture with a small vertical blur filter added, some vertical blur.
Column 4: From original scaled picture, huge amount of vertical blur.

And this is on polygon edges, where no blurring should be if it were only texture blurring.
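If anyone wants to redo this kind of check on their own crops, something like the following Python/PIL snippet would do the same column isolation; the filenames and the column position are just placeholders, not the actual files from the comparison.

```python
# Rough sketch: pull the same one-pixel-wide column out of several screenshots
# and paste them side by side, so only vertical blur remains visible.
# Filenames and column_x are placeholders.
from PIL import Image

files = ["unscaled.png", "hscaled.png", "hscaled_vblur.png", "scaled_original.png"]
column_x = 400                      # x coordinate of the column to extract
gap = 8                             # white pixels between the pasted columns

images = [Image.open(f).convert("RGB") for f in files]
height = min(img.height for img in images)

strip = Image.new("RGB", (len(images) * (1 + gap), height), "white")
for i, img in enumerate(images):
    column = img.crop((column_x, 0, column_x + 1, height))  # 1-pixel-wide crop
    strip.paste(column, (i * (1 + gap), 0))

strip.save("column_comparison.png")
```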
Reduced mipmap level also explains why the 720p pic has muddy highlights.
The 720 picture also had this unnecessary vertical blur applied.
Doing the same point sampling downscale from 1080p gives 720p more artificial texture detail also, so I think the comparison between the old 960x1080 and 1280x720 pics was still fair.
Yeah, that I totally agree on, I never said the 960x1080 vs. 1280x720 comparison wasn't fair, only that something was wrong with the upscaling of the pictures. And as I've already said, I noticed this first on the 960x1080 picture, since it shouldn't have any vertical blur at all.
Anyway, it doesn't matter. We have new source images from macabre. I've attached another comparison in this post. It's not a tremendous difference, but you see vertical edge aliasing standing out and textures being blurrier, and the better horizontal edges are little consolation.

The lesson for the day is more pixels aren't always better.
Yup, agreed.

Edit: I've said it already, but I just want to clarify. The reason I criticised those original scaled images was not because they gave an unfair advantage to the 1280x720 picture over the 960x1080, because they didn't. My whole point the entire time has only been that those images made the scaled pictures look worse than they should, making it look like the leap to full 1080p resolution is larger than it really is.
 
That's actually a bit surprising, because I figured 4xAA would help to level the playing field.

Thanks for your work! BTW, you don't have to put those resolutions in quotes, because every comparison pic here is a crop anyway. So long as you did the renders in the right ratios, which you did.
Oh, now that you're so nice and saying thank you, I feel a bit bad about maybe being a bit harsh when discussing macabre's pictures... :oops: I still think I'm right though :p
 
Let me demonstrate, I have taken the image from #168, but isolated out one column from every picture, just to only focus on the vertical blur.
You picked a very misleading column, because the gradients you see are from horizontal blur of the adjacent column.

Look at the attachment. These are crops from the original pic where horizontal blurring isn't a factor. The arrows show you how small the weighting is from adjacent rows.
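Just to make the "blur from the adjacent column" point concrete: assuming a plain bilinear horizontal upscale from 960 to 1920 (the actual filter in the scaler may well differ), every output column is already a 75/25 blend of two neighbouring source columns, which is enough to put gradients into an isolated column without any vertical filtering at all. A quick sketch of the weights:

```python
# Bilinear 960 -> 1920 horizontal upscale (assumed filter): print which two
# source columns feed each output column, and with what weights.
import math

src_w, dst_w = 960, 1920

for dst_x in range(100, 106):
    src_x = (dst_x + 0.5) * src_w / dst_w - 0.5   # output pixel centre in source coords
    left = math.floor(src_x)
    frac = src_x - left
    print(f"out[{dst_x}] = {1 - frac:.2f} * src[{left}] + {frac:.2f} * src[{left + 1}]")
```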

Regardless of what made the original pics like that, let's move on, especially since we both think the comparison is still fair. We both have our beliefs, and we have new examples too.
 

Attachments

  • scaling2.jpg
Anyway, like I said before this isn't a terrible solution for Sony and does what it has to. It gives 1080i-only owners a setting that will look a heck of a lot better than 480p, and devs don't need to do a manual framebuffer scale to implement it.
Does this mean: if you have a 1080p TV, you should rather pick the 720p output and let your TV do the scaling than use the 1080p output of a game that scales a 960x1080 image to generate the 1080p output?

I wonder how Sony will communicate that?

I wonder if we are missing something here?
 
I think that Sony should force developers to include in-game resolution selection, because some games support 1080p but at a very bad framerate. Having a per-game profile with a resolution option in the XMB would be another option.
 
Does this mean: if you have a 1080p TV, you should rather pick the 720p output and let your TV do the scaling than use the 1080p output of a game that scales a 960x1080 image to generate the 1080p output?

I wonder how Sony will communicate that?

I wonder if we are missing something here?
Yes, I think scaling up 1280x720 will look a bit better most of the time. But remember, there are HDTVs (1080i ones) that don't support 720p at all; for them, 960x1080 will be a huge improvement over 480p.

But the game can potentially do a much better job of scaling the image in software than the TV ever could (going off-topic again, since this is about hardware scaling, but I want to mention it). When scaling an image on the console you can use the extra information present in the unresolved multisample buffer and the z-buffer to do an improved "intelligent" scaling.
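To illustrate what I mean, here's a toy sketch in Python/NumPy (purely an illustration of the idea, nothing to do with how RSX actually lays out its buffers, and the assumption that the two samples per pixel sit half a pixel apart horizontally is mine): instead of resolving the multisample buffer down to 960 columns and then stretching it back out, the individual samples could be placed straight into the 1920-wide output.

```python
import numpy as np

# Stand-in for an unresolved 2x multisample buffer of a 960x1080 render:
# two colour samples per pixel, assumed to sit half a pixel apart horizontally.
h, w, samples = 1080, 960, 2
msaa = np.random.rand(h, w, samples, 3)

# "Dumb" path: resolve (average the samples) down to 960 columns,
# then stretch back to 1920 with a nearest-neighbour repeat.
resolved = msaa.mean(axis=2)                   # shape (1080, 960, 3)
stretched = np.repeat(resolved, 2, axis=1)     # shape (1080, 1920, 3)

# "Intelligent" path: interleave the raw samples so each one gets its own
# output column, keeping the horizontal detail that the resolve throws away.
interleaved = msaa.reshape(h, w * samples, 3)  # shape (1080, 1920, 3)
```

Bring the z-buffer into it and you could in principle avoid blending across depth discontinuities too, but that goes beyond a toy example.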
 
Yes, I think scaling up 1280x720 will look a bit better most of the time. But remember, there are HDTVs (1080i ones) that don't support 720p at all; for them, 960x1080 will be a huge improvement over 480p.
Maybe this scaling from 960x1080 could be limited to 1080i output only, and not allowed for 1080p output? Isn't 1080i considered lower quality than 720p anyhow?

To be allowed to check the 1080p checkbox, Sony may require you to scale from 1280x1080, 1440x1080 or 1600x1080?
I don't think Deathkiller's idea of having a lot of setting possibilities in every game sounds very likely.

But the game can potentially do a much better job of scaling the image in software than the TV ever could (going off-topic again, since this is about hardware scaling, but I want to mention it). When scaling an image on the console you can use the extra information present in the unresolved multisample buffer and the z-buffer to do an improved "intelligent" scaling.
The whole concept of using dumb scalers to "recreate" a lot of information that once actually was present in the unresolved framebuffer seems pretty stupid overall, IMO.
 
That was more on the wishful thinking side of things; I seriously doubt that Sony will care about 960x1080 scaled being worse than 1280x720 scaled on 1080p panels. They will either not allow scaling for 1080p output (reserving it only for 1080i), or allow any of the supported scaling resolutions to be used for 1080p (very likely).


In the end, every HDTV owner will have to play with the XMB resolution settings to get the best PS3 output depending on the game and media.
 