Xbox Upscaler... Any Details and Information?

The blur that occurs in most scalers when you scale an image to a higher resolution.
Not really. Only low-quality scalers would fail and add noticeable blur when scaling 720p or higher to 1080p. The problem with the old scaler was that it added excessive (artificial) sharpness, which caused a lot of (IMO) unpleasant artifacts, and ultimately they decided to go with a more neutral scaler due to complaints from both customers and developers. The blur (for the most part) is simply because it's relatively lower res than native 1080p. I don't care how good the scaler is; you can't make sub-1080p look as good as native.

I have been pretty vocal about the XB1's scaler since the first comparison screens were posted (DF's BF4 pre-release comparison). IMHO, it's a good thing that they removed the sharpening. Apparently, DICE, Crytek and Respawn agree.
 
Last edited by a moderator:
the "blur" might come from bilinear filter, if it use bicubic or lanczos it'll look better, though the ops is more costly.
 
the "blur" might come from bilinear filter, if it use bicubic or lanczos it'll look better, though the ops is more costly.
Their scaler is supposed to be more advanced than the X360's scaler (which used some form of Lanczos, IIRC), and Respawn supposedly worked with MS to improve it, so I doubt that meant switching to a low-quality scaler like bilinear. Unless AMD's native scaler uses bilinear and they decided to fall back on that. But again, I doubt it (although, pending a comparison, there's a good possibility that I would prefer even bilinear scaling to what the old scaler looked like).
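For anyone curious about the practical difference between those filters, here's a minimal sketch using Python's Pillow library to upscale the same 720p capture three ways (the filenames are placeholders, and this is generic image resampling, not the XB1's actual scaler):

```python
# Minimal comparison of common upscaling filters using Pillow.
# "frame_720p.png" is a placeholder for any 1280x720 capture.
from PIL import Image

src = Image.open("frame_720p.png")
target = (1920, 1080)

# Bilinear: cheapest of the three, tends to look soft/blurry.
src.resize(target, Image.BILINEAR).save("out_bilinear.png")

# Bicubic: moderately more expensive, noticeably crisper.
src.resize(target, Image.BICUBIC).save("out_bicubic.png")

# Lanczos: the most expensive here; sharpest result, but can
# introduce slight ringing near high-contrast edges.
src.resize(target, Image.LANCZOS).save("out_lanczos.png")
```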
 
Last edited by a moderator:
The blur (for the most part) is simply because it's relatively lower res than native 1080p.
Right, blurry compared with native imagery, which is what Rockster is presumably complaining about. It may not be introducing a significant lack of detail relative to the original image, but it's still a form of blurriness for a user of a high-resolution screen.

When I said "most scalers", I wasn't referring so much to overall quality, but rather to the simple fact that you don't usually have much choice besides balancing blurriness, ringing, and pattern artifacts (usually blocking, e.g. with nearest-neighbor scaling). "Most scalers" err reasonably far in the blurring direction.

I'm not one to defend the original XB1 scaler, but it's really not inconceivable that some people would prefer some sharpening artifacts like ringing to the raw blur. Preferably that would be handled in TV settings so that developers and anti-sharpening gamers aren't having to bash their heads into their desks over it, but if someone's TV doesn't have a very well-suited sharpening process (e.g. the filter is too wide or too skinny), they do have a reason for complaint (though of course, there's the question of where that complaint ought to be directed :smile:).
 
Right, blurry compared with native imagery, which is what Rockster is presumably complaining about. It may not be introducing a significant lack of detail relative to the original image, but it's still a form of blurriness for a user of a high-resolution screen.
What I meant was that 720p native will always look blurrier than 1080p native, regardless of how good the scaler is. Unless artificial sharpness is applied, but there's no way of doing that without introducing artifacts. Rockster is complaining about the blurry image relative to the old scaler, and about the fact that turning up the sharpness / enabling edge enhancement on his TV would affect 1080p-native inputs as well (which he doesn't want), and his TV doesn't do as good a job of artificial sharpening as the old XB1 scaler did. If he's claiming that there's noticeable blur being added by scaling, then there's no way to prove that. The only way would be to compare a 720p image scaled to 1080p by the XB1's scaler on a 1080p native display against a 720p native output on a 720p native display, with all other things being equal (screen size, processing, etc.).


When I said "most scalers", I wasn't referring so much to overall quality, but rather to the simple fact that you don't usually have much choice besides balancing blurriness, ringing, and pattern artifacts (usually blocking, e.g. with nearest-neighbor scaling). "Most scalers" err reasonably far in the blurring direction.
"Most scalers" are good enough to scale 720p to 1080p without noticeable blur. As taisui said, even something as simple as Bicubic would be sufficient. Unless the XB1's scaler is now worse than the X360's, then there shouldn't be any noticeable blur being added from scaling.

I'm not one to defend the original XB1 scaler, but it's really not inconceivable that some people would prefer some sharpening artifacts like ringing to the raw blur. Preferably that would be handled in TV settings so that developers and anti-sharpening gamers aren't having to bash their heads into their desks over it, but if someone's TV doesn't have a very well-suited sharpening process (e.g. the filter is too wide or too skinny), they do have a reason for complaint (though of course, there's the question of where that complaint ought to be directed :smile:).
I'm not saying that people don't prefer the old scaler. I'm simply saying that there's probably not any added blur; it's simply blurrier because it's lower res than native 1080p.
 
Last edited by a moderator:
My comment was not that the current scaling implementation is introducing "additional" blur; as you correctly point out, the lack of detail is simply a function of resolution. However, the previous implementation created more apparent detail by adding contrast around detected edges, providing the "illusion" of greater resolution. This did have the side effect of aliasing/moire in certain areas, but IMO that was fairly minimal and not really an issue in practice. And that's why, in early COD/BF4 comparisons for instance, people felt the Xbox had better, more detailed textures. But, it is what it is. Such is life.
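To illustrate the kind of edge-contrast boost being described, here's a rough sketch using a generic unsharp mask in Pillow as a stand-in (this is not Microsoft's actual algorithm, and the filename and parameters are just illustrative):

```python
# Generic unsharp mask as a stand-in for edge-contrast sharpening.
# Not the XB1's actual algorithm; filename/parameters are examples.
from PIL import Image, ImageFilter

img = Image.open("scaled_1080p.png")

# radius controls how wide the halo around edges is; percent is the
# strength of the contrast boost. Push these up and you get more
# "apparent detail", but also the white halos / ringing and the
# moire-like shimmer on fine patterns discussed above.
sharpened = img.filter(ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3))
sharpened.save("sharpened_1080p.png")
```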
 
And that's why, in early COD/BF4 comparisons for instance, people felt the Xbox had better, more detailed textures. But, it is what it is.

Or they were just viewing the information through the lens of their bias; it's not like it was a blind test. The Xbox fans chose the Xbox version and rationalized it after the fact. They did it with the sharpness, and they do it now with the black crush.
 
"Most scalers" are good enough to scale 720p to 1080p without noticeable blur.
it's simply blurrier because it's lower res than native 1080p.
I don't think we disagree with each other. Nobody is claiming that the scaler is removing a significant amount of information relative to the original image, which seems to be the point you're hung up on.

Or they were just viewing the information through the lens of their bias; it's not like it was a blind test. The Xbox fans chose the Xbox version and rationalized it after the fact. They did it with the sharpness, and they do it now with the black crush.
Fan bias might be part of the equation, but those things do make an image look more "vivid." When people judge similar images at a glance, "best" often takes the form of "what's most immediately striking."

In fact, I personally wouldn't be surprised if a place like NeoGAF is actually more skewed toward "at a glance" preferring the PS4 images than the general population is, since the people over there have just enjoyed several months of "sharpening and blowing out an image at the source is bad" training. Show the Thief comparisons to entirely random people, and you'd probably get a moderate number of "most of these images look washed out" sorts of responses.
 
My comment was not that the current scaling implementation is introducing "additional" blur; as you correctly point out, the lack of detail is simply a function of resolution. However, the previous implementation created more apparent detail by adding contrast around detected edges, providing the "illusion" of greater resolution. This did have the side effect of aliasing/moire in certain areas, but IMO that was fairly minimal and not really an issue in practice. And that's why, in early COD/BF4 comparisons for instance, people felt the Xbox had better, more detailed textures. But, it is what it is. Such is life.
I wonder if you tried Assassin's Creed IV on the Xbox One, because the moire effect was pronounced to the point of being motion-sickness inducing. When moving around the map or rotating the camera, the foliage and the leaves of the trees sent what looked like flashes of light directly into your brain; it was so disturbing. There were white dots everywhere.

CoD was not so bad, but I didn't get far in the game. I agree with djskribbles, and he truly knows his stuff.

Aside from that, sharpening the image must be done in a sensible way. Sharpness adds a LOT of artifacts, especially white halos around edges, and it adds information that didn't exist in the first place. :???: The effect is very noticeable in games. I tested this with my now-calibrated TV vs. non-calibrated settings with sharpness on.

In a game like Powerstar Golf, for instance, the portraits of the players before starting an event had a white halo around the edge; it looked so bad!

Then for games in general, the sparkling skin surfaces of the characters produce such a strong bloom with sharpness on that it becomes annoying and unnatural.

I can tell you that much because I obsess over those things, and it took me months to calibrate the TV to the point of being happy with it.

edit: I used Word so the font looks larger than usual
 
Last edited by a moderator:
Cyan, I appreciate where you are coming from, but believe me, I obsess more than anyone. I don't buy TVs without access to the service menu, and I have them ISF calibrated. I know everyone's environment is different, so we simply have a difference of opinion. I even mentioned from the outset that I'm likely in the minority. Just be careful that your argument doesn't imply ignorance on my part. I would hate to feel slighted. :)
 
Cyan, I appreciate where you are coming from, but believe me, I obsess more than anyone. I don't buy TVs without access to the service menu, and I have them ISF calibrated. I know everyone's environment is different, so we simply have a difference of opinion. I even mentioned from the outset that I'm likely in the minority. Just be careful that your argument doesn't imply ignorance on my part. I would hate to feel slighted. :)

When I got my (still loved) Mario Sunshine on my PAL RGB GameCube, I opened the back of my good old CRT TV and, while the TV was on with Mario on screen, adjusted all the "hardware" settings (screws) I could find to get the sharpest, most vivid image possible.

I wouldn't recommend it to anyone, though; it's a very dangerous thing to do while the TV is on. And those CRT TVs were power hungry!

I think I am more obsessed than you two about image fidelity! :cool:
 
When I got my (still loved) Mario Sunshine on my PAL RGB GameCube, I opened the back of my good old CRT TV and, while the TV was on with Mario on screen, adjusted all the "hardware" settings (screws) I could find to get the sharpest, most vivid image possible.

I wouldn't recommend it to anyone, though; it's a very dangerous thing to do while the TV is on. And those CRT TVs were power hungry!

I think I am more obsessed than you two about image fidelity! :cool:

Yeah, I shorted a CRT monitor a long time ago, a real long time ago.
 
Cyan, I appreciate where you are coming from, but believe me, I obsess more than anyone. I don't buy TVs without access to the service menu, and I have them ISF calibrated. I know everyone's environment is different, so we simply have a difference of opinion. I even mentioned from the outset that I'm likely in the minority. Just be careful that your argument doesn't imply ignorance on my part. I would hate to feel slighted. :)
Well, it wasn't my intention at all to initiate a downward spiral of comments on how smart/ignorant people's choices are. In fact, reading your post makes me feel that I don't obsess enough over those things --Globalisateur is up there on the list, too, heh heh. Being happy with your choices, or finally knowing what you prefer, seems to be a lifelong quest.

I was speaking from personal experience, and my migraines don't help with effects that enhance the more luminous areas of an image. I noticed those effects. As I said, CoD was bearable; AC IV, not so much.

Sharpness can be okay at times, and I kinda liked an effect on my TV called Advanced Sharpness, which doesn't introduce many artifacts; it just enhances the textures without the rest of the (IMO) negative effects.

That was palatable, but yeah, sharpness is not for me overall.

Over time I developed this theory that Microsoft wanted to enhance the sharpness on the Xbox One 'cos Xbox 360 games looked better, sharper, than the competition.

On the X360, though, it was natural. People said there weren't differences in some games where they blatantly existed.

In fact, to me, 95% of the multiplatform games that seemed equal or very similar looked crisper on the X360, resolution aside.

I think the key was that the textures were better on the X360, even if just a notch, because it had more free video memory. So, to make myself understood, let's say PS3 textures were on low-to-medium settings and the X360's were on medium.
 
Hi everyone, first post here at B3D. Long-time reader and interested in the topic; I've ported an indie game to PSN@Home and done a little bit of Steam integration work for indies, but I never really had anything to contribute to the hardware discussion. Most of you guys are way above my level; aside from understanding what you're talking about to a degree, I've never invested enough time to really know what's happening.

Anyway,

The X1 scaler is known to cause "crushed blacks", as written by many; it's definitely darker when you compare footage, etc.

Someone recently posted this picture comparing OpenGL vs DX11

[image: OpenGL vs DX11 comparison screenshot]


And I couldn't help but notice some correlation in how much darker DX11 was, as this is running off a GeForce 660 (apparently). I guess I should do some primary research and take some screen caps of my own since I also own the same card, but could anyone with DX11 experience provide any insight: is this a pattern, or intentional?

tl;dr: could DX11 be the cause of crushed blacks on the X1?
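If anyone else wants to do that primary research, a rough way to quantify crushed blacks between two captures of the same scene might be to count near-black pixels (a sketch; the filenames are hypothetical):

```python
# Rough check for crushed blacks: compare the fraction of near-black
# pixels in two captures of the same scene. Filenames are placeholders.
import numpy as np
from PIL import Image

def near_black_fraction(path, threshold=16):
    # Convert to 8-bit grayscale and count pixels at or below threshold.
    luma = np.asarray(Image.open(path).convert("L"))
    return float(np.mean(luma <= threshold))

print("OpenGL:", near_black_fraction("opengl_capture.png"))
print("DX11:  ", near_black_fraction("dx11_capture.png"))
# A much larger fraction on one side suggests shadow detail is being
# clipped to black rather than merely rendered darker.
```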
 
On a different note -

RE: Sunset Overdrive black crush
http://www.neogaf.com/forum/showpost.php?p=136643179&postcount=591
ok guys, I've chased this down and around and have a solid understanding. It may surprise some of you that I'm actually pretty into display calibration, but there are so many different pieces that I wanted to make sure I had it all right before I gave the details.

Firstly - we DO have an issue with our gamma correction slider/menu in the game. So if you set it based on the logo just barely appearing at the start of the game, it's wrong, and it'll be way too dark. So don't do that, we're going to patch it ASAP. That's on our end. You may just want to set it to default (middle) or slightly higher if you use a PC Monitor as your display.

That said, we're using the HDTV standard called rec-709 for color space display. That standard uses a gamma curve that's darker, especially in very dark colors, and expects colors to be in the limited range (i.e. 16-235).

We use that standard by default, and it's the XB1 default; it is the range encouraged by MSFT, since HDTVs are what most people will be playing on.

What this means that if you're watching on an HDTV, and you've calibrated that TV with the XB1 calibration utility (or any blu-ray calibration disc in your XB1), Sunset Overdrive will look the way we intended the game to look.

Of course, if you're playing on a PC monitor or your TV was calibrated to give a PC-like response or is expecting the sRGB/PC standard (Full), then Sunset Overdrive will look darker than intended. For folks like this, we have gamma correction in the game, but it's not working quite right, as noted above (so again, use default or maybe a notch or two above).

Also - I know some of you noted something about earlier games having had issues. A couple of gamma issues were corrected earlier on in the XDK / upscaler etc. Regardless, those issues shouldn't affect us, as we don't use the hardware upscaler: SO renders at 900p early in our rendering pipeline, then it's switched to 1080p later in the pipeline, and that's how we output.

hope that helps.
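For anyone unfamiliar with the limited/full range distinction the dev mentions, the mapping between the two is a simple linear rescale of the 8-bit code values; here's a sketch (my own illustration, not anything from the XDK):

```python
# Limited ("video", 16-235) vs. full ("PC", 0-255) range, 8-bit.
# If a display expects full range but gets limited-range values
# without expansion, everything at or below code 16 collapses to
# black -- one common cause of "black crush".
import numpy as np

def limited_to_full(v):
    # Expand 16-235 to 0-255, clamping out-of-range values.
    v = np.asarray(v, dtype=np.float64)
    return np.clip((v - 16.0) * 255.0 / 219.0, 0, 255).round().astype(np.uint8)

def full_to_limited(v):
    # Compress 0-255 into 16-235.
    v = np.asarray(v, dtype=np.float64)
    return np.clip(v * 219.0 / 255.0 + 16.0, 0, 255).round().astype(np.uint8)

print(limited_to_full([16, 125, 235]))  # -> [  0 127 255]
```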
 
Interesting. The gamma slider in that game was obviously totally broken. Good to know the way I'm playing now should be correct, because I left the slider at the default.
 
Yeah, it was a little confusing. I couldn't get any logo to show at all on my TV, even turning my TV's brightness and contrast up full blast. It's also interesting to find out that, like Ryse, they don't use the hardware upscaler.
 
On a different note -

RE: Sunset Overdrive black crush
http://www.neogaf.com/forum/showpost.php?p=136643179&postcount=591
Thankfully, I've never experienced black crush on my TV with the Xbox One (sadly, it was pretty common for me in my X360 days).

I always use Standard RGB and it never falters, 'cos, as I've explained in a thread I created about my TV, I am not even sure the TV accepts full RGB despite being 3D, 36-bit colour, and modern and all.

It took me a lot of time, but it's been months since I last touched the settings; the TV is now fully calibrated and images look so lovely.
 
Interesting. The gamma slider in that game was obviously totally broken. Good to know the way I'm playing now should be correct, because I left the slider at the default.
I've never known a gamma slider to work! I just ignore the instructions and slide it so I can see the darks, which black out otherwise.
 
I've never known a gamma slider to work! I just ignore the instructions and slide it so I can see the darks, which black out otherwise.
Gamma sliders seem to work for me, but whether what I see on my TV is what the developers want me to see is another question.
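For what it's worth, a typical in-game gamma slider boils down to a per-channel power curve applied to normalized color values; a minimal sketch (not any particular game's implementation):

```python
# Typical gamma-slider math: a power curve on normalized values.
# Not any particular game's implementation.
def apply_gamma(value, gamma):
    """value in [0.0, 1.0]; gamma > 1.0 lifts shadows and midtones."""
    return value ** (1.0 / gamma)

print(apply_gamma(0.1, 1.0))  # 0.1 (unchanged)
print(apply_gamma(0.1, 2.2))  # ~0.35 -- dark detail becomes visible
```

Whether that adjusted output then survives the TV's own processing intact is exactly the open question.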

Displays on devices are generally well calibrated these days. My 2010 and 2014 Sony Bravia TVs both needed only very minor tweaks, and Apple devices - iPhones, iPads, iMacs - are all incredibly well calibrated according to Anandtech reviews going back several years. Likewise, Google Nexus devices and LG phones are also well calibrated, as, I suspect, are many mid-range to high-end mobile devices. AMOLED also seems oversaturated, but I think that's a bias of that particular technology, and it can be compensated for.

So things should be getting better! I would frankly much prefer to play with a gamma slider than mess around with the brightness and contrast on my TV. Sacrilege! :yep2:
 