Digital Foundry Article Technical Discussion Archive [2013]

Will have to wait for that. As of now, this is the best we've got.
Not really. It's a worthless comparison that doesn't give any indication of what 900p will look like on XB1/PS4 compared to 1080p. You wouldn't look at a trilinear upscaled image to determine what Sony versus Samsung upscaling algorithms are like; my TV does a superb job of upscaling SD broadcasts, which is completely different from a simple upscale. Ergo it should just be ignored rather than taken as early proof of anything.
 
All of 'em! They all have the same issue - you can't use the upscaled comparisons as an indicator of what the final game experience will be. XB1 rendering any game at 900p or 720p and upscaling it to fill your 1080p display will look better than taking a 900p or 720p render, upscaling it to 1080p via the simplest upscaler as used in the DF article, and displaying that.
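To make the gap concrete, here's a minimal sketch using Pillow in Python. The filename and resolutions are assumptions, not DF's actual assets; the BILINEAR resize stands in for the article's simple upscale, while LANCZOS stands in for the kind of higher-quality resampling a decent scaler would do.

```python
# Minimal sketch of cheap vs. good upscaling (Pillow).
# "bf4_900p.png" is a hypothetical 1600x900 native capture.
from PIL import Image

src = Image.open("bf4_900p.png")

# Roughly what the DF comparison shows: simple interpolation to 1080p.
cheap = src.resize((1920, 1080), Image.BILINEAR)

# A higher-quality resampler, closer to what a good scaler would produce.
good = src.resize((1920, 1080), Image.LANCZOS)

cheap.save("bf4_1080p_simple.png")
good.save("bf4_1080p_lanczos.png")
```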
So you're saying the upscaling algorithm used in, say, Photoshop (most likely), which uses a powerful CPU and takes milliseconds to complete, is going to be worse than the hardware scaler that has to run at 60fps.

i.e. you've got it backwards: the upscaling in Photoshop (if used correctly) will look better than what the hardware scaler does. It's only logical.
 
So you're saying the upscaling algorithm used in, say, Photoshop (most likely), which uses a powerful CPU and takes milliseconds to complete, is going to be worse than the hardware scaler that has to run at 60fps.

i.e. you've got it backwards: the upscaling in Photoshop (if used correctly) will look better than what the hardware scaler does. It's only logical.

So you don't know what upscaling algorithm is used for the previous images... so it's hard to draw conclusions one way or the other. He's saying the simulated images don't represent the actual quality because they simply aren't representative.
 
1080p is the native resolution of most HD TVs; it would be nice to skip scaling and have a nice 1:1 pixel mapping. PC gamers like myself have had it for years. I never play anything at sub-1080p on my PC.

I do my PC gaming almost exclusively on a 42" (1080p) plasma from about 8' away. I use all kinds of different combinations of resolutions & settings depending on the game and the FPS & IQ target I want to hit. I don't understand why any PC gamer would choose to ignore the options available to them to customize a game's visuals to their exact preferences.
 
Well, plasmas are better at displaying different resolutions than LCDs.

That's the first I've heard of that.

It's irrelevant to this discussion, though. My graphics card is set to scale everything to the native resolution of the display (in this case 1080p) so that's all the plasma ever sees.
 
So you're saying the upscaling algorithm used in, say, Photoshop (most likely), which uses a powerful CPU and takes milliseconds to complete, is going to be worse than the hardware scaler that has to run at 60fps.
No, I'm saying the hardware upscaler is much better than the simple flash-based trilinear upscale being used in the DF article. I'm talking only about the DF comparisons Delta9 is using as a reference for how sub-1080p will look next-gen. You're right that other upscalers could produce better-than-console results when comparing images, but we were talking about a specific subset of comparison images. Also, the highest-quality upscale in Photoshop, taking five seconds to compute, will be far closer to the final console upscale than the web-based super-cheap upscale, such that viewing a 900p image upscaled in Photoshop will give a better representation of the game experience. The quality will be superior to the console output*, but the delta between comparison and output will be far smaller.

If someone wants to post high-quality upscaled images for reference (take the 720p and 900p sources from the DF article), I'll happily endorse them as probably representative of how the games will appear (although of course the UI will be upscaled too, whereas the UI will be native on consoles, so we'd have to ignore that).

*That really does depend on the algorithm, though. Just because a Photoshop filter can take its time doesn't mean it's intrinsically a better filter, as hardware can implement exactly the same filter in realtime.
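As a rough sketch of that reference-set idea, here's how one might batch-generate such images with Pillow. The filenames are hypothetical stand-ins for the article's captures, not the actual files:

```python
# Sketch: generate high-quality upscaled references from the DF sources.
# Filenames and native sizes are assumptions for illustration.
from PIL import Image

SOURCES = {"df_720p.png": (1280, 720), "df_900p.png": (1600, 900)}

for name, native in SOURCES.items():
    img = Image.open(name)
    assert img.size == native, "expected a native-resolution capture"
    # High-quality resample to 1080p as a proxy for console output.
    img.resize((1920, 1080), Image.LANCZOS).save("ref_" + name)
```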
 
No, I'm saying the hardware upscaler is much better than the simple flash-based trilinear upscale being used in the DF article
....

While I agree DF's approach is pointless, it doesn't seem to use a flash-based upscale, but rather BF4's own internal scaler:

"The PC version includes a resolution scaling option that allows you to maintain a desired output resolution (optimal for fixed pixel displays and for reducing input lag) while altering the internal rendering resolution."

This is how they seem to have done the comparison.

Either way, the result shouldn't be representative of TV upscalers or the next-gen consoles' hardware scalers.
 
"The PC version includes a resolution scaling option that allows you to maintain a desired output resolution (optimal for fixed pixel displays and for reducing input lag) while altering the internal rendering resolution."

ALL games should have this option.... SSAA FTW!!
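For anyone wondering what that option amounts to: the internal render resolution is decoupled from the fixed output resolution. A minimal sketch of the plumbing, with a Pillow resize standing in for actually rendering the scene at the internal size (an illustration of the idea, not BF4's code):

```python
# Sketch of a resolution-scale option: render internally at scale * output,
# then resample to the fixed output resolution for the display.
# scale < 1.0 = upscaled output (the console case); scale > 1.0 = SSAA.
from PIL import Image

OUTPUT = (1920, 1080)

def scaled_frame(scene, scale):
    internal = (round(OUTPUT[0] * scale), round(OUTPUT[1] * scale))
    rendered = scene.resize(internal, Image.LANCZOS)  # stand-in for rendering
    return rendered.resize(OUTPUT, Image.LANCZOS)     # back to display size
```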
 
While I agree DF's approach is pointless, it doesn't seem to use a flash-based upscale, but rather BF4's own internal scaler.
I don't think so, because if you click the small images you are taken to the originals I linked to, which are native resolution. And, as I say, the upscale used for those comparisons is very simple. I seriously doubt that's BF4's internal scaler, because if it is, all they are doing, basically, is using the framebuffer as a texture, slapping it on a quad, and rendering the quad full-view with texture interpolation.
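For what it's worth, that quad-plus-texture-interpolation path boils down to per-pixel bilinear sampling. A hand-rolled NumPy sketch of it, purely as illustration (not anyone's actual code):

```python
# Hand-rolled bilinear upscale: what "framebuffer as a texture on a quad
# with texture interpolation" amounts to. Illustration only.
import numpy as np

def bilinear_upscale(img, out_w, out_h):
    """img: H x W x 3 float array; returns out_h x out_w x 3."""
    in_h, in_w = img.shape[:2]
    # Map each output pixel centre back into source space.
    xs = (np.arange(out_w) + 0.5) * in_w / out_w - 0.5
    ys = (np.arange(out_h) + 0.5) * in_h / out_h - 0.5
    x0 = np.clip(np.floor(xs).astype(int), 0, in_w - 2)
    y0 = np.clip(np.floor(ys).astype(int), 0, in_h - 2)
    fx = np.clip(xs - x0, 0.0, 1.0)[None, :, None]  # horizontal weights
    fy = np.clip(ys - y0, 0.0, 1.0)[:, None, None]  # vertical weights
    # Blend the four neighbouring texels, exactly as the GPU filter would.
    top = img[y0][:, x0] * (1 - fx) + img[y0][:, x0 + 1] * fx
    bot = img[y0 + 1][:, x0] * (1 - fx) + img[y0 + 1][:, x0 + 1] * fx
    return top * (1 - fy) + bot * fy
```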
 
I don't think so, because if you click the small images you are taken to the originals I linked to, which are native resolution. And, as I say, the upscale used for those comparisons is very simple. I seriously doubt that's BF4's internal scaler, because if it is, all they are doing, basically, is using the framebuffer as a texture, slapping it on a quad, and rendering the quad full-view with texture interpolation.

I did exactly that and the image sizes were 1080p?

edit: Case in point: http://cfa.eurogamer.net/2013/articles//a/1/6/2/2/4/3/2/1_900_copy.png

If you use the Opera 12 browser, you can easily check the image size without having to save it anywhere.
 
You're right. My bad. The source images have been upscaled and aren't being upscaled via the viewer. But I still say that they aren't proper upscales, because the results are the same as a crappy trilinear upscale, which is not what the consoles will be using as their upscale algorithm.
 
You're right. My bad. The source images have been upscaled and aren't being upscaled via the viewer. But I still say that they aren't proper upscales, because the results are the same as a crappy trilinear upscale, which is not what the consoles will be using as their upscale algorithm.
So what's the difference? The game's using an internal scaler, the consoles will be using an internal scaler... I'm not seeing how this is going to be a night-and-day improvement. It's still not pixel-for-pixel; there's going to be some softness and some lost detail. It's inevitable.

Frankly, I'm surprised that they're still having so much trouble rendering at the native resolution of everyone's TV, and that neither one of them appears to have mandated 1080p output for all games. I'm not "okay" with waiting another seven or eight years for 1080p games.
 
So what's the difference? The game's using an internal scaler, the consoles will be using an internal scaler... I'm not seeing how this is going to be a night-and-day improvement. It's still not pixel-for-pixel; there's going to be some softness and some lost detail. It's inevitable.
There's a world of difference between different upscaler results. Here's a good upscaling plugin with radically better results than the simple upscale.

http://www.alienskin.com/blowup/blowup_examples.aspx

If XB1/PS4 were to integrate that algorithm in hardware running in realtime, the results would be far, far better than the DF examples. Without any knowledge of the scalers in the consoles, we can't find any reference material to predict what their output will be like. People really should just wait and see!
 
TVs tend to have very good upscalers. They HAVE to, as there's little native-resolution content; they almost always have to work with non-native resolutions, whether playing a video file or displaying a satellite channel.

edit: Oh, the example upscale plugin is superb. http://www.alienskin.com/blowup/blowup_example-4.aspx, for example, puts almost every pixel to good use!
 
So what's the difference? The game's using an internal scaler, the consoles will be using an internal scaler... I'm not seeing how this is going to be a night-and-day improvement. It's still not pixel-for-pixel; there's going to be some softness and some lost detail. It's inevitable.

Frankly, I'm surprised that they're still having so much trouble rendering at the native resolution of everyone's TV, and that neither one of them appears to have mandated 1080p output for all games. I'm not "okay" with waiting another seven or eight years for 1080p games.

No one is having trouble rendering at 1080p. They just choose not to. If you want 1080p, buy a PC, and then accept the fact that post-processing, transparencies, shadows and other things are still not always rendered at the display resolution.
 
Yeah, I've got one, thanks. But having it on the 60" would be nice for a change.

At the very least, if they actually spit out a 1080p signal, that would be something. As it is now, a lot of titles are rendering at [insert random sub-HD resolution here], the system is upscaling that to 720p and sending it to the TV, which then upscales it again to 1080p.
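To picture that chain, a tiny Pillow sketch; the filename and the 1024x600 sub-HD size are made-up examples:

```python
# Sketch of the double-scale chain: sub-HD render -> console scales to 720p
# -> TV scales to 1080p, versus one direct upscale to the panel's 1080p.
# "frame_subhd.png" is a hypothetical 1024x600 capture.
from PIL import Image

src = Image.open("frame_subhd.png")

# Two-stage path: console to 720p, then TV to 1080p.
two_stage = (src.resize((1280, 720), Image.BILINEAR)
                .resize((1920, 1080), Image.BILINEAR))

# Single direct upscale to the panel's native 1080p for comparison.
one_stage = src.resize((1920, 1080), Image.BILINEAR)

two_stage.save("chain_console_then_tv.png")
one_stage.save("chain_direct.png")
```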
 
Yeah, I've got one, thanks. But having it on the 60" would be nice for a change.

At the very least, if they actually spit out a 1080p signal, that would be something. As it is now, a lot of titles are rendering at [insert random sub-HD resolution here], the system is upscaling that to 720p and sending it to the TV, which then upscales it again to 1080p.

I don't think the PS3 or 360 output 720p unless you tell them to. They should both output 1080p. The new consoles will both have high-quality hardware scalers to output 1080p.
 
PS3 frequently outputs 720p unless you force it to output 1080p by disabling 720p support in the display settings. As every non-ancient HDTV supports 720p input and upscaling, Sony left it to the TV to upscale, which caused some issues for some users. PS3 also doesn't have a decent upscaler, so you're better off leaving it to the TV anyhow.
 
TVs tend to have very good upscalers. They HAVE to,
But as I pointed out, logically speaking they can be as good as a software scaler BUT not better.
The rule is: software >= hardware.
The reason: software doesn't have to run at 60fps. Plus, a PC's hardware is generally much more powerful than a TV's.
E.g. the one you linked to, http://www.alienskin.com/blowup, is a Photoshop plugin running in software. Now show me evidence that the hardware scalers use the same algorithms.
 