Digital Foundry Article Technical Discussion Archive [2013]

Have you ever considered that the "Blowup" filter is not suited for images other than cartoons?
If you limit its use, you can get better results than generic scalers.

And anything that runs in software can be implemented in hardware. State-of-the-art upscaling is Lanczos AFAIK (for still pictures, not video), and that can be done easily in hardware (just like anything, really) and will run many times faster than the software algorithm.
Besides, I can't remember Photoshop ever being a reference for processing video, which allows/requires different sets of algorithms for better results.
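
For reference, Lanczos is just a windowed sinc, so it's easy to sketch; here's a minimal 1-D version in Python (purely illustrative, not how any particular scaler chip implements it):

```python
import math

def lanczos_kernel(x, a=3):
    """Lanczos windowed sinc with 'a' lobes, i.e. 2*a taps per axis."""
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

def lanczos_resample(samples, new_length, a=3):
    """Resample a 1-D signal; image scalers apply this per row, then per column."""
    scale = len(samples) / new_length
    out = []
    for i in range(new_length):
        center = (i + 0.5) * scale - 0.5              # position in source space
        first = math.floor(center) - a + 1
        acc = norm = 0.0
        for j in range(first, first + 2 * a):
            w = lanczos_kernel(center - j, a)
            acc += w * samples[min(max(j, 0), len(samples) - 1)]  # clamp at edges
            norm += w
        out.append(acc / norm)                        # renormalize partial windows
    return out
```

A larger a (more taps) gives a sharper result at the cost of more ringing; a=3 is the usual default.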
 
360 already supported up to Lanczos (amongst other filters) IIRC. Perhaps the # of taps has increased since then :?:

edit: Also (IIRC), various games were manually scaled so that devs could use a 720p HUD this generation, and then you'd have scaling to 1080p on top of that based on the system settings (or not at all, leaving it to the TV).
 
With filmed content there are very good upscaling algorithms: IMAX blowups are great, and the blowup integrated into Sony's 4K projectors also looks very good with high quality sources. But the reason it works is that there's enough info in the signal (film is like infinite FSAA).

With game rendering there's a ridiculous amount of aliasing; there's much less data than the resolution would allow, because it needs significant filtering. Whatever is used to blow it up in real time will often lead to artifacts unless it's a very neutral thing like bicubic. Filters in film are used to reveal details; filters in games are used to mask artifacts. Trying to blow it up again with advanced detail enhancement after filtering it down is a weird proposition. Maybe there's a new algorithm?

I hope next gen will output 1080p no matter what happens internally in the console. It's up to the game engine to decide how the filtering will happen, and that prevents chaining multiple steps of unknown scaling by the TV or the A/V receiver... just output native.
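
The chained-scaling problem is easy to demonstrate with a toy experiment; a sketch using plain linear interpolation and made-up intermediate resolutions (real TVs and receivers use unknown, likely fancier filters):

```python
import numpy as np

def resize_linear(line, new_w):
    """Linearly resample a 1-D 'scanline' to a new width."""
    x = np.linspace(0, len(line) - 1, new_w)
    return np.interp(x, np.arange(len(line)), line)

rng = np.random.default_rng(0)
native = rng.random(1920)                  # stand-in for one 1080p scanline
rendered = resize_linear(native, 1280)     # 720p-width internal render

# Single scaling step: console goes straight to 1920.
once = resize_linear(rendered, 1920)

# Chained steps: console -> receiver -> TV, each rescaling blindly.
chained = rendered
for w in (1600, 1366, 1920):
    chained = resize_linear(chained, w)

print("single-step error:", np.abs(once - native).mean())
print("chained error:    ", np.abs(chained - native).mean())
# The chained path accumulates extra interpolation blur at every hop.
```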
 
I don't think PS3 or 360 output 720p unless you tell them to. They should both output 1080p.
PS3 frequently outputs 720p unless you force it to output 1080p by disabling 720p support in the display settings. As every non-ancient HDTV supports 720p input and upscaling, Sony left it to the TV to upscale...
Yup. I figured this out a looong time ago. Both HDTVs I've owned will display the current input resolution in the "info" section. Most TVs have that, either built-in on the main display or part of the menus. And anything not directly 1080p in-game was typically output to the TV as 720p.

I had a 1080p TV before the HD consoles were even released. There's really no excuse anymore for not targeting that as a rendering resolution, IMO.
 
Have you ever considered that the "Blowup" filter is not suited for images other than cartoons?
...

The custom video renderer I use for video playback on my PC uses an algorithm called Jinc coupled with an anti-ringing filter for upscaling. This provides only slightly less sharpness than Lanczos, but without the ringing artifacts that Lanczos scaling typically introduces.

This was implemented by a guy for free in his spare time using pixel shaders. I'd expect a commercial product using dedicated silicon to be able to achieve better results.
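
For the curious: Jinc is the radial (2-D) analogue of sinc, and the anti-ringing step is conceptually just a clamp of the filtered result to the range of the source pixels that produced it. A rough Python sketch of both ideas (not madVR's actual shader code, just the underlying math):

```python
import numpy as np
from scipy.special import j1   # Bessel function of the first kind, order 1

def jinc(r):
    """jinc(r) = 2*J1(pi*r) / (pi*r), with jinc(0) = 1; used as a radial
    filter weight for source pixels at distance r from the target sample."""
    r = np.asarray(r, dtype=float)
    out = np.ones_like(r)
    nz = r != 0
    out[nz] = 2.0 * j1(np.pi * r[nz]) / (np.pi * r[nz])
    return out

def anti_ring(filtered_value, contributing_pixels):
    """Anti-ringing: clamp the result to the local min/max so the sharp
    kernel's overshoot (ringing) can never leave the neighborhood's range."""
    return float(np.clip(filtered_value,
                         contributing_pixels.min(),
                         contributing_pixels.max()))
```

In practice the jinc weights are also windowed (to limit the kernel's support) and normalized, but the clamp above is the essence of the anti-ringing filter.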
 
I can only repeat myself here.

You guys have spent a generation playing games at 720p without complaining - at least I've never seen anyone expressing problems with full 720p games. Some got flak for upscaling, but only here - and there were people who thought Black Ops II on the PS3 was running at native 1080p. This means that the resolution was not much of a problem in the real world.

Also, if I had to make a choice, I'd prefer a 720p render with as few aliasing issues as possible, at the best rendering quality the developer can push. Yeah, 1080p would be sharper, but I'd rather not see low-res geometry and textures with lots of aliasing in more detail...
 
The custom video renderer I use for video playback on my PC uses an algorithm called Jinc coupled with an anti-ringing filter for upscaling.
...
MadVR FTW. I use the same. To use Jinc at >1080p, you need at least a Radeon HD 7750 or similar for smooth playback. Even Ivy Bridge's integrated graphics can handle any of the other scaling algorithms.

...
Some got flak for upscaling, but only here - and there were people who thought Black Ops II on the PS3 was running at native 1080p. This means that the resolution was not much of a problem in the real world.
People thought Black Ops II ran at 1080p? I sure hope no one here did.

This gen we went from SD to HD. No one complained because 720p was kind of expected. People complained when games went sub-HD. Next gen I hope for at least 900p with good scaling.
 
...
This gen we went from SD to HD. No one complained because 720p was kind of expected. People complained when games went sub-HD. Next gen I hope for at least 900p with good scaling.

But they didn't complain. They stood in line for early releases and bought millions of copies of the most sub-HD games that were available (COD).
 
Lack of complaints due to an understanding of hardware limitations of a 6+ year old machine isn't exactly a good argument. I think most people saw the big difference between the twins and modern day PCs but didn't bother to complain because they knew the twins really are just 2006~2007 hardware.
 
No one complained about Ryse being "900p" until they were told that it was "900p".

And COD was 60 fps. I think looking at resolution in isolation from frame rate is unproductive.
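
Putting numbers on that: a sub-HD game at 60 fps can shade more pixels per second than a 720p game at 30. A quick comparison, using 1024x600 as the commonly reported internal resolution of the console COD titles:

```python
# Pixels shaded per second for a few resolution / frame-rate combinations.
modes = {
    "1024x600 @ 60 fps (sub-HD COD)": 1024 * 600 * 60,
    "1280x720 @ 30 fps":              1280 * 720 * 30,
    "1600x900 @ 30 fps":              1600 * 900 * 30,
    "1920x1080 @ 30 fps":             1920 * 1080 * 30,
}
for name, pps in modes.items():
    print(f"{name}: {pps / 1e6:.1f} Mpixels/s")
# 1024x600 @ 60 works out to ~36.9 Mpixels/s, more than 720p @ 30 (~27.6).
```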
 
That probably has something to do with all the native 1080p screenshots and videos they were using to promote the game.

Those were actual output of the X1, as the X1 renders all output at 1080p. 900p is the internal rendering resolution, which is then scaled to 1080p automatically. So if you couldn't tell the difference then, the point sticks: no one would have known unless told.
 
That probably has something to do with all the native 1080p screenshots and videos they were using to promote the game.

Those were actual output of the X1, as the X1 renders all output at 1080p. 900p is the internal rendering resolution, which is then scaled to 1080p automatically. So if you couldn't tell the difference then, the point sticks: no one would have known unless told.

Were those original 1080p screenshots and videos confirmed to be native 1080p renders with no upscaling? TBH, I haven't been paying that much attention. That whole controversy leaves me :rolleyes:
 
Were those original 1080p screenshots and videos confirmed to be native 1080p renders with no upscaling? TBH, I haven't been paying that much attention. That whole controversy leaves me :rolleyes:

All the screenshots were 1080p, and some were specifically mentioned as coming from the video buffer. Aside from that, MS has stated that all output is 1080p, since the signal to the TV needs to stay constant: you can step out from the game to the OS with the game in a window, or snap another app, and those are never scaled from their native output.
 
...
Also, if I had to make a choice, I'd prefer a 720p render with as few aliasing issues as possible, at the best rendering quality the developer can push. Yeah, 1080p would be sharper, but I'd rather not see low-res geometry and textures with lots of aliasing in more detail...

We all know that there are tradeoffs regarding resolution/frame rate/pixel love. And no one here thought Blops2 ran at full 1080p.
 
There was a time when many assumed that any PS3 game with 1080p on the box was not upscaled, unlike the 360 equivalent. Lair was the classic example: sold as Full HD when it was really sub-HD.

In those days not so many people had large 1080p TVs. 720p to 1080p should present a clear difference to most; 900p to 1080p, not so much at typical viewing distances.
 
No one complained about the iPad's screen until the Retina model came out; now the old one looks terrible. I spent a little time watching a 4K TV in a store and it looked incredible. I don't understand why so many people either have trouble seeing the benefit of more resolution or have issues with people who do.
 
...
I don't understand why so many people either have trouble seeing the benefit of more resolution or have issues with people who do.
I think it's because most are looking at 40-46" TVs from 6-8'. If you can't make out individual pixels, will squeezing more into the same space make a noticeable difference?
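
That intuition checks out against the usual 20/20 acuity figure of about one arcminute per resolvable detail; a quick sketch (the one-arcminute threshold is only a rule of thumb, and it assumes a 16:9 panel):

```python
import math

def pixel_arcmin(diagonal_in, width_px, distance_ft):
    """Angle subtended by one pixel of a 16:9 panel, in arcminutes."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)   # panel width from diagonal
    pixel_in = width_in / width_px
    return math.degrees(math.atan2(pixel_in, distance_ft * 12)) * 60

for d in (6, 8):
    print(f'46" 1080p at {d} ft: {pixel_arcmin(46, 1920, d):.2f} arcmin/pixel')
# At ~1 arcmin or below, single pixels sit at the limit of 20/20 vision,
# so extra resolution buys little at that size and distance.
```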
 
We should all wait for an actual next-gen face-off based on actual next-gen console material.
These pre-release face-offs are premature, to say the least.
 