Digital Foundry Article Technical Discussion Archive [2014]

On closer inspection, we see parallax occlusion mapping used sparingly on PS4 and PC in The City hub area, which adds texture extrusions over the Xbox One release. For the majority of surfaces this effect is avoided entirely across all platforms, and for others it's subtle. The most notable contrast is on the centre road brickwork of Stonemarket, though texture quality is identical for the surfaces surrounding it. It's a curious point in favour of the PS4 version in a comparison that is already close - though as documented, its blurrier texture filtering is a trade-off worth considering. This is also a separate issue to tessellation, which is engaged on all platforms. We can also confirm both versions were fully installed before starting this test, as with all Face-Offs where the option is available. We hope this clears up any confusion.
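For readers unfamiliar with the effect being discussed: parallax occlusion mapping ray-marches a height field per pixel, stepping along the view ray until it dips below the surface, to fake extruded geometry from a flat texture. A minimal CPU-side sketch in Python/NumPy (the function, toy height field and parameters are illustrative, not Eidos's actual shader):

```python
import numpy as np

def parallax_occlusion_uv(heightmap, uv, view_ts, height_scale=0.05, steps=16):
    """Ray-march a height field to find the displaced texture coordinate.

    heightmap:  2D array of heights in [0, 1]
    uv:         (u, v) texture coordinate in [0, 1]
    view_ts:    tangent-space view vector (surface towards eye), z > 0
    """
    h, w = heightmap.shape
    # Per-step UV offset along the projected view direction.
    delta = -view_ts[:2] / max(view_ts[2], 1e-4) * height_scale / steps
    layer = 1.0                       # start at the top of the height volume
    cur = np.array(uv, dtype=float)
    for _ in range(steps):
        x = int(np.clip(cur[0] * (w - 1), 0, w - 1))
        y = int(np.clip(cur[1] * (h - 1), 0, h - 1))
        if heightmap[y, x] >= layer:  # the ray has dipped below the surface
            return cur
        cur += delta
        layer -= 1.0 / steps
    return cur

# Toy example: a raised block viewed at a grazing angle shifts the UV,
# which is what produces the apparent extrusion on the brickwork.
hm = np.zeros((64, 64)); hm[24:40, 24:40] = 1.0
print(parallax_occlusion_uv(hm, (0.3, 0.5), np.array([-0.7, 0.0, 0.3])))
```

Real implementations run this in the pixel shader with filtered height samples and a refinement step between the last two layers; the march adds many texture reads per pixel, which is why its presence or absence shows up in face-offs at all.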

http://www.eurogamer.net/articles/digitalfoundry-2014-thief-next-gen-face-off
 
1080i is much better than 720p in most cases if you use a good software deinterlacing algorithm. Inside the rendering engine you know exactly how far each pixel has moved since the last frame, so you don't need to analyse on-screen movement the way TV sets do, and the result is much better. The Stalker games used a similar approach last gen.
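To make that concrete, here is a minimal sketch (Python/NumPy; the buffer names and layout are hypothetical) of rebuilding a full frame from one rendered field plus the previous frame, warped by engine-supplied motion vectors rather than motion estimated from the picture:

```python
import numpy as np

def weave_with_motion(field, prev_frame, motion, even_lines=True):
    """Reconstruct a full (H, W) frame from a half-height field.

    field:      (H/2, W) lines freshly rendered this frame
    prev_frame: (H, W)   previous reconstructed frame
    motion:     (H, W, 2) per-pixel (dy, dx) movement since last frame,
                taken straight from the engine, not estimated on screen
    """
    h, w = prev_frame.shape
    frame = np.empty_like(prev_frame)
    start = 0 if even_lines else 1
    frame[start::2] = field                    # fresh lines, as rendered
    miss = 1 - start
    ys, xs = np.mgrid[miss:h:2, 0:w]
    # Fetch each missing pixel from where it was in the previous frame.
    src_y = np.clip(ys - motion[ys, xs, 0], 0, h - 1).astype(int)
    src_x = np.clip(xs - motion[ys, xs, 1], 0, w - 1).astype(int)
    frame[miss::2] = prev_frame[src_y, src_x]
    return frame
```

A TV deinterlacer has to estimate `motion` from the image and gets it wrong at edges and occlusions; the engine knows it exactly per pixel, which is the whole advantage being pointed at here.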

Also, refreshing every single pixel every frame is not always the wisest thing to do. Most pixels (in a 60 fps game) could just be reprojected (+ color tinted) with no noticeable image quality difference. This is exactly what modern video compression algorithms (including Blu-ray) do, and there will be games that use similar rendering techniques in the future. How big a percentage of the screen these styles of rendering engine recalculate every frame is completely dynamic (the error metrics dictate it). The final output will of course be 1080p, but some low-contrast areas might sometimes be more blurry (because the human eye doesn't notice the difference in motion). Enlarged screenshots, however, would look partially blurry (just as Blu-ray screenshots do).
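And a sketch of the selective-refresh idea: reproject everything, then re-render only the tiles where a cheap error proxy says the warp is unreliable, much like a video codec deciding which blocks need re-encoding. The `render_tile` callback and the thresholds are hypothetical stand-ins:

```python
import numpy as np

def refresh_by_error(reprojected, valid_mask, motion_mag, render_tile,
                     motion_threshold=2.0, tile=16):
    """Recompute only the tiles reprojection can't cover convincingly.

    reprojected: (H, W) frame built purely by warping the previous one
    valid_mask:  (H, W) bool, False where the warp had no source pixel
                 (disocclusions, off-screen fetches)
    motion_mag:  (H, W) per-pixel motion length in pixels
    render_tile: callback (y0, x0, size) -> freshly shaded block
    Assumes H and W are multiples of `tile` for brevity.
    """
    frame = reprojected.copy()
    h, w = frame.shape
    recomputed = 0
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            sl = (slice(y, y + tile), slice(x, x + tile))
            # Error proxy: disoccluded pixels or fast motion mean the
            # reprojected result is unreliable, so pay for a real render.
            if (~valid_mask[sl]).any() or motion_mag[sl].mean() > motion_threshold:
                frame[sl] = render_tile(y, x, tile)
                recomputed += 1
    # The fraction recomputed is fully dynamic, exactly as described above.
    return frame, recomputed
```

Low-contrast, slow-moving regions keep their reprojected (slightly softer) pixels, which is why enlarged screenshots would show it while the moving image doesn't.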

Thank you sebbbi for the info. So these temporal upscaling algorithms are based on video compression principles, in extremely simple terms. This is pure news to me; I had no idea we were already using previous-frame data in games to reduce rendering load. I thought it would be too computationally expensive, since video encoding takes a lot of time and horsepower. Is this sort of thing happening in Frostbite too, saving rendering cost by analysing previous frames? This is a whole new topic to explore :p !

In this case it did work. I play on the same monitor as always and it was nowhere near as blurry as a normal upscaled game; I just thought it looked different due to their optimisations and didn't notice at all. I did notice the artifacts, but had no idea they were due to the halved horizontal resolution. I wish they'd give a complete talk at GDC, but considering how volatile this topic is, I don't think we will see it.
 
I think not. The i or p suffix only describes the vertical resolution; by rights, HD is defined only by its vertical resolution, at least as the terms were originally drawn up, IIRC. 1080i defined a broadcast made up of 1080 horizontal lines (of no specified resolution), sent as two fields of 540 lines each. The TV reconstructs the fields into a single frame by alternating the lines appropriately.
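For reference, the naive "weave" reconstruction described here, which is exact for a static image (the combing it produces on motion is why smarter deinterlacers exist). A few lines of Python/NumPy:

```python
import numpy as np

def weave(field_even, field_odd):
    """Rebuild one 1080-line frame from two 540-line fields."""
    h, w = field_even.shape
    frame = np.empty((2 * h, w), dtype=field_even.dtype)
    frame[0::2] = field_even   # lines 0, 2, 4, ...
    frame[1::2] = field_odd    # lines 1, 3, 5, ...
    return frame
```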

In the case of console outputs there are no interlaced modes this gen (1080i was present last gen), so there's no reason to use broadcast nomenclature. In this case we have a progressive framebuffer made of interleaved 960x1080 fields. 1920i x 1080 is probably the most accurate shorthand, but I'm unconvinced it needs a shorthand when the technique is so rarely used; it can just be described as-is. ;)

I think there is some misunderstanding. DF says this effect (960x1080 > 1920x1080) "is not cheap from a computational perspective". That means the other half of the frame (960x1080) is not simply copied and pasted from the previous frame. If KZ really only updated 960x1080 and copied the other 960x1080 from the last frame, then it would be 1920i x 1080.

The only possible conclusion is that the other half of the frame is "interpolated" from the current and last frames, which is why the effect is not cheap. It's not a straight upscale; it's more like the "frame interpolation" discussed in the console technology forum. The difference is that what we discussed there updates a whole 1920x1080 frame every 1/30th of a second and interpolates another complete frame to reach 60 frames per second, while KZ updates 960x1080 (half a frame) and interpolates the other half.
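As a toy illustration of that distinction (Guerrilla's real interpolation is motion-compensated and far smarter; this only shows why filling the missing half is more work than a copy/paste):

```python
import numpy as np

def reconstruct_half_column_frame(half, prev_full, even_columns=True):
    """Rebuild a (1080, 1920) frame from a freshly shaded (1080, 960) half.

    The missing columns get a blend of their fresh spatial neighbours and
    the previous frame's sample, i.e. interpolation from the current AND
    the last frame, rather than a straight copy of the previous frame.
    """
    h, w = prev_full.shape
    start = 0 if even_columns else 1
    frame = np.empty_like(prev_full)
    frame[:, start::2] = half
    fresh = frame[:, start::2]
    # Fresh columns on either side of each missing one
    # (np.roll wraps at the image edge; fine for a toy).
    nb_a = np.roll(fresh, 0 if even_columns else 1, axis=1)
    nb_b = np.roll(fresh, -1 if even_columns else 0, axis=1)
    spatial = 0.5 * (nb_a + nb_b)          # current-frame term
    temporal = prev_full[:, 1 - start::2]  # last-frame term
    frame[:, 1 - start::2] = 0.5 * (spatial + temporal)
    return frame
```

If the last assignment were just `frame[:, 1 - start::2] = temporal`, it would be the cheap 1920i x 1080 case described above; the blend (and, in the real game, the motion compensation behind it) is where the extra cost goes.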
 
Exactly. I think even DF mentioned that it has a temporal aspect, i.e. using information from other frames too.
 
Semantics, but it just sounds like a smarter deinterlace, judging by sebbbi's input. The artefacts are what they are. *shrug* I'm open to a name that doesn't take a world of explanation though. ;)

Temporal field interpolation? Field Interpolated Temporal Reprojection?

Shifty's Bitch's naughty lace interpolation?
 
Is the difference between 1080p30 and 1080i60 supposed to be as easy to see as the difference between 1080p30 and 720p30? The latter pair share the same temporal resolution, so only spatial detail separates them; I would imagine 1080i60 looks a lot better than 720p30.

edit: missed sebbbi's post :oops:

What I don't understand is how people can suddenly claim this resolution looks blurry while also claiming it is somehow significantly better than 720p. A game like Ryse at 900p would not have been noticed if pixel counting on zoomed-in screen captures didn't exist. That is why this wasn't noticed: not because interlaced half resolution looks objectively better, but because it was a technique people weren't expecting, and nobody had zoomed in to count the pixels.

I think I'm mostly frustrated by the resolution hyperbole, mostly from the uninformed masses, because it makes games look worse than they otherwise could. I swear everyone should just run Quake 3 at 4K and be done with it. Image quality in Ryse being better than Forza despite the lower resolution should say something.
 
I was just about to say the same thing. It is effectively 1080i (it doesn't matter whether it's vertical or horizontal lines; the end result is the same: half the pixels in each frame).

Personally, I would prefer they had chosen 720p and upscaled it. It has slightly fewer pixels, so it could push closer to 60fps, and it would look the same when upscaled, if not better.
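For what it's worth, the pixel arithmetic behind that "slightly fewer pixels" claim:

```python
native_1080p = 1920 * 1080   # 2,073,600 shaded pixels per frame
kz_mp_half   =  960 * 1080   # 1,036,800 -- exactly half of native
full_720p    = 1280 *  720   #   921,600

print(kz_mp_half / native_1080p)   # 0.5    -- KZ MP shades half of 1080p
print(full_720p / kz_mp_half)      # ~0.889 -- 720p shades ~11% fewer still
```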

So degrade every image equally rather than make the best image possible from temporal information? Frame rate aside (we don't know if it would be better), image quality at 720p would be worse, since the loss would be present in every frame rather than dependent on motion.

Ethics aside (this is a tech forum), they found a method to improve the frame rate for MP. There are many ways to do it; this was one innovative way. We might see more of it, and it sure beats the overused 'drop the res' club devs have been swinging far too much in the past.
 
What I don't understand is how people can suddenly claim this resolution looks blurry while also claiming it is somehow significantly better than 720p. A game like Ryse at 900p would not have been noticed if pixel counting on zoomed-in screen captures didn't exist. That is why this wasn't noticed: not because interlaced half resolution looks objectively better, but because it was a technique people weren't expecting, and nobody had zoomed in to count the pixels.

I think I'm mostly frustrated by the resolution hyperbole, mostly from the uninformed masses, because it makes games look worse than they otherwise could. I swear everyone should just run Quake 3 at 4K and be done with it. Image quality in Ryse being better than Forza despite the lower resolution should say something.

Forza is a 60fps game and Ryse was basically built to be a graphical showcase, so it's no surprise that it looks good. Anyway, the importance of resolution varies with the type of setup one is gaming on: the larger the perceived screen size, the more important resolution becomes. I personally prefer clarity in the image quite a bit, and resolution helps there, especially on a large display. I hate all sorts of blur, whether it's due to low resolution, some post-AA method, artistic choice or flaws in the display tech.
 
There are many ways to do it; this was one innovative way. We might see more of it, and it sure beats the overused 'drop the res' club devs have been swinging far too much in the past.

It's certainly something I'd like to see compared to a naive 1440/1536/1600 x 1080 (normal upscale, no temporal AA), since the increased blur was still noticeable to plenty of folks.

(Of course, they could have just stuck with 1080p30 as originally intended, and their backtrack with the SP patch does say something about their post-E3 decision making.)
 
It's certainly something I'd like to see compared to a naive 1440/1536/1600 x 1080 (normal upscale, no temporal AA), since the increased blur was still noticeable to plenty of folks.

(Of course, they could have just stuck with 1080p30 as originally intended, and their backtrack with the SP patch does say something about their post-E3 decision making.)

Maybe they were worried that 30fps couldn't be maintained with an unpredictable number of players shooting and exploding in one area. Maybe the trade-off was a hard-to-detect interlacing trick at 35-50fps versus occasional sub-30fps at 1080p. DF would have a field day showing the worst-case scenario in MP, and the Internet would talk about nothing but 25fps KZ.
 
How about if a single 1080p cheerleader had realised they were looking at half the fucking resolution?
If they complained that the game looked blurry, they did actually see that something was wrong; just because they didn't pinpoint the exact resolution or reason doesn't make them wrong for thinking something was off, right?


I'm going to take a break and shout at random people on the street for killing off plasma TVs (and the Dreamcast)...
Yeah, shout a bit for ME as well... fucking stupid world! :)
 
If they complained that the game looked blurry, they did actually see that something was wrong; just because they didn't pinpoint the exact resolution or reason doesn't make them wrong for thinking something was off, right?

It's not wrong, it's just hypocrisy. If people are going to hammer on the 1080p debate, they need to have ONE STANDARD for ALL, not just when it's convenient.
It's even more hilarious that after some three months people still just think it's "blurry", but since they were told it's 1080p, they attribute the blur to something else (FXAA).

Imagine what kind of shitstorm there would be if Crytek had claimed Ryse was 1080p. But one must ask: can people actually tell?
 
Anyway, DF needs two things in my opinion:

1) They need to stop naming a definitive "winner". In this case it's definitely come back to bite them in the ass, and they need to stop it. Just stick to the facts: give us a pro/con breakdown, maybe in a table or something easy to read, and let us make up our own minds. PS4 does this better, XB1 does that better, pick the one you want.

And really, these face-offs are more about dick-waving over our favourite console than anything else. I don't think anyone is waiting around for DF to decide which version of a game they should buy... most of us aren't in a position to pick and choose; we're limited by what we own.
Excellent point. I completely agree with that. The DF article is fine; truth be told, they missed the POM thingy being present on PS4, but they also said the PS4 version never dips below 25fps when the video clearly shows it dipping to 21 right away, so give and take.

Still, if they weren't judgmental they wouldn't have a problem. But now they've got GAF's Seventh Cavalry against them, the knight-errantry of justice by definition. :smile:

But seriously, the Digital Foundry Verdict does more harm than good for all parties involved.

It can also seriously affect sales.

What I don't understand is how people can suddenly claim this resolution looks blurry while also claiming it is somehow significantly better than 720p. A game like Ryse at 900p would not have been noticed if pixel counting on zoomed-in screen captures didn't exist. That is why this wasn't noticed: not because interlaced half resolution looks objectively better, but because it was a technique people weren't expecting, and nobody had zoomed in to count the pixels.

I think I'm mostly frustrated by the resolution hyperbole, mostly from the uninformed masses, because it makes games look worse than they otherwise could. I swear everyone should just run Quake 3 at 4K and be done with it. Image quality in Ryse being better than Forza despite the lower resolution should say something.
You read my mind. I was about to post almost exactly the same thing, word for word. Ryse looks better than Forza (DrEvil is right, though), and better than FIFA 14, NBA 2K14, Crimson Dragon, Halo: Spartan Assault, NBA Live 14, Need for Speed Rivals... etc.

In addition, I am not a big fan of UE3 and never have been. It has never been console friendly, and the games looked very similar for the most part, whatever the reason. Give me CryEngine on next-gen consoles any day of the week. :smile2:

Still, the 1080p lovers seemed to have it under control, but... woah, they were waaaaaaay off...
 
Indeed.
Grab your guns, people, let's hunt some pixel counters :devilish:
It'd be better if people didn't care about a number but about what things look like on screen. Those of us who discuss the techniques do so purely from an analytical POV, to understand how developers approach the issue of working with finite hardware. Every game is a compromise; the interest comes from seeing which compromises are used and how, especially when new ones are developed (like Lair's alternative AA resolve).

The Order: 1886 also has a blurry look. Could it be doing something like this as well?
I'm with Al. It looks like lens simulation, as there are no pixel-perfect artefacts from what I've seen.
 