Digital Foundry Article Technical Discussion Archive [2014]

Maybe it's time for console game makers to offer two optimized presets to choose from in the final game, one being 1080p/30fps and the other whatever-p/60fps. This shouldn't be too difficult, should it?

I think it would be rather difficult. Frame rate has an impact on the whole system: your entire engine has to run at 60fps. Resolution, on the other hand, is mainly limited by the GPU and bandwidth, I think.

Dealing with something like the X1's ESRAM, I think it is more difficult to support multiple resolutions even if the resources and buffers are tiled. You ideally want to optimize for one setting and not introduce the overhead associated with variability.

I don't think any developer would spend time and money on something like this.
 
"DF: But one thing is clear from our preliminary testing: matching next-gen console quality is neither difficult nor particularly expensive. "

I personally would like to know what MS or Sony developers have to say about this.
 
I'm not going to go and look in an MP thread for comments from people who think they've noticed 'something' about the graphics. That's just background noise.

Now that people actually know what they've been looking at for months, a thread just about the resolution (the 'something') has exploded over there and it's filled with a mixture of rage, denial, and a couple of people confusedly trying to take pot shots at DF.

The resolution hasn't changed. It's the reaction that has changed. It's louder and a mob has formed and is growing. All because people have been told what they were looking at.

If people had been furious about the AA, because that's what they mistakenly thought was the issue, and if there had been threads and rage and criticism similar to the shit that gets flung at "sub 1080p" games, then I could have some sympathy for your position. But there wasn't, and so I have none.



And they 'lied' because ...



... because of people like you. People who want to think they're looking at 1080p, and think they know what's best.

And no, 1080p is not the best possible solution. There is no single best possible solution. The best possible solution will depend on a large number of things that need to be balanced on a per-game basis.

If possible, 1080p is the best solution; I don't know how you can deny this. If it's not possible, reduce the resolution, as Guerrilla has done. But the problem is the lying. At least other devs were honest...

If the framerate is equal, a game looks better at 1920x1080p than at 1920i x 1080, 900p or 720p; that's all I'm saying... That said, I know the PS4 and XB1 don't have unlimited power and aren't very powerful, and the devs will try to find the best compromise...
 
"p" in 1080p means that it's not interlaced... also 960x1080p is incorrect (you can't use "p" for interlaced formats).

1080i is the correct term.
 
"p" in 1080p means that it's not interlaced... also 960x1080p is incorrect (you can't use "p" for interlaced formats).

1080i is the correct term.
I was just about to say the same thing. It is simply 1080i (doesn't matter if it is vertical or horizontal lines, the end result is the same, which is half the pixels in each frame).

Personally, I would prefer that they chose 720p and upscaled it. It has slightly fewer pixels so it could push closer to 60fps, and it would look the same when upscaled, if not better.
 
"p" in 1080p means that it's not interlaced... also 960x1080p is incorrect (you can't use "p" for interlaced formats).

1080i is the correct term.

More like 1920i x 1080p :) (Fool HD (not to call anybody fool, just that it fools you))
 
"p" in 1080p means that it's not interlaced... also 960x1080p is incorrect (you can't use "p" for interlaced formats).

1080i is the correct term.

It's probably 1920i as we're dealing with interlacing on the vertical lines

That is the general understanding but I believe they're gaming the broadcast origins of the i/p suffix. TV works in scan lines due to the raster nature of the old analogue formats so the GG form of interlacing is essentially impossible. However as all 1080 lines are present it is still strictly speaking a 'p' format. As the GG form of interlacing is impossible to actually broadcast though (I can only imagine the soupy mess that 960ix1080 would produce on a CRT) it shows the risks of using broadcast nomenclature for computers.
 
It's probably 1920i as we're dealing with interlacing on the vertical lines

That is the general understanding but I believe they're gaming the broadcast origins of the i/p suffix. TV works in scan lines due to the raster nature of the old analogue formats so the GG form of interlacing is essentially impossible. However as all 1080 lines are present it is still strictly speaking a 'p' format. As the GG form of interlacing is impossible to actually broadcast though (I can only imagine the soupy mess that 960ix1080 would produce on a CRT) it shows the risks of using broadcast nomenclature for computers.

Not to mention that, if they chose a "true to its form" interlaced format, the resulting image would vary vastly between TV sets depending on how well they handle interlacing (I've seen ones that simply run at 30fps, the worst and luckily rare case; ones that possibly employ their motion interpolation for the ultimate 60fps upconversion of a 30fps interlaced feed; and the middle-ground majority possibly doing bob+weave based on motion detection).

So calling this 1080i would not be appropriate indeed.
 
More like 1920i x 1080p :) (Fool HD (not to call anybody fool, just that it fools you))

It's probably 1920i as we're dealing with interlacing on the vertical lines

Interlacing is not about the horizontal aspect of its function; it can be vertical, horizontal or diagonal, it doesn't matter. It is a technique of halving the pixels in each frame, then combining them with the other half in the next frame.

And since "1080p" is just a name for a specific resolution (1920x1080, or approximately 2.0MP), then 1080i is the correct name for the possible combinations of resolutions that produces half the pixels (1.0 MP), be it 1920x540 or 960x1080.

GG could have gone for 1920x540 and achieved exactly the same results, if not for their deceptive intention of maintaining 1080.
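
To illustrate the "half the pixels per frame" point, here is a toy sketch (purely my own illustration, assuming simple numpy image arrays; none of the names come from any real engine) that weaves two consecutive half-resolution fields back into one full 1920x1080 frame, either by rows (1920x540 fields) or by columns (960x1080 fields, the case under discussion):

```python
import numpy as np

def weave_columns(even_field, odd_field):
    """Combine two (1080, 960, 3) fields, shaded on consecutive frames,
    into one (1080, 1920, 3) frame by interleaving columns."""
    full = np.empty((1080, 1920, 3), dtype=even_field.dtype)
    full[:, 0::2] = even_field
    full[:, 1::2] = odd_field
    return full

def weave_rows(even_field, odd_field):
    """Combine two (540, 1920, 3) fields into one (1080, 1920, 3) frame
    by interleaving rows, as a classic broadcast weave deinterlacer would."""
    full = np.empty((1080, 1920, 3), dtype=even_field.dtype)
    full[0::2, :] = even_field
    full[1::2, :] = odd_field
    return full
```

Naive weaving like this is what produces combing artifacts on moving objects, which is why the engine-side reconstruction discussed later in the thread relies on motion vectors rather than simply merging fields.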
 
Interlacing is not about the horizontal aspect of its function; it can be vertical, horizontal or diagonal, it doesn't matter. It is a technique of halving the pixels in each frame, then combining them with the other half in the next frame.

And since "1080p" is just a name for a specific resolution (1920x1080, or approximately 2.0MP), then 1080i is the correct name for the possible combinations of resolutions that produces half the pixels (1.0 MP), be it 1920x540 or 960x1080.

GG could have gone for 1920x540 and achieved exactly the same results, if not for their deceptive intention of maintaining 1080.

It's just that the broadcasting terms come to mind when I see 1080i; 1920i would be more descriptive. Not that I'm defending their decision to advertise it as 1080p.
 
"p" in 1080p means that it's not interlaced... also 960x1080p is incorrect (you can't use "p" for interlaced formats).

1080i is the correct term.
I think not. The i or p suffix is only talking about vertical resolution, as, by rights, HD is defined only by its vertical resolution, at least when the terms were being drawn up IIRC. 1080i defined a broadcast made up of 1080 horizontal lines (of no specified resolution), sent as two fields of 540 lines. The TV reconstructs the fields into a single frame by alternating the lines appropriately.

In the case of console outputs, there are no interlaced outputs (1080i was present last gen) so there's no reason to use broadcast nomenclatures. In this case, we have a progressive framebuffer made of interlaced 960x1080 fields. 1920i x 1080 is probably the most accurate shorthand, but I'm unconvinced it needs a shorthand when the technique is so rarely used it can just be described as is. ;)
 
I've not played much of the multiplayer for this game, but I'd definitely like to take another look at it to see whether the difference is obvious. I don't remember having much of a problem with it before.

Personally, I think this gives a better look to the image than 720p, so I think GG went with the better option. I'd like to see more devs using this solution to increase the framerate.

Having said that, the game is actually quite slow-paced, so the increased fps doesn't affect gameplay too much for me.
 
1080i is much better than 720p in most cases if you use a good software deinterlacing algorithm. Inside the rendering engine you know exactly how much each pixel has moved since the last frame, so you don't need to analyze the movement on screen like TV sets do. The result is much better. The Stalker game used a similar approach last gen.

Also, refreshing every single pixel every frame is not always the wisest thing to do. Most pixels (in a 60 fps game) could just be reprojected (+ color tinted) without any noticeable image quality difference. This is exactly what modern video compression algorithms (including Blu-ray) do. There will be games that use similar rendering techniques in the future. How big a percentage of the screen these kinds of rendering engines recalculate each frame is completely dynamic (the error metrics dictate that). The final output will of course be 1080p, but some low-contrast areas might sometimes be more blurry (because the human eye doesn't notice the difference in motion). Enlarged screenshots, however, would look partially blurry (just like Blu-ray screenshots do).
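
As a rough illustration of the reprojection idea described above, here is a minimal sketch (my own, with assumed array names and layout; not Guerrilla's or anyone's actual pipeline) that shades only half the columns each frame and fills the other half by reprojecting the previous full frame with the engine's per-pixel motion vectors:

```python
import numpy as np

def reconstruct_full_frame(field, prev_full, motion, even_columns=True):
    """field:      (1080, 960, 3) freshly shaded half-width colour field.
    prev_full:  (1080, 1920, 3) previously reconstructed full frame.
    motion:     (1080, 1920, 2) per-pixel screen-space motion in pixels,
                known exactly inside the engine (unlike a TV's guess).
    even_columns: True if 'field' contains columns 0, 2, 4, ...
    """
    h, w = 1080, 1920
    out = np.empty((h, w, 3), dtype=field.dtype)
    new_cols = np.arange(0 if even_columns else 1, w, 2)
    old_cols = np.arange(1 if even_columns else 0, w, 2)

    # The freshly rendered columns go straight into the output.
    out[:, new_cols] = field

    # For the missing columns, fetch each pixel from where it was last frame.
    ys, xs = np.meshgrid(np.arange(h), old_cols, indexing="ij")
    src_x = np.clip(np.rint(xs - motion[ys, xs, 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.rint(ys - motion[ys, xs, 1]).astype(int), 0, h - 1)
    out[:, old_cols] = prev_full[src_y, src_x]
    return out
```

A real implementation would also validate each reprojected sample (a depth/object-ID comparison or a colour error metric, as described above) and fall back to interpolating from the freshly shaded neighbours when the history is unusable.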
 
1920i x 1080 is probably the most accurate shorthand, but I'm unconvinced it needs a shorthand when the technique is so rarely used it can just be described as is. ;)

2Ki? :p (since 4K is "2160p")


I'm actually curious how much it saves them in render time. It seems... roundabout.
 
Interlacing is not about the horizontal aspect of its function; it can be vertical, horizontal or diagonal, it doesn't matter. It is a technique of halving the pixels in each frame, then combining them with the other half in the next frame.

And since "1080p" is just a name for a specific resolution (1920x1080, or approximately 2.0MP), then 1080i is the correct name for the possible combinations of resolutions that produces half the pixels (1.0 MP), be it 1920x540 or 960x1080.

GG could have gone for 1920x540 and achieved exactly the same results, if not for their deceptive intention of maintaining 1080.

That's the common-sense understanding of interlacing, and as you note (and Hesido and I agree with you), GG are calling 960x1080 '1080p' solely for marketing reasons. The whole i/p thing does have a defined meaning in broadcast land though, and 1920i would be the way to describe that res there, as the interlacing is happening on the vertical lines only (the letter denotes the axis the interlacing occurs on).

TL;DR: GG are being deceptive, but 960x1080 != 1080i, though it does = 1920i.
 
Also, refreshing every single pixel every frame is not always the wisest thing to do. Most pixels (in a 60 fps game) could just be reprojected (+ color tinted) without any noticeable image quality difference.

This is something I'd very much like to see; I had even started / participated in a few 60fps "frame upscale" threads. I remember you talking about error-metric-driven re-use of previous frames. It just doesn't make sense to re-render all those extremely similar pixels. I hope we don't end up playing 30fps games this gen when the devs try to up the ante with visuals. (But I'm afraid that's probably what's going to happen.)
 
Thief 4

DF updated their article, and they do state that the XO lacks POM (in the city hub area) while the PS4 has it. I personally think that if the PS4 didn't improve its AF quality, then its advantage in that aspect is moot anyway.
 
Is seeing the difference between 1080p30 vs 1080i60 supposed to be as easy as 1080p30 vs 720p30?
The first has the same temporal resolution. I would imagine that 1080i60 looks a lot better than 720p30.

edit: missed sebbbi's post :oops:
 
GG could have gone for 1920x540 and achieved exactly the same results, if not for their deceptive intention of maintaining 1080.

Previous-gen titles demonstrated that the 16:9 aspect ratio is much more friendly to horizontal-only upscaling. Games rendering at 1024-1150x720 looked much better than games with perhaps an even higher number of pixels but both vertical and horizontal upscaling.

So, 960*1080 is the better option. Perhaps someone can take an existing full 1920*1080 image from the SP part and create both versions to demonstrate the differences...
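
For anyone wanting to try the comparison suggested above, a quick sketch along these lines would do it (illustrative only: the file name is a placeholder, and the 1360x765 "both axes" resolution is an assumption chosen to roughly match 960x1080's pixel count; this uses Pillow):

```python
from PIL import Image

src = Image.open("killzone_sp_1080p.png")  # any full 1920x1080 capture

# Horizontal-only reduction, as in the MP mode: 960x1080, scaled back up.
horiz = src.resize((960, 1080), Image.LANCZOS).resize((1920, 1080), Image.BILINEAR)

# Both-axes reduction with a similar pixel count: 1360x765, scaled back up.
both = src.resize((1360, 765), Image.LANCZOS).resize((1920, 1080), Image.BILINEAR)

horiz.save("upscaled_horizontal_only.png")
both.save("upscaled_both_axes.png")
```

Comparing the two outputs side by side should show whether the horizontal-only reduction really preserves detail better at a similar pixel budget.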
 