Digital Foundry Article Technical Discussion Archive [2014]

Judging by their own explanation, they do it not from 2 but from 3 960x1080 frames, plus one 1920x1080 frame for the temporal AA.
So it's not just an interlaced update, but one with reconstructed interleaved data to subdue a lot of the interlacing artefacts. It's very much in the vein of post FX and a clever solution. It's sad that the internet is too stupid to appreciate that. :(
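For what it's worth, here's a rough Python/NumPy sketch of how an interlaced reconstruction of this sort could work in principle; the buffer layout, the nearest-neighbour reprojection and the function name are my own assumptions for illustration, not Guerrilla's actual pipeline, and the 1920x1080 temporal AA pass they mention would sit on top of this.

[code]
import numpy as np

def reconstruct_1080p(curr_half, prev_full, motion_px, frame_odd):
    """Sketch: interleave a freshly rendered 960x1080 field with columns
    reprojected from the previous reconstructed 1920x1080 frame.
    curr_half : (1080, 960, 3)  columns rendered this frame
    prev_full : (1080, 1920, 3) previous reconstructed full frame
    motion_px : (1080, 1920, 2) per-pixel screen-space motion (dx, dy)
    frame_odd : True if this frame rendered the odd columns
    """
    h, w = 1080, 1920
    out = np.empty((h, w, 3), dtype=curr_half.dtype)

    # Columns rendered this frame go straight into the output.
    new_cols = np.arange(1 if frame_odd else 0, w, 2)
    out[:, new_cols] = curr_half

    # Missing columns are fetched from the previous frame, offset by the
    # motion vector (nearest-neighbour fetch, purely for brevity).
    old_cols = np.arange(0 if frame_odd else 1, w, 2)
    ys, xs = np.meshgrid(np.arange(h), old_cols, indexing="ij")
    sx = np.clip((xs - motion_px[ys, xs, 0]).round().astype(int), 0, w - 1)
    sy = np.clip((ys - motion_px[ys, xs, 1]).round().astype(int), 0, h - 1)
    out[:, old_cols] = prev_full[sy, sx]
    return out
[/code]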
 
So, all the fuss is over MP being sub 1080p to achieve 60fps in a first generation game, and multiplayer maps at that.

Actually rendering only half the pixels is not just a fuss, in the context of evaluating the performance of the PS4 relative to the X1. KZ's multiplayer mode was - as far as I recall - cited in debates as proof that the PS4 can easily do 1080p at 60fps in a next-gen game, compared to the X1 struggling at it.
 
So it's more like 2880x1080, amortized. ;)

;)

----

Mm... makes more sense now with motion vectors generated during the G-buffer pass - nothing for the transparents, though.
 
Actually rendering only half the pixels is not just a fuss, in the context of evaluating the performance of the PS4 relative to the X1. KZ's multiplayer mode was - as far as I recall - cited in debates as proof that the PS4 can easily do 1080p at 60fps in a next-gen game, compared to the X1 struggling at it.

Incorrect - they always said they were taking shortcuts for the improved frame rate; you are practicing revisionist history to try to score points. No one ever expected a fully featured AAA game to pull off 1080p@60. We were never expecting 720p or sub-1080p and sub-30fps.
 

Cool.

So, they use a full 1080p previous frame for anti-aliasing. Sounds similar to what Crytek is doing in Ryse with their temporal AA. Could it be the AA step that is adding a slight blur to the multi-player image, or are they using the same AA in single-player?

I'm curious to know what kind of vectors they're using to track pixels. I'm guessing it's a vector in 2D screen space? But since the camera can move in 3D space, the movement of a pixel would be very approximate, no? Two pixels near each other may have different vectors, and a pixel may actually disappear, in a sense. Each time you rasterize, you'll get different pixels during scan conversion, no? There may be a pixel that goes missing, for example, if a triangle has moved enough to cover different samples during scan conversion. I'm not sure I'm explaining myself well. Then you have changes based on dynamic lighting, shadows and shading. Are these previous frames final colour buffers, or pre-shading data? I mean, if you have a pixel that is near black because it is in shadow, you don't want to re-project it into a spot on screen that should be bright because it is outside the shadow.
 
Actually rendering only half the pixels is not just a fuss, in the context of evaluating the performance of the PS4 relative to the X1. KZ's multiplayer mode was - as far as I recall - cited in debates as proof that the PS4 can easily do 1080p at 60fps in a next-gen game, compared to the X1 struggling at it.

I guess I'm not concerned since I don't play MP, but that's beside the point. It seems to me that, for an early generation title, KZ:SF's MP mode needed tailoring to achieve 60fps. But given the hardware between the two... I would assume future first-person titles will have an advantage (MP-wise) on PS4.
 
Incorrect - they always said they were taking shortcuts for the improved frame rate; you are practicing revisionist history to try to score points. No one ever expected a fully featured AAA game to pull off 1080p@60. We were never expecting 720p or sub-1080p and sub-30fps.

This is nonsense. The BF4 thread was trolled hard because this other game was doing 1080p60 easy and BF4 was not on either platform. Obviously this technique they're using is very smart, and of good quality. I think it's great. At the same time, it's not what we would generally have agreed to be native 1080p rendering during last-gen, and I still don't think we'd agree to it being called "native" after their explanation of how it works. That isn't to say it is bad at all.
 
Incorrect - they always said they were taking shortcuts for the improved frame rate,

I'm not talking about GG, I'm talking about forums like this. No one mentioned any significant shortcuts (like cutting 50% of the fragment processing) about the MP, only that it wasn't as good looking. But it was definitely paraded as a proper 1080p 60fps game.
 
I'm curious to know what kind of vectors they're using to track pixels. I'm guessing it's a vector in 2D screen space? But since the camera can move in 3D space, the movement of a pixel would be very approximate, no?
The moment you rasterise the 3D game space into 2D screen space, you transform all spatial properties into a 2D projection. The 3D movement is similarly transformed into 2D movement, so a movement towards you appears as a movement up and to the left, but with the 2D size also increasing (actually exactly the same as human vision, rasterising 3D world space into 2D retina space). For the purposes of reprojection, a 2D motion vector could be enough, although the devs still have the option of a 3D motion vector, which could be used for deforming the image based on changes in size due to perspective.
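For what it's worth, here's a minimal sketch of how such a 2D screen-space motion vector could be computed, assuming the previous frame's view-projection matrix is kept around; the function and parameter names are my own, purely for illustration, not anything from GG's write-up.

[code]
import numpy as np

def screen_motion_vector(world_pos, curr_view_proj, prev_view_proj, width, height):
    """Illustrative only: project a world-space point with the current and
    previous view-projection matrices and return its 2D motion in pixels."""
    p = np.append(np.asarray(world_pos, dtype=float), 1.0)  # homogeneous position

    def to_pixels(view_proj):
        clip = view_proj @ p
        ndc = clip[:2] / clip[3]                             # perspective divide -> [-1, 1]
        return (ndc * 0.5 + 0.5) * np.array([width, height]) # to pixel coordinates

    return to_pixels(curr_view_proj) - to_pixels(prev_view_proj)
[/code]

For anything that moves or deforms on its own (characters, vehicles), you'd also need the object's previous transform, which is presumably why the vectors are written out per pixel during the G-buffer pass.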
 
Actually rendering only half the pixels is not just a fuss, in the context of evaluating the performance of the PS4 relative to the X1. KZ's multiplayer mode was - as far as I recall - cited in debates as proof that the PS4 can easily do 1080p at 60fps in a next-gen game, compared to the X1 struggling at it.

Any "proof" that requires game A from platform X to be compared with other games on platform Y is kind of stupid to begin with - not to mention silly because resolution and framerate on one platform has nothing to do with a different platform, except for perhaps in multiplatform titles where one game is ported to two differing platforms, both limited by time, effort and resources - which has nothing really to do with KZ it being a platform exclusive.

I'm not sure what debates you recall, other than people being surprised that Guerrilla was aiming for 60fps [in MP] somewhere nearing its launch date. Good to know how they pulled it off though - and I'll happily take these kind of solutions (with 60fps) - or the dynamic framerate in WipEout HD over fullHD at 30fps. :D
 
only that it wasn't as good looking. But it was definitely paraded as a proper 1080p 60fps game.

AFAIK it never was a 60fps game - it tried to be, but fell somewhat short? And it is 1080p, and in some cases it's actually a 1920x1080 game with equal quality. It's in any case better than upscaled games.

in debates as proof that the PS4 can easily do 1080p at 60fps in a next-gen game, compared to the X1 struggling at it.

We all know that both machines can do 60 if something else is turned down. If it were a VS question, then it would be interesting to know whether the XB1 would even be able to do the same stuff they do here. Is it thanks to the GDDR5 that they can do it, or would the ESRAM be perfect for something like this? Is the extra CU power showing its worth?

The other VS thing is currently settled - the PS4 outpowers the XB1 easily on MP titles; we are just waiting for something else to be the case before that discussion is worth having again.
 
Sony and KZ, I guess I should've known. Not gonna mention this again, but the attitude has once again been noted.
 
Anyone who's been blissfully unaware of the lowered resolution for the last 3 months or so, only to throw a fit now that some pixel counter has revealed that the developer pulled off some clever tricks for performance's sake, has some very real issues.
 
Any "proof" that requires game A from platform X to be compared with other games on platform Y is kind of stupid to begin with - not to mention silly because resolution and framerate on one platform has nothing to do with a different platform, except for perhaps in multiplatform titles where one game is ported to two differing platforms, both limited by time, effort and resources - which has nothing really to do with KZ it being a platform exclusive.

I'm not sure what debates you recall, other than people being surprised that Guerrilla was aiming for 60fps [in MP] somewhere nearing its launch date. Good to know how they pulled it off though - and I'll happily take these kind of solutions (with 60fps) - or the dynamic framerate in WipEout HD over fullHD at 30fps. :D

Read the BF4 thread history.
 
This image is incorrect, because it assumes the computer-generated image is a continuous signal, and pixel colors are continuous integrals of the light coming to the pixel. If that were the case, we would also have perfect anti-aliasing, but unfortunately the rasterization process samples only an infinitely thin sampling point in the middle of each pixel, and thus we get an aliased result.

This also means that if we sample odd/even 1080p pixels every other frame and combine them together, the result is perfect 1080p when the image doesn't move (with all the same aliasing artifacts that native 1080p has). No blurring is added at all. The image is also perfect whenever the scene just scrolls sideways (pure scrolling, no other movement). When something else happens, the reconstruction starts to become lossy. But still, a well-designed interlacing algorithm that uses all the internal scene data (to generate perfect motion vectors) at (half) 1080p would most of the time beat upsampled 900p in image quality (at a lower pixel processing cost).
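As a toy illustration of the static-image claim (my own construction, with an arbitrary made-up signal standing in for the scene): sample the even columns in one field and the odd columns in the next, interleave them, and you get exactly what sampling every column in a single frame would have produced.

[code]
import numpy as np

# "Ground truth" scene, evaluated at the pixel centres of a 1920-wide row.
def scene(x):
    return np.sin(x * 0.37) + np.sin(x * 1.91)   # arbitrary detailed signal

x = np.arange(1920) + 0.5              # pixel-centre sample positions
native = scene(x)                      # full-res render, single frame

even_field = scene(x[0::2])            # frame N   samples the even columns
odd_field  = scene(x[1::2])            # frame N+1 samples the odd columns

reconstructed = np.empty(1920)
reconstructed[0::2] = even_field
reconstructed[1::2] = odd_field

print(np.array_equal(reconstructed, native))   # True: identical for a static scene
[/code]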

So, one: that figure is exactly trying to demonstrate that the pixels are discrete (see the horizontal lines), so I don't understand why you'd say it's wrong. If you are being picky about the exact positions of the samples, then they are wrong because I drew this quickly. The point is that you lose detail when you lower the resolution, and you don't get to reconstruct it back perfectly.

And two: I am saying there is no way that you can pick and choose - you don't get all your odd pixels into your half-wide odd frame and your even pixels into your half-wide even frame.
 
In your argument you've missed one vital fact - the sampling offset can change between frames. You render every odd pixel on every odd field. You render every even pixel on every even field. If the odd field and even field are from the same image, you perfectly reconstruct the alternating odd and even pixel data, just as if you had sampled continuously across the image.

Going back to my earlier visual representation, you said,
I was showing what the source data at 1080p should look like, and what the data rendered using either trick was. You wouldn't render all the pixels in a field and then replace half of them - that'd be a complete waste of time!

Why would you average the two values when sticking them back together? Just draw black line, white line, black line, white line. An upscale (render or sample at half res) would render all black as it only samples every other line.

Or in a concise summary, you've got it all wrong. ;) You've misunderstood the interlacing method and what it's doing. Your original analysis of the image quality was incorrect and subsequent arguments aren't valid.
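To put numbers on the black line / white line example (a sketch under my own assumption that the two fields are offset by one line, which is the whole point being argued): a plain half-res render that always samples the same lines only ever sees black, while interleaving two fields taken at alternating offsets restores the pattern exactly.

[code]
import numpy as np

# Alternating black (0) / white (1) lines, one per full-res row.
full_rows = 1080
truth = np.arange(full_rows) % 2        # 0, 1, 0, 1, ... = black, white, black, ...

# Plain half-res render: always samples rows 0, 2, 4, ... and sees only black.
half_res = truth[0::2]
print(np.unique(half_res))              # [0] - every sampled line is black

# Interlaced render with the sampling offset changing between fields:
even_field = truth[0::2]                # field A samples rows 0, 2, 4, ...
odd_field  = truth[1::2]                # field B is offset: rows 1, 3, 5, ...

reconstructed = np.empty(full_rows, dtype=int)
reconstructed[0::2] = even_field
reconstructed[1::2] = odd_field
print(np.array_equal(reconstructed, truth))   # True: the pattern is fully recovered
[/code]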

You are saying that there is a way to sample the black lines into one frame and the white lines into another; I am saying that you cannot. With a lowered-resolution buffer you always lose detail. Reading the blog, it's not what GG is doing anyway.
 
@Scott

Well, I actually just went through the effort to skim through 25-odd pages of the 38 in that BF4 thread, and honestly, I can't find much of anything that had much to do with framerate, KZ, the XBone and the PS4 in the same context. I don't really want to derail the topic either (which is already on the verge of happening), and I'm not really bothered if there were people who claimed stuff or made rather silly comparisons. Fact is, we have two differing next-gen consoles, both of which require certain trade-offs to be made, whether in resolution, framerate, complexity or a combination of all three. Threads like these are why I enjoy reading this forum in the first place: to gain a better understanding of how the games we like to play are achieved and some of the trade-offs (trickery) involved. I really don't see the point in drawing parallels to fans or what [some] members might have claimed. It just feels a bit like childish finger-pointing, which usually only results in a lot of noise.

Having said that, I'm finding the whole discussion on interlacing/interpolation highly interesting and fascinating. So please, disregard my post.
 
I'm pretty sure that the GPU has all the scene information necessary to render an image.
Thus it can sample it in any way it needs to.

I did show how you can combine two 960x1080 images with sample centers in slightly different locations to form a full 1920x1080 image.

My point is that you never had that 1920x1080 image to begin with, because if you did, you could just output native 1080p at 60fps.

I would love to hear what you mean by the "mathematically impossible" part you keep repeating.

edit:
Actually, it seems like you propose that when rendering an image, the final color of a pixel comes from all the information within the volume of a pixel's pyramid (the area that pixel covers in world space).
This is not the case when rendering an image using rasterization; each sample is a point in space.

What? Then what's the point of texture filtering like AF...
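To illustrate the distinction being argued here (a toy example of my own, not tied to the article): rasterization decides coverage from a point sample at the pixel centre, whereas texture filtering approximates an integral over the pixel's footprint in texel space - which is why a point-sampled stripe texture aliases while a box-filtered fetch settles on grey.

[code]
import numpy as np

texture = np.tile([0.0, 1.0], 512)          # 1024-texel black/white stripe texture

def point_sample(u):
    """Nearest-texel fetch at a single point (what a raw point sample sees)."""
    return texture[int(u * len(texture)) % len(texture)]

def box_filtered(u, footprint_texels):
    """Average over the pixel's footprint - a crude stand-in for mip/AF filtering."""
    centre = u * len(texture)
    lo = int(np.floor(centre - footprint_texels / 2))
    idx = np.arange(lo, lo + max(int(footprint_texels), 1)) % len(texture)
    return texture[idx].mean()

# A pixel whose footprint covers 8 texels of the stripe pattern:
print(point_sample(0.2501))             # 0.0 - aliased; flips with the tiniest shift
print(box_filtered(0.2501, 8))          # 0.5 - stable grey, as the integral suggests
[/code]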
 
Producing perfect 1080p imagery with no motion is irrelevant. When does that ever happen in the MP portion of an FPS game? When you are camping and set the controller down to take a bite of your sandwich?

Does it look good in motion? That's all that really matters. Since GG got away with it for the last three to four months, I assume it does.
 