Digital Foundry Article Technical Discussion Archive [2014]

So it wouldn't be unreasonable to guess that Lego Marvel's 30 fps limit is perhaps down to poor multithreading, leaving it able to use only a low percentage of the Jaguar's capabilities. For example, if it's tuned to 3 cores because of Xenon, it would be using less than half of the console's CPU potential but more than 3/4 of my 2500K's potential. That would account for the 4x performance increase quite nicely.
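To put rough numbers on that guess (every figure below is an illustrative assumption, not measured data - 8 Jaguar cores at 1.6 GHz, a 2500K at 3.3 GHz with an assumed ~2x per-clock advantage):

Code:
# Back-of-envelope sketch of the guess above.  Every number here is an
# illustrative assumption, not a measured figure.
JAGUAR_CORES = 8            # total Jaguar cores in the console
JAGUAR_CLOCK_GHZ = 1.6
I5_2500K_CORES = 4
I5_2500K_CLOCK_GHZ = 3.3
IPC_RATIO = 2.0             # assumed per-clock advantage of the 2500K over Jaguar

THREADS_USED = 3            # engine tuned for 3 hardware threads (the Xenon guess)

print(f"Console CPU potential used: {THREADS_USED / JAGUAR_CORES:.0%}")
print(f"2500K potential used:       {THREADS_USED / I5_2500K_CORES:.0%}")

console_throughput = THREADS_USED * JAGUAR_CLOCK_GHZ
pc_throughput = THREADS_USED * I5_2500K_CLOCK_GHZ * IPC_RATIO
print(f"Rough 2500K vs console ratio on those 3 threads: "
      f"{pc_throughput / console_throughput:.1f}x")

With those assumed figures the game would be tapping roughly 38% of the console's CPU versus 75% of the 2500K's, and the three threads on the 2500K would be worth about 4x the three Jaguar threads, which lines up with the observed gap.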
 
Sebbi, not sure if I get you right - shouldn't a 2x increase in fps result in a 2x increase in vertex shader load? I mean, it has to push literally double the number of vertices!

Or did you just mean that the change in resolution does not modify the cost per individual vertex, so it's a straight 2x increase most of the time?
 
Sebbi, not sure if I get you right - shouldn't a 2x increase in fps result in a 2x increase in vertex shader load? I mean, it has to push literally double the number of vertices!
I think he was talking about the effect of resolution on vertex shaders.
Vertex shader load doesn't change much with resolution (per frame).
 
Yeah, that's what I think he means, but it can be easily misunderstood - that vertex shader load does not change with frame rate either, which is kinda absurd. But you never know, even here on B3D, nowadays.
 
Sebbi is just saying that doubling the frame rate, which basically makes everything 2x, is more costly than just bumping the pixel count from 720p to 1080p. That makes total sense, since the former has to account for the CPU, the GPU, and stricter frame timing, while the latter mostly just has a major impact on the pixel-shading stage.
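To put some crude numbers on that (the per-frame workload split below is an assumption, purely for illustration):

Code:
# Illustrative cost model: per-frame work split into CPU, vertex and pixel stages.
# The equal split below is an assumption, not measured data.
cpu, vertex, pixel = 1.0, 1.0, 1.0          # arbitrary units per frame at 720p30

# 720p -> 1080p: only the pixel-stage work scales (by the pixel ratio, 2.25x).
cost_1080p30 = 30 * (cpu + vertex + pixel * (1920 * 1080) / (1280 * 720))

# 30 -> 60 fps: everything has to happen twice as often, and each frame also
# has half the time budget in which to hit its deadline.
cost_720p60 = 60 * (cpu + vertex + pixel)

print(f"720p30 baseline : {30 * (cpu + vertex + pixel):.1f}")
print(f"1080p30 total   : {cost_1080p30:.1f}")
print(f"720p60 total    : {cost_720p60:.1f}")

Under that toy split, 1080p30 is roughly 1.4x the baseline while 720p60 is a full 2x, before even considering the tighter 16.7 ms deadline.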
 
There's a topic on framerate somewhere in this forum. No need to start it again.

Well, I read his post as saying something other than the typical framerate benefits of a smoother picture and faster response, which he mentioned as extras. He seemed to put great emphasis on this. 480p@120 is about the same as those two other examples by this metric. I personally got the feeling he was talking about something along the lines of motion blur/resolution, like what LCD tech usually brings, and that picture quality is basically the same between 720p@60 and 1080p@30 in motion, but I wanted some clarification from him.
 
Well, I read his post as saying something other than the typical framerate benefits of a smoother picture and faster response, which he mentioned as extras. He seemed to put great emphasis on this. 480p@120 is about the same as those two other examples by this metric. I personally got the feeling he was talking about something along the lines of motion blur/resolution, like what LCD tech usually brings, and that picture quality is basically the same between 720p@60 and 1080p@30 in motion, but I wanted some clarification from him.

Somewhat.

It's similar to how temporal AA (as long as the FPS is high enough) will produce superior AA with fewer samples per frame than a static AA pattern with a larger number of samples. At over 60 fps, 4x temporal MSAA (with a sample pattern that changes each frame) will give similar or better AA than 8x static MSAA, depending on the sample pattern and how far over 60 fps you are. Below 60 fps, however, temporal AA starts to show increasing artifacts that make it much less useful.
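A minimal toy sketch of the idea (not any shipped engine's implementation, and ignoring reprojection and history rejection): rotating a 4-sample pattern each frame and keeping the history around effectively builds up a 16-sample estimate of edge coverage, but with too little valid history the result is no better than plain 4x.

Code:
# Toy illustration of temporal vs static AA: estimate the coverage of a pixel
# cut by an edge at x = 0.42 (true coverage is 42%).  A 4-sample pattern is
# rotated every frame so that, over 4 frames, the union forms 16 stratified
# sample positions.  All numbers are illustrative, not from any real engine.
EDGE = 0.42

def coverage(samples):
    # Fraction of sample positions that land on the covered side of the edge.
    return sum(1 for s in samples if s < EDGE) / len(samples)

# Frame k uses 4 offsets; together the 4 frames tile the pixel with 16 samples.
rotating_4x = [[(4 * i + k + 0.5) / 16 for i in range(4)] for k in range(4)]
static_4x = [(i + 0.5) / 4 for i in range(4)]
static_8x = [(i + 0.5) / 8 for i in range(8)]

print(f"true coverage           : {EDGE:.3f}")
print(f"4x static, single frame : {coverage(static_4x):.3f}")
print(f"8x static, single frame : {coverage(static_8x):.3f}")

# Accumulate the rotating 4x pattern over more and more frames of history.
history = []
for k, pattern in enumerate(rotating_4x, start=1):
    history.extend(pattern)
    print(f"4x temporal, {k} frame(s) of history: {coverage(history):.3f}")

With full history the 4x temporal estimate (0.4375) ends up closer to the true 42% than single-frame 8x (0.375), while with only a frame or two of usable history it is no better than plain 4x - which is roughly what happens when the frame rate drops and history gets rejected.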

As temporal resolution increases, your eye is able to see more "definition" per period of time versus a higher spatial resolution at lower temporal resolution. This isn't always true of course. A static image should always look better with a higher spatial resolution (up to what a particular person's eye is capable of resolving) as displaying more of the exact same pixels over time isn't going to allow for any increase in temporal resolution.

Now at lower FPS, temporal resolution isn't going to be a benefit, and can actually be a detriment as the AA example I first used shows.

Not only do you need motion, but you also need a high enough frame rate for temporal resolution to offer a meaningful increase in perceived detail (or IQ). That's why some temporal implementations on last gen games (Halo: Reach) weren't very satisfying.

Regards,
SB
 
At least on a big screen, no amount of AA can make up for a resolution below the panel's native output. The difference between even 900p and 1080p on my monitor is pretty massive to my eyes, not necessarily because of a huge increase in aliasing but because of the scaling blur it introduces. I guess if my panel were 900p this wouldn't be much of an issue, but then it'd also have to be smaller, since I'm already pushing the limits of pixel size at 27" from 2 feet!
 
At least on a big screen, no amount of AA can make up for a resolution below the panel's native output. The difference between even 900p and 1080p on my monitor is pretty massive to my eyes, not necessarily because of a huge increase in aliasing but because of the scaling blur it introduces. I guess if my panel were 900p this wouldn't be much of an issue, but then it'd also have to be smaller, since I'm already pushing the limits of pixel size at 27" from 2 feet!

In theory, with a high enough framerate you could get much higher motion definition at 900p than at a slower framerate at 1080p, due to being able to display more information per arc minute in any given time interval.
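As a quick sanity check of the "information per time interval" argument, just counting raw pixels delivered per second (assuming 900p means 1600x900, and ignoring how well the eye actually integrates them):

Code:
# Raw pixels delivered to the display per second (illustrative arithmetic only).
modes = {
    "1600x900  @ 120 fps": 1600 * 900 * 120,
    "1920x1080 @ 60 fps":  1920 * 1080 * 60,
    "1920x1080 @ 30 fps":  1920 * 1080 * 30,
}
for name, px_per_s in modes.items():
    print(f"{name}: {px_per_s / 1e6:.1f} Mpixels/s")

That's roughly 172.8 Mpixels/s for 900p120 against 62.2 Mpixels/s for 1080p30 - nearly three times the raw information over time, even though each individual frame carries fewer pixels.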

As stated, for static images, temporal resolution won't provide meaningful increases over spatial resolution, except in some cases - anti-aliased lines in a static 3D-generated scene, for example.

This also applies to video: high-FPS, low-resolution video can appear far more detailed during relatively fast motion than low-FPS, high-resolution video when played back, yet have obviously lower detail when paused. BTW - this assumes the high-FPS, low-resolution video was actually recorded at high FPS and isn't just the same low-FPS recording as the higher-resolution video.

Regards,
SB
 
I'm not really talking about aliasing though, I'm talking about the full screen blur that scaling introduces. That can't be overcome with any amount of framerate. For example Black Flag looks sharper to me at 1080p and 30fps than Lego Marvel does at 900p and 120fps on my 1080p monitor.
 
At least on a big screen, no amount of AA can make up for a resolution below the panel's native output. The difference between even 900p and 1080p on my monitor is pretty massive to my eyes, not necessarily because of a huge increase in aliasing but because of the scaling blur it introduces. I guess if my panel were 900p this wouldn't be much of an issue, but then it'd also have to be smaller, since I'm already pushing the limits of pixel size at 27" from 2 feet!

Thing is, monitors are not TVs and vice versa. Because of the distance from you to a monitor, anything but integer scaling on a monitor always looks ugly. On a TV, regular people still put up with standard definition on their HDTVs (which is quite sad honestly) and the absolute trash that is 720p and 1080i TV feeds.
 
I'm not really talking about aliasing though, I'm talking about the full screen blur that scaling introduces. That can't be overcome with any amount of framerate.
Now that you raise that point, it's got me thinking, and I reckon it could. As the human brain is effectively seeing integrals of the data over time at each point, changing the blur/sampling details each frame could restore image detail. That's very hypothetical, but you raised the challenge that no amount of framerate can restore static sharpness - I'm moderately confident fancy maths and crazy high framerates could. ;)
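A very hypothetical sketch of that "fancy maths" (purely illustrative, nothing to do with any shipped game): if the sub-pixel phase of a low-resolution sampling grid changes every frame and the frames are integrated, detail beyond a single frame's resolution can be recovered, at least for a static scene.

Code:
import math

HI = 16   # "native" resolution of the underlying signal
LO = 4    # resolution of each individual low-res frame
FRAMES = HI // LO

def signal(x):
    # Fine detail (3 cycles across the screen) that 4 samples cannot resolve.
    return math.sin(2 * math.pi * 3 * x)

# Each frame samples the signal at LO positions, shifted by a different sub-pixel phase.
frames = [[signal(i / LO + k / HI) for i in range(LO)] for k in range(FRAMES)]

# A single frame, naively upscaled, just repeats its 4 samples.
single_upscaled = [frames[0][i // FRAMES] for i in range(HI)]

# Interleaving the phase-shifted frames recovers all 16 sample positions exactly.
reconstructed = [frames[i % FRAMES][i // FRAMES] for i in range(HI)]
reference = [signal(i / HI) for i in range(HI)]

err_single = max(abs(a - b) for a, b in zip(single_upscaled, reference))
err_multi = max(abs(a - b) for a, b in zip(reconstructed, reference))
print(f"max error, one upscaled 4-sample frame : {err_single:.3f}")
print(f"max error, 4 accumulated frames        : {err_multi:.3f}")

The single upscaled frame aliases badly (max error ~0.92), while the four phase-shifted frames combined reproduce the fine signal exactly - the catch being that this only works cleanly when nothing moves, which is where the crazy high framerates and fancier maths would have to come in.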
 
Now that you raise that point, it's got me thinking, and I reckon it could. As the human brain is effectively seeing integrals of the data over time at each point, changing the blur/sampling details each frame could restore image detail. That's very hypothetical, but you raised the challenge that no amount of framerate can restore static sharpness - I'm moderately confident fancy maths and crazy high framerates could. ;)

Don't people complain that 48fps HFR films look too clear and thus fake-looking?
 
That's compared to movies. Computer games are a different kettle of fish. I don't recall anyone complaining about higher framerate in a game.
 
I think my point is that a high frame rate seems to give people the perception of high resolution even when the resolution isn't actually there?
 
I've seen a number of subjective accounts of HFR looking "unrealistic", but not going into more detail than that.
Is the problem that HFR is not realistic, or that it doesn't look movie-like?

Is it the same as how people think games filled with lens flare, chromatic aberration, lens dust/moisture, and film grain are "realistic"?
Why, are there people born with old movie cameras for eyes?
 
I think the 24fps exposure gave motion pictures the "look" we are used to (motion blur/stutter). If you've seen actual movie props, they look fake and cheap. I suppose HFR would make that more obvious, given that 2x the amount of visual detail is provided in frames rather than resolution, so the props look "fake" when shown closer to what they actually look like.
 