Digital Foundry Article Technical Discussion Archive [2015]

No it doesn't. By waiting for the refresh, triple-buffering delays and stutters its output, causing the appearance and controls to feel unresponsive and janky compared with just turning vsync off. Assuming good implementations, at some performance levels it can even do worse than double-buffered vsync (i.e. if you're hovering just faster than a harmonic fraction of the refresh; in this case the buffer flip is capable of being a good frame kickoff timer, which triple-buffering can't benefit from).

There is no down side to triple buffering compared to double buffering, other than increased memory usage.

The name gives a lot away: triple buffering uses three buffers instead of two. This additional buffer gives the computer enough space to keep a buffer locked while it is being sent to the monitor (to avoid tearing) while also not preventing the software from drawing as fast as it possibly can (even with one locked buffer there are still two that the software can bounce back and forth between). The software draws back and forth between the two back buffers and (at best) once every refresh the front buffer is swapped for the back buffer containing the most recently completed fully rendered frame.
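That bounce between two back buffers can be sketched as a toy timeline (a hypothetical model of my own, assuming a fixed render time and instantaneous buffer swaps):

```python
import itertools

def triple_buffer_sim(frame_times, refresh=1/60, vblanks=10):
    """Toy model of triple buffering: the renderer is never blocked,
    and each vblank displays the newest completed frame."""
    # absolute completion time of each rendered frame
    done = list(itertools.accumulate(frame_times))
    shown = []
    for k in range(1, vblanks + 1):
        vblank = k * refresh
        # indices of frames finished by this vblank
        ready = [i for i, t in enumerate(done) if t <= vblank]
        shown.append(ready[-1] if ready else None)
    return shown

# a ~83fps renderer (12ms/frame) on a 60Hz display: frames 2, 6 and 9
# are rendered but never shown; nothing tears and nothing stalls
print(triple_buffer_sim([0.012] * 20))
```

The point the sketch makes is that the renderer's output rate and the display's consumption rate are decoupled: surplus frames are simply discarded at the flip.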

As for control input delay, yes, there is some, but the higher the framerate the less noticeable it will be. For example, if you have a game that dips to 50fps, with no vsync you will have a bunch of torn screens, but you would theoretically have a more even update. With that said, when framerates dip, you're getting variable updates anyway. So with the triple-buffered vsync game, you're getting updates like 16.66ms-16.66ms-16.66ms-33.33ms-16.66ms and so on. Without vsync, you could be getting updates anywhere in between, so 17ms-19ms-16ms-20ms-19ms and so on; couple that with torn frames and you have an uneven mess. I didn't say triple buffering is perfect, but it's a vast improvement over double buffering, and I'm pretty sure that triple buffering is what developers are using on these X1 and PS4 games that have fluctuating framerates with no tearing.
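That 16.66/33.33 cadence can be checked with a few lines (my own sketch of the free-running-renderer case; exact fractions avoid float edge cases right at the vblank boundaries):

```python
from fractions import Fraction
from math import ceil

def vsync_intervals(render_ms, refresh_hz=60, frames=12):
    """Intervals (ms) between new frames reaching a vsynced display
    when every frame takes render_ms to draw and the renderer is
    never blocked (i.e. the triple-buffered case)."""
    vblank = Fraction(1000, refresh_hz)
    shown = [ceil(i * Fraction(render_ms) / vblank) * vblank
             for i in range(1, frames + 1)]
    return [round(float(b - a), 1) for a, b in zip(shown, shown[1:])]

# a 50fps renderer on a 60Hz display: four 16.7ms updates, then a
# 33.3ms hitch, repeating
print(vsync_intervals(20))
```

So the dip to 50fps doesn't show up as a uniform 20ms cadence; it shows up as mostly-smooth updates punctuated by a doubled interval, which is exactly the stutter being argued about.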
 
There is no down side to triple buffering compared to double buffering, other than increased memory usage.
Yes, there is. In the particular instance I used as an example, the kickoff-to-display time for double-buffering averages lower than triple-buffering because the double-buffered frames are being kicked off in such a way that they finish immediately before the refresh happens.

I modelled this stuff a while ago assuming negligible CPU load, the time between input and beginning output on a 60Hz signal looked like this, for instance:

[Graph: input-to-scanout latency on a 60Hz signal, double- vs triple-buffering]


The graphs look similar if you're considering different sorts of variance. At some performance levels, double-buffering is more responsive than triple-buffering by pretty much any sane measure.
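For what it's worth, a toy version of that model is easy to write down (my own reconstruction, assuming input is sampled at frame kickoff, a fixed render time, and free swaps):

```python
from fractions import Fraction
from math import ceil, floor

T = Fraction(1000, 60)  # 60Hz refresh period in ms

def double_buffered_latency(render_ms):
    """Render kicks off at a vblank; the frame scans out at the first
    vblank after it completes, so latency is constant."""
    return float(ceil(Fraction(render_ms) / T) * T)

def triple_buffered_latency(render_ms, vblanks=120):
    """The renderer free-runs back to back; each vblank scans out the
    newest finished frame. Returns average kickoff-to-scanout time."""
    r = Fraction(render_ms)
    lat = []
    for k in range(1, vblanks + 1):
        v = k * T
        i = floor(v / r)            # newest frame finished by this vblank
        if i >= 1:
            lat.append(v - (i - 1) * r)  # vblank minus that frame's kickoff
    return float(sum(lat) / len(lat))

# just faster than refresh (16ms): double buffering wins on latency,
# because every frame finishes right before the vblank that shows it
print(double_buffered_latency(16), triple_buffered_latency(16))
# just slower than refresh (18ms): triple buffering wins
print(double_buffered_latency(18), triple_buffered_latency(18))
```

This is the harmonic-fraction effect: at 16ms the double-buffered flip acts as a perfect kickoff timer, while the free-running triple-buffered frames drift in and out of phase with the refresh and average a higher latency.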

I also personally find that although double-buffered framerates are always less than or equal to triple-buffered ones, double buffering simply has a less stuttery feel, because at any given performance level it is clamped to something with predictable timing. Yes, it fluctuates between levels, but not as wildly as triple-buffering's constant stutter during performance loss. I'd usually rather drop to 20 than float around 25 for a while.

The only situation where I think I generally prefer triple-buffering to double-buffering is when targeting full refresh (i.e. 60fps) with spikes, since the drop that occurs in double-buffering when the refresh slips is just huge.
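The "clamping" I mean is just the refresh-divisor effect; a quick sketch (my own, 60Hz assumed):

```python
from math import ceil

def double_buffered_fps(render_ms, refresh_hz=60):
    """Under double-buffered vsync each frame occupies a whole number
    of refresh intervals, so framerate snaps to refresh_hz / n."""
    period = 1000 / refresh_hz
    return refresh_hz / ceil(render_ms / period)

# anything between ~16.7ms and 33.3ms per frame lands on a steady
# 30fps; past 33.3ms it drops straight to 20fps, never 25
print([double_buffered_fps(ms) for ms in (10, 17, 25, 34)])
```

That discontinuity is also why the drop at a 60fps target is so brutal: slipping from 16.6ms to 16.8ms per frame halves the displayed framerate.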

With that said, when framerates dip, you're getting variable updates anyway. So with the triple-buffered vsync game, you're getting updates like 16.66ms-16.66ms-16.66ms-33.33ms-16.66ms and so on. Without vsync, you could be getting updates anywhere in between, so 17ms-19ms-16ms-20ms-19ms and so on
You're getting variable updates, but your own example demonstrates that the variance is astronomically smaller in the tearing case. A 3ms difference can slip by without the brain registering it as a meaningful inconsistency; a 17ms difference is more problematic.

I didn't say triple buffering is perfect, but it's a vast improvement over double buffering, and I'm pretty sure that is what developers are using on these X1 and PS4 games that have fluctuating framerates with no tearing.
It's true that triple-buffering is commonly used. It's also true that those games feel absolutely awful when they're floating around the 20s, and it's further the case that in unlocked instances where the games tend to float above 30fps, plenty of people have asked for caps to eliminate the stutter. It's a compromise that makes sense some of the time, but it's not a "vast improvement" to double buffering in general, and for responsiveness it is not an alternative to tearing (if you think it is, I'd strongly suggest that you try out the vsync on and off options in Bioshock Infinite on 360).
 
http://www.eurogamer.net/articles/digitalfoundry-2015-dark-souls-2-performance-analysis

1080p on PS4 and Xbox One but better framerate on PS4

Xbox One pays a persistent price for matching the PS4's visual standard. Both formats target 60fps and engage v-sync at all times, but Microsoft's platform suffers the greater drops between the two in each scene of our frame-rate analysis. The Forest of Fallen Giants area is a good example, where a barrage of enemies causes a read-out of between 40-50fps on Xbox One, while PS4 operates within the 50-60fps range. Even while uncontested beneath the giant, arching trees of Things Betwixt, a regular margin of 10fps exists between the two - PS4 operating at a near perfect 60fps, while Xbox stutters along at 50fps.

Unfortunately this has the knock-on effect of making combat sluggish on Xbox One. In one example, an encounter with The Last Giant boss gives us our lowest drop, a record tumble to 36fps cued by a batch of floating souls. The PS4 goes entirely unruffled by the effect here, and it's fair to say the smoother controller response makes it easier to tackle a lingering knight after this boss battle's finished. Sony's machine does not produce a perfect 60fps of course, but it is a consistently better performer - and in a game that demands pinpoint timing for rolls and ripostes, the smoother frame-rate can make a difference.

[Frame-rate analysis graph: Dark Souls 2, PS4 vs Xbox One]
 
Would have been a good opportunity to include a 30Hz lock for XO.


Edit:

Well wait... ugh. 7% dropped frames? I need to watch the video later.

edit:

Some of these drops don't make any sense from a visual standpoint (being sensitive to enemy counts is one thing, but some happen while just walking around), aside from the cut-scenes.
 
I think a 30Hz lock is a bit much, as the XB1 game sustains 60fps in many areas too. And a stuttering ~50fps in some areas, even during prolonged moments, is much, much better than a 30fps lock (with its constant 30Hz blur / lack of clarity) in my opinion.

But like Borderlands: The Handsome Collection, it could have gone with a 900p (or even better, a 1080pr) image, which is quite a good trade-off.
 
I think 30hz lock is a bit much as the XB1 game sustains 60fps on many areas too.
Yeah, already saw the video. It's not quite as bad.

I'm not sure a resolution drop would affect things much if it's mostly enemy count causing issues.

Has anyone investigated CPU performance on PC?
 
Yes, there is. In the particular instance I used as an example, the kickoff-to-display time for double-buffering averages lower than triple-buffering because the double-buffered frames are being kicked off in such a way that they finish immediately before the refresh happens.
It's true that triple-buffering is commonly used. It's also true that those games feel absolutely awful when they're floating around the 20s, and it's further the case that in unlocked instances where the games tend to float above 30fps, plenty of people have asked for caps to eliminate the stutter. It's a compromise that makes sense some of the time, but it's not a "vast improvement" to double buffering in general, and for responsiveness it is not an alternative to tearing (if you think it is, I'd strongly suggest that you try out the vsync on and off options in Bioshock Infinite on 360).

I see what you're saying; I guess I was mainly looking at it from the perspective of its impact on games that target 60fps but frequently dip below that target. A 30fps game that constantly dips isn't good no matter how you slice it. Playing in the mid-20s with tearing isn't fun, and playing at 20fps thanks to double-buffered vsync is going to feel very sluggish, even if it is a consistent update. I think we agree that triple buffering is better when targeting a framerate well above 30fps. When you have a 30fps game with dips, then we start to decipher which is the lesser evil.
 
Digital Foundry: Has Rockstar really downgraded GTA 5?
  • PS4: POM removed, AF unchanged, framerate improved.
  • XBO: POM removed, AF unchanged, framerate unchanged.
Interesting but inconclusive. It could be that the removal of POM did influence framerate on PS4 but doesn't on XBO, or it could be that POM and framerate are entirely unrelated and some other change improved the framerate on PS4. I guess we'll find out when/if Rockstar rectify the bug they're currently investigating.

Rockstar have released patch 1.10 which restores POM, seemingly with no impact to the performance improvements introduced in the previous two patches. Happy Easter! :yes:

UPDATE 2/4/15 5:41pm: Rockstar has just rolled out patch 1.10 for Grand Theft Auto 5, which it says fixes "graphical issues across GTA Online and Story Mode". After downloading the 4.7GB update on PS4, we can confirm that parallax occlusion mapping is back in the game, and based on initial observations, the performance improvements seen in patch 1.09 remain in effect on the Sony platform.​
 
Two patches to fix things on the PS4 and none yet for XB1?
Maybe there are different processes for approving patches on the different platforms and the Xbox One patch is in the pipe. Based on various GAF users playing Dying Light on PS4 it seems patches can even vary region to region; the USA had a bunch of small patches and the EU had fewer-but-larger patches :runaway:

All of the patch rules are out of the window!
 
Rockstar have released patch 1.10 which restores POM, seemingly with no impact to the performance improvements introduced in the previous two patches. Happy Easter! :yes:

UPDATE 2/4/15 5:41pm: Rockstar has just rolled out patch 1.10 for Grand Theft Auto 5, which it says fixes "graphical issues across GTA Online and Story Mode". After downloading the 4.7GB update on PS4, we can confirm that parallax occlusion mapping is back in the game, and based on initial observations, the performance improvements seen in patch 1.09 remain in effect on the Sony platform.​

Nice. That being said, my PS4 edition shall be retired within 12 days. :mrgreen:
 
The Xbox One is brain bound. Whoever made the decision to match the PS4 version point by point, with 1080p native and every single feature included at the same quality, knowing full well the framerate would suffer, and knowing that even the PS4 can't run it like that locked at 60fps, is the real issue here.
 
I mentioned transparency fillrate limitations during the GTA V comparison, with grass and shrubs etc. being cut back on Xbox One, the culprit most likely being half the ROPs.

I think this is going to be the same issue with The Witcher 3 on XB1. Every XB1 Witcher video that I have seen seems to have framerate hitching or stuttering issues. I'm surprised DF hasn't done a video analysis yet...
 
I think this is going to be the same issue with The Witcher 3 on XB1. Every XB1 Witcher video that I have seen seems to have framerate hitching or stuttering issues. I'm surprised DF hasn't done a video analysis yet...
It could be that they don't have good enough footage to perform an analysis on yet, or they are working on it and haven't released it yet.
 