DX12 Performance Discussion And Analysis Thread

That's... what "solid VSynced 60 FPS" means. If I disabled VSync I'd probably get an average over 90 FPS with dips to 70.
I know, it's just that whenever you talk to people and they say "my PC's rock-stable / hitting solid 60 FPS / whatever", and you dig a bit further, it turns out there are usually caveats like "mostly 60 FPS in Skyrim, except when I get into a fight with lots of spells whirring around", and the like. :)

Plus, what Andrew said.
 
Right, but the lie of AFR is that you're not actually reducing your latency to what you'd get with double-buffered "solid 60 VSynced" on a single GPU. You still have an extra frame of lag in there, as if it were only 30.
With solid VSynced 60 FPS, I see no difference between Crossfire and single GPU.
I'm not saying there isn't some measurable difference like you suggest, but it seems to me that this difference is completely negligible.

To whom does this single frame of lag (16.6 milliseconds?) make a difference, realistically?
Top-notch pro gamers who will just turn off every non-essential detail to get higher and more reliable framerates, regardless?


:S So the image on the screen may look smoother (debatable vs. g-sync though IMHO), but it doesn't actually help the "feel" much.
Again, not my perception at all. I also doubt the majority of people who play games would share your lag perception.
But I guess nothing short of a proper controlled study with a sizeable sample of people (with placebos thrown to the mix, that would be fun) would prove either of us right or wrong.
 
To whom does this single frame of lag (16.6 milliseconds?) make a difference, realistically?
VR applications, definitely.
Besides, there is not just that single frame of latency added by AFR, but easily another 1-4 frames between simulation state and screen, depending on the render pipeline and buffer setup.

Yes, if AFR is properly synchronized it can actually yield more intermediate frames and provide a smoother animation. But it's not helping with the latency at all.
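To put rough numbers on that, here's a back-of-the-envelope latency model (a sketch with assumed queue depths, not a measurement of any particular driver or game; LatencyMs is just a made-up helper):

// Back-of-the-envelope input-to-display latency (illustrative only).
// Assumptions: fixed frame time, "framesInFlight" lumps together CPU buffering
// and the GPU queue, and AFR adds (numGpus - 1) extra frames because each GPU
// is working on an older frame than the one currently being displayed.
#include <cstdio>

static double LatencyMs(double fps, int framesInFlight, int numGpus)
{
    const double frameMs = 1000.0 / fps;
    return (framesInFlight + (numGpus - 1)) * frameMs;
}

int main()
{
    // Double-buffered, VSynced 60 on a single GPU vs. the same 60 via 2-GPU AFR.
    std::printf("single GPU @60, 2 frames in flight: %.1f ms\n", LatencyMs(60.0, 2, 1));
    std::printf("2-GPU AFR  @60, 2 frames in flight: %.1f ms\n", LatencyMs(60.0, 2, 2));
    // Deeper queues (driver pre-rendered frames, etc.) add whole frames on top.
    std::printf("2-GPU AFR  @60, 4 frames in flight: %.1f ms\n", LatencyMs(60.0, 4, 2));
    return 0;
}

So the screen updates just as often, but what you see is consistently one or more frames further behind your input.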
 
To whom does this single frame of lag (16.6 milliseconds?) make a difference, realistically?
Instead of absolute lag, isn't it more about lag difference between multiple parties in a multiplayer setup of otherwise equally capable (but not necessarily top) players?

If one player has a full pipeline lag of 3 frames and the other has a lag of 4, and both are just amateur players of the same level, the latter will probably end up with a lower k/d ratio, even if neither of them would be able to notice the difference in absolute lag by themselves.
 
If one player has a full pipeline lag of 3 frames and the other has a lag of 4, and both are just amateur players of the same level, the latter will probably end up with a lower k/d ratio, even if neither of them would be able to notice the difference in absolute lag by themselves.

I really don't think that one frame would make any difference for people who don't train towards e-sports levels.
Regardless, it would be really fun to run some sort of formal study on this, especially with placebos in the mix.
 
Instead of absolute lag, isn't it more about lag difference between multiple parties in a multiplayer setup of otherwise equally capable (but not necessarily top) players?

If one player has a full pipeline lag of 3 frames and the other has a lag of 4, and both are just amateur players of the same level, the latter will probably end up with a lower k/d ratio, even if neither of them would be able to notice the difference in absolute lag by themselves.
If you are talking about multiplayer games, the server usually has a fixed update rate (like 25/30/60/64/96/128 updates per second); moreover, you have to consider each player's network connection quality, plus additional device latency noise. One or two frames of latency is usually not a big deal even in multiplayer games. Depending on the network code, a bigger deal could be a higher frame rate, since it lets the client send more accurate data to the server.
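As a rough illustration of where the milliseconds go in that kind of setup (every number below is an assumption picked for the example; they vary wildly per game, per server and per player):

// Rough latency budget for a client-server game (illustrative numbers only).
#include <cstdio>

int main()
{
    const double serverTickMs   = 1000.0 / 64.0; // assumed 64 Hz tick rate (~15.6 ms)
    const double networkRttMs   = 40.0;          // assumed round trip time
    const double renderFrameMs  = 1000.0 / 60.0; // one frame at 60 fps
    const int    framesInFlight = 2;             // assumed CPU/GPU queue depth

    const double total = serverTickMs / 2.0      // average wait for the next tick
                       + networkRttMs
                       + framesInFlight * renderFrameMs;

    std::printf("total ~%.0f ms; one extra rendered frame adds %.1f ms (~%.0f%%)\n",
                total, renderFrameMs, 100.0 * renderFrameMs / total);
    return 0;
}

With those assumptions a single extra frame is roughly a fifth of the end-to-end chain - noticeable on paper, but easily drowned out by jitter in the other terms.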
 
To whom does this single frame of lag (16.6 milliseconds?) make a difference, realistically?
It makes a difference in games where you aim/control the camera with a mouse, e.g. FPS. This is why those games typically already lower the default "maximum pre-rendered frames" (and indeed there are control panel settings to play with this for some games too) down to a single one. While this can even drop the frame rate slightly, it actually improves the experience and responsiveness.

In other genres it doesn't matter and it's fine to have an extra frame in there. But in those genres solid 30fps (or 30-60 w/ adaptive sync) is usually okay too.
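For anyone who wants to do this from the application side rather than the driver control panel, here's roughly what it looks like with the DXGI 1.3 waitable swap chain (a sketch with error handling omitted; it assumes the swap chain was created with the DXGI_SWAP_CHAIN_FLAG_FRAME_LATENCY_WAITABLE_OBJECT flag, and CapFrameLatency is just an illustrative helper name):

// Sketch: cap the number of queued frames at 1 via IDXGISwapChain2.
#include <windows.h>
#include <dxgi1_3.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void CapFrameLatency(IDXGISwapChain1* swapChain, HANDLE& frameLatencyWaitable)
{
    ComPtr<IDXGISwapChain2> swapChain2;
    swapChain->QueryInterface(IID_PPV_ARGS(&swapChain2));

    swapChain2->SetMaximumFrameLatency(1);  // allow at most one queued frame
    frameLatencyWaitable = swapChain2->GetFrameLatencyWaitableObject();
}

// Per frame, before sampling input and building the frame:
//   WaitForSingleObjectEx(frameLatencyWaitable, 1000, TRUE);
//   ...poll input, update, record/submit command lists, Present(1, 0)...

Blocking on the waitable object before sampling input is what buys the responsiveness: input is read as late as possible relative to when the frame actually reaches the screen.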
 
It makes a difference in games where you aim/control the camera with a mouse, e.g. FPS. This is why those games typically already lower the default "maximum pre-rendered frames" (and indeed there are control panel settings to play with this for some games too) down to a single one. While this can even drop the frame rate slightly, it actually improves the experience and responsiveness.

In other genres it doesn't matter and it's fine to have an extra frame in there. But in those genres solid 30fps (or 30-60 w/ adaptive sync) is usually okay too.
Yes. And FPS pros tend to prefer 120 Hz displays, as it further reduces the frame length (and thus latency) by 8.3 ms.

Latency is also important in other game genres. Fighting games are always locked to 60 fps, and most tournaments used CRTs instead of LCD HDTVs because LCDs completely ruined all the timings. Most fighting games have "just frame" inputs that require frame-perfect timing.

When I was developing the Trials games, I tested the effect of rendering latency on gameplay on consoles. A single frame of extra latency definitely affects the timing of the most difficult obstacles. Going from 60 fps to 30 fps, however, has a considerably bigger impact on gameplay.
 
When I was developing the Trials games, I tested the effect of rendering latency on gameplay on consoles. A single frame of extra latency definitely affects the timing of the most difficult obstacles. Going from 60 fps to 30 fps, however, has a considerably bigger impact on gameplay.
Yeah, there are definitely two components: your brain's ability to "predict" the motion of objects based on smooth motion (which depends on good frame rate and pacing/smoothness), and how 1:1 your input feels with what happens on the screen (which is primarily latency related). Ideally you want both, but I've just found the tradeoff of AFR - potentially more smoothness (again debatable vs. adaptive refresh in some cases) in exchange for more latency - not that compelling, even if you ignore the poor value proposition, as many of us routinely have access to multiple high-end GPUs if desired.

I'm definitely a convert to the high refresh monitors bandwagon, having recently acquired a 144Hz display. Even just on the windows desktop I can't go back to 60 now, and definitely not for FPS and similar "mouse controls camera" games. :)
 
Andrew, I'm pleased about the DX12 discussion, but when can I get some Anal ;)
 
I'm definitely a convert to the high refresh monitors bandwagon, having recently acquired a 144Hz display. Even just on the windows desktop I can't go back to 60 now, and definitely not for FPS and similar "mouse controls camera" games. :)

Yes, this is the main reason I'm not getting a high refresh monitor anytime soon. As I've moved entirely to 60 FPS gaming, I can't go back to 30 FPS no matter what, and hence it's ruined any enjoyment I used to get out of consoles. A move to 120 or 144 would likely have a similar impact, and it's much more difficult to maintain a consistent 120/144 than a consistent 60. There's also the fact that there are no high refresh rate 4K displays with large screens available. Having moved to a 49" 4K monitor (not a TV), I cannot go back to anything smaller for my workflow.

Regards,
SB
 
I'm definitely a convert to the high refresh monitors bandwagon, having recently acquired a 144Hz display. Even just on the windows desktop I can't go back to 60 now, and definitely not for FPS and similar "mouse controls camera" games. :)

I switched to a 144hz display a week ago and the difference on the desktop is immediately noticeable. Not feeling the same in all games though.

Witcher 2 is noticeably smoother but other games (Bioshock Infinite, Max Payne 3) are sluggish in comparison. Hard to explain - it feels like the game is updating player position and movement slower than the reported fps.
 
I switched to a 144hz display a week ago and the difference on the desktop is immediately noticeable. Not feeling the same in all games though.

Witcher 2 is noticeably smoother but other games (Bioshock Infinite, Max Payne 3) are sluggish in comparison. Hard to explain - it feels like the game is updating player position and movement slower than the reported fps.

Some games seem to have internal caps on parts of the simulation, so perhaps you're experiencing that? I remember Bioshock 2 on my old Opteron suffering from player movement being out of sync and slower than the frame rate. It was jarring, felt horrible.
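For what it's worth, the usual pattern behind that kind of behaviour is a fixed-timestep simulation decoupled from rendering, roughly like this (a generic sketch, not how any of those particular games are implemented; the 30 Hz step and the commented-out functions are placeholders):

// Generic fixed-timestep loop: the simulation advances at a fixed rate while
// rendering runs as fast as the display allows. If a game caps (part of) its
// simulation low, movement can feel slower than the reported frame rate.
#include <chrono>

void RunGameLoop()
{
    using clock = std::chrono::steady_clock;
    constexpr double simStep = 1.0 / 30.0;   // assumed fixed simulation rate

    double accumulator = 0.0;
    auto previous = clock::now();

    while (true /* until quit is requested */)
    {
        const auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        while (accumulator >= simStep)       // catch the simulation up in fixed steps
        {
            // UpdateSimulation(simStep);    // player movement, physics, AI...
            accumulator -= simStep;
        }

        // Render();                         // runs at 60/144/... fps regardless,
        //                                   // ideally interpolating between sim states
    }
}

If the renderer doesn't interpolate between simulation states, you get that "144 Hz display, but the movement doesn't feel like it" sensation.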
 