Great post throughout, with some good ideas thrown in.
Originally Posted by Andrew Lauritzen
So during sections B and C the GPU may well be happily delivering frames to the display at a consistent rate, so where's the problem? Remember, the CPU is timing the rate at which it is allowed to fill the input queue and using that as the rate at which it updates the game simulation. Thus a long frame like the one at B causes it to conclude that the rendering pipeline has slowed all-of-a-sudden, and to start updating the game simulation by 38ms each frame instead of 16ms. i.e. onscreen objects will move more than twice as far on the frame following B as they did previously. In this case though, it was just a spike and not a change in the underlying steady state rate, so the following frames (C) effectively have to move less (10ms each) to resynchronize the game with the proper wall clock time. This sort of "jump ahead, then slow down" jitter is extremely visible to our eyes, and demonstrated well by Scott's follow-up video using a high speed camera. Note that what you are seeing are likely not changes in frame delivery to the display, but precisely the effect of the game adjusting how far it steps the simulation in time each frame.
This seems to be a good ID of the problem involved. And I am skeptical, as 3d already pointed out, that the driver is optimizing the shaders between frames. My hunch is that it is almost certainly due to the variations in CPU load alone, though there might be other issues of a similar nature involved. There seems to be a rather simple solution to this problem of object-level stutter: the game is using too simple a heuristic to calculate the time step of the game physics. ALL of the frame-time volatility is immediately reflected in the time steps, which is what causes the stutter.
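To make the point concrete, here's a toy sketch of the heuristic described above; it assumes (my assumption, not confirmed game code) that the simulation simply steps by the previous frame's measured duration, so a single spike passes straight through to the physics:

```python
def naive_timestep(frame_times_ms):
    """Return the simulation step used on each frame: the previous
    frame's measured duration, passed through unfiltered."""
    steps = []
    prev = frame_times_ms[0]
    for t in frame_times_ms:
        steps.append(prev)  # all frame-time volatility lands in the step
        prev = t
    return steps

# A spike like the one at B (38 ms), then catch-up frames like C (10 ms):
frames = [16, 16, 38, 10, 10, 16, 16]
print(naive_timestep(frames))  # → [16, 16, 16, 38, 10, 10, 16]
```

The 38 → 10 → 10 swing in the output is exactly the "jump ahead, then slow down" jitter from the quote.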
The embedded world has to deal with such problems all the time, and it has a fairly well standardized solution: PID controllers. These are a robust way to smooth out variations in an input signal. The existing schemes for picking the time step are too simple, and hence too volatile.
A PID controller drives a measured value toward a goal (setpoint) signal by minimizing the error between them. A game could use the average frame time over the last 10 frames as that goal signal to smooth out the frame-to-frame variation.
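A minimal sketch of what I mean (purely illustrative, not from any shipping engine; the class name and gains are made up and would need tuning). The setpoint is the 10-frame average frame time, and the error term is the drift between accumulated wall-clock time and accumulated simulation time, so smoothing never lets the game fall out of sync with real time:

```python
from collections import deque

class PIDTimestep:
    """Sketch: smooth the simulation time step with a PID controller."""

    def __init__(self, kp=0.1, ki=0.02, kd=0.05, window=10):
        self.recent = deque(maxlen=window)  # last `window` frame times
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0
        self.wall_time = 0.0  # real elapsed time, ms
        self.sim_time = 0.0   # time the simulation has been stepped, ms

    def step(self, frame_time_ms):
        self.recent.append(frame_time_ms)
        self.wall_time += frame_time_ms
        setpoint = sum(self.recent) / len(self.recent)  # smoothed goal signal
        error = self.wall_time - self.sim_time          # drift from wall clock
        self.integral += error
        derivative = error - self.prev_error
        self.prev_error = error
        # Base step is the smoothed average; the PID terms nudge it
        # just enough to close the drift over the following frames.
        dt = (setpoint + self.kp * error
              + self.ki * self.integral + self.kd * derivative)
        self.sim_time += dt
        return dt

ctrl = PIDTimestep()
frames = [16] * 4 + [38, 10, 10] + [16] * 13  # a spike like B, then C frames
steps = [ctrl.step(t) for t in frames]
# The 38 ms spike is absorbed: the steps stay in a narrow band instead of
# jumping to 38 and then dropping to 10 like the naive heuristic does.
```

The key design point is that the error is cumulative drift rather than the raw per-frame delta, so a spike is amortized over many frames instead of being replayed into the physics all at once.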