Micro-stuttering and how to solve it

Humus

Since micro-stuttering is an issue for some people, I thought it would be interesting to see if there's a way to solve it. Personally I'm not bothered by it and haven't actually been able to even see it, but I can prove it's happening with Fraps on my 3870X2. So I spent a couple of hours implementing a simple waiting mechanism to feed the GPUs at a more balanced rate. For best stability you want the second GPU to start working only once the first GPU is mid-frame (assuming two GPUs). This is the result:

[Image: MicroStutter.png]


Now, since I'm not able to see the micro-stuttering in the first place I can't really say if it's any smoother in practice, but at least the Fraps graph looks good. :)

What I did was first to create a GPU-limited test case using my InteriorMapping demo, slowing it down a bit by adding work to the shader until it ran at about 120 fps. Micro-stuttering happens when you're GPU limited, since you're feeding commands to the GPUs faster than they can accept them. The more GPU limited you are, the worse the problem gets: the GPUs start working on their frames shortly after each other and, after crunching through them, finish almost at the same time too, which makes one frame appear for 1 ms and the next for 15 ms instead of both for 8 ms, as shown in the graph.
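The clumping can be seen in a toy simulation of alternate-frame rendering (this is just an illustration with assumed numbers, not the demo's actual timing: a 1 ms CPU submit cost and a 16 ms GPU frame cost):

```python
# Toy model of alternate-frame rendering WITHOUT pacing (numbers assumed):
# the CPU submits a frame every 1 ms, each GPU needs 16 ms per frame.
GPU_TIME = 16.0   # ms per frame on the GPU
CPU_TIME = 1.0    # ms per frame on the CPU

gpu_free = [0.0, 0.0]    # when each GPU finishes its current frame
finish = []
for i in range(8):
    submit = i * CPU_TIME             # commands are fed as fast as possible
    g = i % 2                         # even frames -> GPU 0, odd -> GPU 1
    start = max(submit, gpu_free[g])  # a GPU can't start before its last frame ends
    gpu_free[g] = start + GPU_TIME
    finish.append(gpu_free[g])

intervals = [b - a for a, b in zip(finish, finish[1:])]
print(intervals)   # [1.0, 15.0, 1.0, 15.0, 1.0, 15.0, 1.0]
```

Both GPUs start almost simultaneously, so they also finish almost simultaneously: the 1 ms / 15 ms alternation instead of a steady 8 ms.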

So I inserted an event query at the end of each frame. I use two queries, one per GPU. At the end of each frame the CPU syncs on the query and counts how long it has to wait until this GPU's previous frame is complete. I then sum up all the wait that was necessary and simply distribute it equally over the next two frames, which interleaves the frames better on the GPUs. Best of all, the framerate did not change because of this.
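The mechanism can be sketched in the same toy model (a rough reconstruction of the idea, not Humus's actual D3D code; the event-query sync becomes a simple `max()` here, and all timings are assumed):

```python
GPU_TIME = 16.0   # ms per frame on the GPU (assumed)
CPU_TIME = 1.0    # ms per frame on the CPU (assumed)

t = 0.0                    # CPU clock
gpu_finish = [0.0, 0.0]    # finish time of the last frame submitted to each GPU
delay = [0.0, 0.0]         # pacing delay distributed over the next two frames
finish = []
for i in range(12):
    t += CPU_TIME + delay.pop(0)     # build commands, plus the distributed delay
    delay.append(0.0)
    g = i % 2                        # alternate-frame rendering
    prev_done = gpu_finish[g]        # event query: when this GPU's previous frame ends
    start = max(t, prev_done)        # GPU starts once its previous frame is done
    gpu_finish[g] = start + GPU_TIME
    finish.append(gpu_finish[g])
    wait = max(0.0, prev_done - t)   # how long the sync on the query stalls the CPU
    t += wait
    delay[0] += wait / 2             # spread the measured wait equally
    delay[1] += wait / 2             # over the next two frames

intervals = [b - a for a, b in zip(finish, finish[1:])]
print(intervals[2:])   # after start-up it settles to an even [8.0, 8.0, ...]
```

Each GPU still delivers a frame every 16 ms, so the throughput is unchanged; only the phase between the two GPUs is shifted.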

I think something like this should be added to the driver with a simple checkbox in CCC to enable it. It can probably also be made more accurate than my simple prototype.
 
The problem is that with your example the rendering will settle into a steady state with the same FPS as the stuttered rendering. The more variation there is in the rendering times, the more the delays will eat into benchmark scores ... which is not necessarily a bad thing from a user's point of view, but from ATI's point of view the stutter costs them nearly nothing (it's just not made a big deal of), while a reduction in benchmark scores costs far more. At least in the short term.

Or in other words, someone will probably need to write their own driver hook; I don't see ATI doing it (there is information out there on how FRAPS does its thing).
 
Well, if it's a checkbox option in CCC you could test it both ways, and I have a feeling ATI would recommend testing with it OFF for flat-out speed tests.

Brilliant work Humus, thanks!
 
It's been my observation that you won't 'see' micro-stuttering at very high FPS because it's hard to differentiate the higher it is, particularly on typical LCD monitors. It's noticeable (to me) at around 35-65 FPS actual, where the scene might only appear to be rendering at 20-50, depending on how bad it is.
 
Here's an idea for your test: include a pseudo-random number generator and a dynamic loop that counts up to the random number. It should give the same sequence every time so the test is repeatable, and it should be close to the worst possible scenario for micro-stutter. Another interesting test scenario would be a render to texture every couple of frames, then using the textures in the following frames.
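The repeatable-sequence part might look something like this (a hypothetical harness; the function name and value range are made up for illustration):

```python
import random

def workload_sequence(n_frames, seed=1234):
    """Repeatable per-frame loop counts for a worst-case stutter test."""
    rng = random.Random(seed)   # fixed seed -> identical sequence every run
    return [rng.randint(1_000, 50_000) for _ in range(n_frames)]

# The dynamic loop in the shader would count up to each frame's value;
# identical seeds keep benchmark runs comparable:
assert workload_sequence(5) == workload_sequence(5)
```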

Implementing an improved method of v-sync should be able to combat the problem almost entirely. Yes, the framerate gets capped to some degree, but the frames would come at fairly predictable intervals.
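One way to picture that "improved v-sync": clamp each present so it never comes sooner than a fixed interval after the previous one. This is only a toy sketch of the clamping rule; a real implementation would also need backpressure so the frame queue can't grow without bound.

```python
TARGET = 16.0  # ms between presents, roughly 60 Hz (assumed)

def paced_presents(finish_times):
    """Delay each present to at least TARGET ms after the previous one."""
    presents, last = [], None
    for f in finish_times:
        p = f if last is None else max(f, last + TARGET)
        presents.append(p)
        last = p
    return presents

# Frames finishing in the stuttered 1 ms / 15 ms pattern...
stuttered = [16.0, 17.0, 32.0, 33.0, 48.0, 49.0]
print(paced_presents(stuttered))   # ...are shown at even 16 ms intervals instead
```

This is exactly the trade-off described above: the presentation rate is capped, but the intervals become predictable.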
 
I think something like this should be added to the driver with a simple checkbox in CCC to enable it. It can probably also be made more accurate than my simple prototype.

Yes, it's a pretty obvious solution, which I discussed with some people years ago when talking about multithreaded software rendering.
I originally assumed that this is what drivers did anyway (the driver can just keep track of average frame rendering times and get pretty accurate estimates of when to start the next frame, more or less like what you described), until people started reporting micro-stutter issues.

I can think of only one reason why the drivers don't: your estimate won't be 100% perfect and will have to be adjusted continuously, so you might get a slightly lower framerate than when you brute-force the commands through as quickly as possible.
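The "keep track of average frame times" part could be as little as an exponential moving average that the driver continuously adjusts (a sketch of the estimator only, with assumed names; the imperfection mentioned above is exactly that it lags behind sudden load changes):

```python
def ema(frame_times, alpha=0.1):
    """Continuously adjusted estimate of the GPU frame time (ms)."""
    est = frame_times[0]
    for t in frame_times[1:]:
        est += alpha * (t - est)   # move a fraction of the way toward each sample
    return est

# With two GPUs the driver could delay each submission by roughly est / 2.
# A steady load converges exactly; a sudden jump is only tracked gradually:
print(ema([16.0] * 20))   # 16.0
```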
 
I can think of only one reason why the drivers don't: your estimate won't be 100% perfect and will have to be adjusted continuously, so you might get a slightly lower framerate than when you brute-force the commands through as quickly as possible.

Which would be a good argument for making it an optional tickbox. This way the user can choose to have slightly higher framerates and ignore microstuttering, or slightly lower, but smoother framerates.
 
Which would be a good argument for making it an optional tickbox. This way the user can choose to have slightly higher framerates and ignore microstuttering, or slightly lower, but smoother framerates.

Of course, but I suppose the driver writers are just in denial on this issue.
I don't think anyone has ever officially admitted that this problem even exists, and you can't fix problems that don't exist, right?
 
Wow, so simple and clean! Utterly brilliant, Humus. I for one have noticed micro-stuttering in HL2, weirdly enough, but it could be because my hard drives are really badly fragmented right now. I had never noticed any form of micro-stuttering till this weekend. I'm going to defrag and give the game a shot some time today, but still, very brilliant work!
 
Wow, so simple and clean! Utterly brilliant, Humus. I for one have noticed micro-stuttering in HL2, weirdly enough, but it could be because my hard drives are really badly fragmented right now. I had never noticed any form of micro-stuttering till this weekend. I'm going to defrag and give the game a shot some time today, but still, very brilliant work!
Valve has a built-in defrag mechanism for Steam. Try that first.
 
Of course, but I suppose the driver writers are just in denial on this issue.
I don't think anyone has ever officially admitted that this problem even exists, and you can't fix problems that don't exist, right?

You're wrong. On both accounts.
 
You're wrong. On both accounts.

In that case, point me to an official admission that the problem exists and an intention to fix it.
If you fail to do so, I must be right. Which I probably am anyway, else you would have done so in your previous post. But thanks for playing.
Intelligent, civilized people don't just go screaming "You're wrong" without any kind of argumentation whatsoever. Instead, they point out the facts, without getting personal or anything. Which leads to a much better discussion and atmosphere altogether.
 
@suryad - Microstuttering != Visible stutters.

Micro-stuttering causes the viewer to perceive lower framerates than what's actually rendered, and shouldn't be confused with the usual stuttering: the plainly visible hitching or pausing (no matter how brief) he might get from HD activity or from having his settings too high.

You can't *see* micro-stuttering. You only get the impression of lower FPS.
 
Learn German, it's good for you:

http://www.pcgameshardware.de/aid,6...oruckler_Erste_Statements_von_AMD_und_Nvidia/

And while you're at it, study manners too; no one screamed at you, so stop acting like a butthurt newbie. I have about zero interest in your person; I was merely addressing a statement which is false if stated as an absolute, as it suggests inside knowledge that you lack (knowledge of what the IHVs are doing).

Actually, your manners aren't perfect either. ;)

Also, more people would be better off if all good articles on the web were written in English, don't you think? ;)

And I'm only saying this because I can read German myself (writing it is another matter ...), and I already know there's a lot of good ICT material to read in German - I've been a 'fan' of the German c't magazine for a very long time. ;)
 
Learn German, it's good for you:

http://www.pcgameshardware.de/aid,6...oruckler_Erste_Statements_von_AMD_und_Nvidia/

And while you're at it, study manners too; no one screamed at you, so stop acting like a butthurt newbie. I have about zero interest in your person; I was merely addressing a statement which is false if stated as an absolute, as it suggests inside knowledge that you lack (knowledge of what the IHVs are doing).

The scheduling of these frames (or rather, the command buffers) is handled by the kernel, at least under Vista, which means this behaviour cannot be influenced through the driver either.

Unless my German fails me (which it doesn't, since we Dutch bother to learn other languages in school), they claim that the driver cannot fix the issue and put the responsibility on the Vista kernel.
Sounds like denial to me.
In fact, they actually admit that they go for maximum framerate, because they believe that's what the user wants.

nVidia says no more than that they're 'investigating the issue'.

Also, I still say it's extremely poor form to just shout "You're wrong". You could have posted the link right away; at least then you'd have had something to back up your disagreement, even though it doesn't quite disprove what I said.
Aside from the fact, of course, that I was referring to official statements, which by definition can never be 'inner knowledge'.
 