The Power Of Refresh Rates

Cowboy X

This quote says it all (courtesy of THG):

" Suitability In Practice: AMD Plainly For Gaming Fans

If you're looking for a high-performance system for 3D games, you could do no better than to go with the AMD system. In particular, it does not create problems in the SLI setup with the nForce4 SLI chipset. The extra performance produced by the increased refresh rate makes itself noticeable above all in mainstream 3D games. "

Who needs pipelines ??

Gimme some more screen refreshes :)
 
Rofl, increasing the refresh rate actually decreases performance because it uses up memory bandwidth.
 
bloodbob said:
Rofl, increasing the refresh rate actually decreases performance because it uses up memory bandwidth.

Stop trying to rain on our parade .................. this is a new and legitimate optimisation :D Shader replacement with screen refreshes.
 
Hmmm... Is there any technical reason why LCD screens are locked into fixed refresh rates, rather than just being refreshed only when a new frame is actually ready to be displayed?
 
They aren't locked to fixed refresh rates, at least not on the interface side. However, updating all pixels is a process that takes time, and given the high response times there probably isn't much reason to spend money on components that could work faster.
 
I think arjan was thinking about temporarily lowering the refresh rate, and the same idea has hit me too. Can't a TFT show a stable image at significantly lower refresh rates than 60 Hz? That could be used in games. So if your game isn't ready with the next frame after 1/60 of a second, don't resend the old frame. Wait until the new frame is ready, and send it directly when it is. That effectively gets rid of the quantized framerates you get with vsync enabled, with lower latency than triple buffering.

Ie a refresh rate that varies dynamically to follow how fast you can render.
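To illustrate the "quantized framerates" point, here's a small Python sketch (the 60 Hz figure and the per-frame render times are just assumptions, not anything measured):

```python
REFRESH = 1 / 60   # assumed fixed refresh period, in seconds

def fps_with_vsync(render_time):
    """With vsync, each frame occupies a whole number of refresh intervals,
    so the framerate snaps to 60, 30, 20, 15, ... fps."""
    intervals = max(1.0, -(-render_time // REFRESH))  # ceil to whole refreshes
    return 1 / (intervals * REFRESH)

def fps_send_when_ready(render_time):
    """With the proposed scheme, the frame goes out as soon as it is rendered."""
    return 1 / render_time

for ms in (15, 17, 25):
    rt = ms / 1000
    print(f"{ms} ms/frame: vsync {fps_with_vsync(rt):5.1f} fps, "
          f"send-when-ready {fps_send_when_ready(rt):5.1f} fps")
```

A 17 ms frame drops straight from 60 to 30 fps under vsync, while the send-when-ready scheme keeps ~59 fps.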
 
Oops :oops:
I guess that should indeed be possible; however, a fixed framerate is better (if sustainable).
 
Basic said:
Ie a refresh rate that varies dynamically to follow how fast you can render.
It would confuse the hell out of the monitor; it would lose synch all the time. Older monitors could be wrecked in this manner, but I think newer ones are smarter. The screen would likely go black constantly, however, so you couldn't see anything anyway; most unfortunate! :D

Besides, the savings of going below 60Hz would be limited, to say the least. Doubling from 60 to 120 makes bandwidth usage jump by 100%, but lowering from 60 to 40 only saves 33%, and I don't think anyone would want to game at less than 40Hz even if it were possible (many monitors won't synch to anything less than 50Hz).
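For what it's worth, scanout bandwidth scales linearly with refresh rate, so the percentages are easy to sanity-check (the 1600x1200, 24-bit framebuffer is just an assumed example):

```python
def scanout_bandwidth(width, height, bytes_per_pixel, refresh_hz):
    """Bytes per second read from the framebuffer just to feed the display."""
    return width * height * bytes_per_pixel * refresh_hz

base = scanout_bandwidth(1600, 1200, 3, 60)               # ~346 MB/s at 60 Hz
print(scanout_bandwidth(1600, 1200, 3, 120) / base - 1)   # 1.0   -> +100% at 120 Hz
print(scanout_bandwidth(1600, 1200, 3, 40) / base - 1)    # -0.33 -> -33% at 40 Hz
```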
 
Guden Oden said:
Basic said:
Ie a refresh rate that varies dynamically to follow how fast you can render.
It would confuse the hell out of the monitor; it would lose synch all the time. Older monitors could be wrecked in this manner, but I think newer ones are smarter. The screen would likely go black constantly, however, so you couldn't see anything anyway; most unfortunate! :D

Besides, the savings of going below 60Hz would be limited, to say the least. Doubling from 60 to 120 makes bandwidth usage jump by 100%, but lowering from 60 to 40 only saves 33%, and I don't think anyone would want to game at less than 40Hz even if it were possible (many monitors won't synch to anything less than 50Hz).
You've missed the point completely, it seems. arjan and Basic are just talking about a pseudo-vsync that exploits the lack of a fixed refresh rate on LCDs to prevent the tearing you get when you don't use vsync, without limiting the possible framerate to a factor of the refresh rate.

and wooh, all I did was regurgitate Basic's post.
 
You could have some special vsync signal sent that says ditto: display the current frame again. That avoids sending along data that is already being displayed. I think video cards are programmable enough to create any new standard of vsync signal; it would just need new monitors supporting the feature. Basically, the idea is to keep everything digital and possibly buffered, and you could even do the same thing with CRT monitors, not just LCDs. Take the legacy out of it all; we don't need to be driving electron guns directly with analog signals along cables today.

All for the sake of what, though, saving a bit of gfx card memory bandwidth if you happen to be using a special monitor? Even if you saved 99% of all the related memory bandwidth in some application getting 1fps or something, it's still peanuts compared to the total available RAM bandwidth of today's cards.

Tearing is when you swap the data going to the monitor before the end of the previous frame; the only way to stop that is to always send complete frames. Really, if your game can do 100fps, then unless your display can update 100 times a second, you would still get tearing; you'd just be moving the problem.
 
Yep, The Baron got it right.

Himself:
No, a special vsync signal that says "reuse the old frame" would not give the same effect. And saving a bit of gfx card memory has nothing to do with it. (It's just a secondary, and pretty insignificant, benefit.)

A TFT screen is in a way similar to a DRAM. Each color component of a pixel works like one DRAM bit. It consists of a transistor to enable updating of the stored value, and the value is stored as a charge in a capacitor. The difference is that a TFT can store an analog value, and that the charge controls the LC.

Now the idea was to use this fact, that a TFT is like a memory. So instead of refreshing it with the old frame, don't do anything. Just let it keep its old value. Then wait until the new frame is ready, and directly after that, update the screen.

The benefit is that you don't need to wait for a redundant screen refresh to finish when your new frame is done. It saves between 0 and 1 screen refresh of latency (depending on what "phase" it is in) compared to triple buffering. And it would be a very small change to the hardware. (There's even a chance that it could work without any HW changes, just BIOS changes.)
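To make the "TFT as memory" point concrete, here's a toy sketch of what the scanout decision could look like each refresh slot (purely illustrative, not how any real controller is written):

```python
def scanout_tick(new_frame, panel_image):
    """One hypothetical refresh slot for a 'hold'-capable TFT.

    new_frame   -- the freshly rendered frame, or None if the game isn't done yet
    panel_image -- what the pixel cells are currently storing (their charge)
    """
    if new_frame is not None:
        return new_frame       # drive the panel exactly when new data exists
    # A conventional controller would re-send panel_image here; the proposal
    # is to send nothing and let each cell's capacitor keep holding its value.
    return panel_image         # unchanged, no scanout traffic needed

# Example: three refresh slots where the renderer only finishes once.
image = "frame 0"
for incoming in (None, "frame 1", None):
    image = scanout_tick(incoming, image)
    print(image)               # frame 0, frame 1, frame 1
```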
 
You guys have now ruined a perfectly good thread with postings containing logic and knowledge :( .................. :).
 
Guden Oden said:
Basic said:
Ie a refresh rate that varies dynamically to follow how fast you can render.
It would confuse the hell out of the monitor; it would lose synch all the time. Older monitors could be wrecked in this manner, but I think newer ones are smarter. The screen would likely go black constantly, however, so you couldn't see anything anyway; most unfortunate! :D
On a legacy CRT monitor, this would be a perfectly valid complaint. But why does this need to be the case with TFT monitors too? It would seem to me to be trivial to build a TFT monitor so that it could e.g. accept a variable-length vsync signal, so why aren't TFT monitors (and graphics cards) actually built with this capability?
Besides, the savings of going below 60Hz would be limited, to say the least. Doubling from 60 to 120 makes bandwidth usage jump by 100%, but lowering from 60 to 40 only saves 33%, and I don't think anyone would want to game at less than 40Hz even if it were possible (many monitors won't synch to anything less than 50Hz).
As Basic pointed out, adjusting the refresh rate to follow the frame rate like I suggested would allow a frame to be displayed immediately after it has finished rendering, making triple buffering redundant: no tearing, roughly half a frame less latency on average, and a considerably smaller memory footprint.
 
digitalwanderer said:
Ok, I've heard enough...how do I overclock my CRT? :|
As in, running the monitor above its advertised hsync/vsync rates, thus giving you a higher refresh rate at your favorite resolution?

http://www.nvidia.com/object/custom_resolutions.html seems to offer what you want if you are running an Nvidia card under Windows, not sure if ATI offers anything similar (if not, there are, or at least were, registry hacks that would do it for you). AFAIK, most CRT monitors will withstand about 5-10% overclocking - if you go too far, the monitor will either lose sync (black picture or garbage, harmless) or overheat (dangerous; not common but consider yourself warned).
 
Basic said:
Yep, The Baron got it right.

Himself:
No, a special vsync signal that says "reuse the old frame" would not give the same effect. And saving a bit of gfx card memory has nothing to do with it. (It's just a secondary, and pretty insignificant, benefit.)

The benefit is that you don't need to wait for a redundant screen refresh to finish when your new frame is done. It saves between 0 and 1 screen refresh of latency (depending on what "phase" it is in) compared to triple buffering. And it would be a very small change to the hardware. (There's even a chance that it could work without any HW changes, just BIOS changes.)

I am not sure what you are trying to accomplish, lol. The pixels of an LCD can only change state so fast, so you have an upper bound on its "refresh rate" regardless of signals.
 
Himself said:
Basic said:
Yep, The Baron got it right.

Himself:
No, a special vsync signal that says "reuse the old frame" would not give the same effect. And saving a bit of gfx card memory has nothing to do with it. (It's just a secondary, and pretty insignificant, benefit.)

The benefit is that you don't need to wait for a redundant screen refresh to finish when your new frame is done. It saves between 0 and 1 screen refresh of latency (depending on what "phase" it is in) compared to triple buffering. And it would be a very small change to the hardware. (There's even a chance that it could work without any HW changes, just BIOS changes.)

I am not sure what you are trying to accomplish, lol. The pixels of an LCD can only change state so fast, so you have an upper bound on its "refresh rate" regardless of signals.
Let's say that we have a TFT screen with a pixel response time of, say, 10 ms and thus a maximum refresh rate of 100 Hz, and a graphics card that constantly sends one frame to it every 10 ms. Now, let's assume we have a game that renders one frame every 15 ms (~66 fps). For at least half the frames, we then get the situation where the new frame cannot be sent to the monitor immediately without tearing, because the DAC is still busy re-sending the previous frame to the monitor. What I am proposing is that when the DAC has no new frame to send, it should stall (by e.g. extending the vsync signal; I see no reason why such a stall period MUST be a multiple of 10 ms in this scenario) until a new frame is ready, rather than sending the entire old frame over again. This would result in reduced latency from when a frame finishes rendering until it is displayed on screen, in the case where the game's frame rate is LOWER than the maximum refresh rate of the screen. My question still remains: why isn't this done, or at least possible, with TFTs?
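To put rough numbers on that scenario, here's a quick sketch of the extra latency per frame (it uses only the 10 ms and 15 ms figures assumed above, and for simplicity ignores whether a previous scanout is still in flight):

```python
SCANOUT = 10.0   # ms: panel's minimum full-refresh period (100 Hz ceiling)
RENDER  = 15.0   # ms per frame, i.e. ~66 fps

for i in range(1, 5):
    done = i * RENDER                       # when frame i finishes rendering
    fixed = -(-done // SCANOUT) * SCANOUT   # next 10 ms slot it can go out in
    print(f"frame {i}: done at {done:4.0f} ms, "
          f"fixed refresh shows it at {fixed:4.0f} ms (+{fixed - done:.0f} ms), "
          f"stalled vsync shows it at {done:4.0f} ms (+0 ms)")
```

Every other frame eats an extra 5 ms of latency under the fixed schedule, while the stall-until-ready scheme adds none.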
 