20 fps is not enough...

Pussycat

Newcomer
...but I am tired of explaining it to people. Could someone write a demo, like the old one from 3dfx, that SHOWS there is a difference between (for example) 30 and 60 fps, even if it's only a rotating cube? (Have you seen the low framerates of the DirectX 8 D3D test!!)

It would finally end the discussion, and it would also work for those who don't have a 3dfx card or a Glide wrapper left.

Thanks in advance :smile:
 
If someone says "we can't see more than 60 fps", what they should really be saying is "I can't see more than 60 fps". I've sat down at many a computer where the refresh rate was 60 Hz and the person using the computer had never noticed the flickering. Many people's eyes just aren't that good.
 
The way I see it (no pun intended), you have to know what to look for before you can see it. I didn't know the real benefits of FSAA or what it cleaned up (besides jaggies) until John Reynolds pointed it out. To this day, I still curse you, John, for making me super picky. ;) Anyway, one day I decided to really look at the difference FSAA makes, and once I started seeing the artifacts that no-AA produces, I've seen them ever since, even when I'm not looking for them. Now I can't really play games without FSAA enabled.

Same with refresh rates. It wasn't until I looked closely for the flicker that 60 Hz causes that I was able to see it, and now I notice it even when I'm not looking for it. I can't go below 75 Hz at all, and prefer 80 Hz+.
 
I don't know. Sometimes people just need to see the "better" results.

I remember that several years ago, when I used 3D Studio on a 486 back in the DOS era, I rendered a scene without anti-aliasing to speed things up. I found that the no-AA scenes looked like they were "in very low resolution" compared to the normal AA scenes, even though they were all rendered at the same resolution (640x480). That's when I realized the effect of high-quality AA.
 
Hi, I'm the author of 'FPS compare v0.2'. Nice to see that you remembered it, Matt! :smile:

I saw the original question and thought, cool, someone who might need FPSCompare, I'd better register and post right away, only to find that someone had beaten me to it. :cool:
 
After viewing the program I'm amazed at the difference 30/60 makes, but I don't understand how. After all, don't films run at 24 fps, and no one ever complains that they're jerky? Is it just monitors that this affects - would 30 fps be OK on a TV?
 
On 2002-03-05 18:04, king_iron_fist wrote:

After viewing the program I'm amazed at the difference 30/60 makes, but I don't understand how. After all, don't films run at 24 fps, and no one ever complains that they're jerky? Is it just monitors that this affects - would 30 fps be OK on a TV?

The reason is due to 3D game graphics sampling a single instant in time for each frame. This means you get temporal aliasing.

Films, OTOH, although not perfect, have the shutter open for a significant percentage of the frame time (approx. 1/48th of a second => 50%) and so "include" more of the motion. You still get some aliasing effects (e.g. some slight jerkiness when panning, and the notorious "wagon wheels going backwards"), but it still looks much smoother than a game.

Furthermore, you may have seen some TV sports coverage that has used "high-speed" video cameras to capture the images so that the freeze-frame 'still' images look sharp. You will notice, however, that when played at normal speed, the video doesn't look 'smooth'.
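
If anyone wants to see the "wagon wheels going backwards" effect in actual numbers, here's a minimal sketch in plain Python (the spoke count, frame rate, and rotation speed are illustrative values I've picked, not figures from any real film):

<pre>
# Temporal aliasing sketch: an 8-spoke wagon wheel filmed at 24 fps.
# The spokes look identical, so the camera can only record the wheel's
# angle modulo the 45-degree spoke spacing.

SPOKES = 8
SPOKE_SPACING = 360.0 / SPOKES   # 45 degrees between identical spokes

# Suppose the wheel really turns 40 degrees per frame (fast, forwards).
true_step = 40.0

angle = 0.0
for frame in range(6):
    apparent = angle % SPOKE_SPACING
    print(f"frame {frame}: true angle {angle:6.1f}, nearest spoke at {apparent:5.1f}")
    angle += true_step

# 40 degrees is congruent to -5 degrees modulo the 45-degree spacing,
# so the recorded spoke position drifts back 5 degrees every frame:
# the wheel appears to spin slowly backwards while really spinning
# fast forwards. A game that point-samples time aliases the same way.
</pre>

Film's half-open shutter smears each of those point samples across the frame time, which is exactly what softens the effect.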

_________________
"Your work is both good and original. Unfortunately the part that is good is not original and the part that is original is not good." - Samuel Johnson

[ This Message was edited by: Simon F on 2002-03-05 18:47 ]
 
I can see the jerkiness in films. Films are actually projected at 48 Hz (every frame displayed twice) to reduce flicker.

NTSC is not 29.97 fps; it is 59.94 fields per second, interlaced. The main reason for using an interlaced display is that 29.97 fps is not enough for smooth motion, but the bandwidth is not enough to encode more data. So they settled on interlacing to get somewhat smoother motion from the same amount of data.

Note that motion blur can significantly reduce the jerkiness of lower-frame-rate video. However, at frame rates as low as 30 fps, motion blur can be annoying to some people.
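
As an aside, the odd 29.97/59.94 numbers come from the 1000/1001 slowdown applied to the original 60 Hz rate when colour was added to NTSC; a quick check in Python:

<pre>
# NTSC colour timing: the original 60 Hz field rate was slowed by a
# factor of 1000/1001 to make room for the colour subcarrier.
field_rate = 60000 / 1001          # = 59.94006... fields per second
frame_rate = field_rate / 2        # two interlaced fields per frame

print(f"{field_rate:.5f} fields/s, {frame_rate:.5f} frames/s")
</pre>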
 
I want to see genuine motion blur effects on graphics cards. I don't mean stupid trails like you see in a Kyro II tech demo, but genuine motion blur up to the standard found in 3D apps. Admittedly, when I enable it in Vue d'Esprit the render gets 4 times slower, but is there a way to implement it properly in real time?
 
Movie films are not fluid, even though they have motion blur. I can definitely see the frames in fast scenes, though my colleagues can't. I hope Hollywood will switch to the IMAX standard or something equivalent over time.
 
Yeah... one thing that playing 3D games and scrutinizing image quality has done for me is make me "notice" all the artifacts in motion film that I never used to see. :cry:
 
On 2002-03-05 19:09, king_iron_fist wrote:

I want to see genuine motion blur effects on graphics cards. I don't mean stupid trails like you see in a Kyro II tech demo, but genuine motion blur up to the standard found in 3D apps. Admittedly, when I enable it in Vue d'Esprit the render gets 4 times slower, but is there a way to implement it properly in real time?

The penstarsys article helps to explain why motion blur is not used more frequently in real-time graphics.
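
For the curious, the brute-force way to get "genuine" motion blur is temporal supersampling: render the scene at several instants inside the frame's shutter interval and average the results (the old OpenGL accumulation buffer was often used this way). That's also why Vue d'Esprit slows down by roughly the number of sub-samples. Here's a minimal toy sketch of the idea in Python; the "scene", resolution, and sample counts are made up for illustration:

<pre>
import numpy as np

WIDTH = 32          # a 1-pixel-tall "image", just to keep the output readable
SUBSAMPLES = 8      # point-in-time renders averaged per displayed frame

def render(t):
    """'Render' a 4-pixel-wide white box whose left edge sits at x = 20*t."""
    img = np.zeros(WIDTH)
    left = int(20 * t)
    img[left:left + 4] = 1.0
    return img

def blurred_frame(t0, shutter):
    """Average SUBSAMPLES renders spread evenly across the shutter interval."""
    times = t0 + shutter * np.arange(SUBSAMPLES) / SUBSAMPLES
    return sum(render(t) for t in times) / SUBSAMPLES

sharp = render(0.0)                # instantaneous sample: hard-edged box
blurred = blurred_frame(0.0, 0.5)  # 50% shutter, roughly like film
print("sharp:  ", np.round(sharp, 2))
print("blurred:", np.round(blurred, 2))
</pre>

With N sub-samples the frame costs N times as much to render, which is why real-time engines look for cheaper approximations instead.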
 