So, now that the method for measuring framebuffer resolution has more or less been proven to work, we as an anal-retentive, magnifier-glass-wielding technofreak community need a way to measure the _real_ framerate of release titles. Do you have any ideas? I can't think of anything except high-speed cameras... I figure that to reliably measure framerates in the 30-60 range you'd need at least 150-200 fps. Is there anything in the consumer space that can do this sort of thing?
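Whatever the footage source ends up being (high-speed camera pointed at the screen, or a straight 60 Hz capture of the video output), the analysis side would presumably boil down to counting how many consecutive frames actually change. Here's a very rough sketch of just that counting step, purely my own back-of-the-envelope idea: it assumes OpenCV is available, a hypothetical lossless `capture.avi`, and a diff threshold that would need tuning against compression/sensor noise.

```python
# Rough sketch: estimate a game's effective framerate from a fixed-rate capture
# by counting how many captured frames actually differ from the previous one.
# Assumes a hypothetical "capture.avi" recorded at the capture device's rate.
import cv2
import numpy as np

def estimate_framerate(path, diff_threshold=2.0):
    cap = cv2.VideoCapture(path)
    capture_fps = cap.get(cv2.CAP_PROP_FPS)  # e.g. ~60 for a 60 Hz capture

    prev = None
    unique_frames = 0
    total_frames = 0

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            # Mean absolute difference between consecutive frames: a newly
            # rendered frame should differ noticeably, a repeated frame should
            # be near zero (modulo compression noise).
            mad = np.mean(cv2.absdiff(gray, prev))
            if mad > diff_threshold:
                unique_frames += 1
        prev = gray
        total_frames += 1

    cap.release()
    duration = total_frames / capture_fps
    return unique_frames / duration if duration > 0 else 0.0

if __name__ == "__main__":
    print("Estimated framerate: %.1f fps" % estimate_framerate("capture.avi"))
```

Obviously a capture card only gets you up to its own refresh rate, which is why filming at 150-200 fps would still be needed to resolve anything unlocked above 60.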
What got me wondering was a little birdie who told me that an upcoming high-profile multiplatform title will be running at a significantly better framerate on one of the platforms, although none of the versions will be locked at either 30 or 60 fps, and it won't be advertised because of, ahem, a gentle warning given by the vendor of the framerate-challenged platform.
Since the gaming press has already written tons of bullshit about previous versions of the same game, I don't think we can trust them to find and expose the difference: if they receive marketing materials saying "the visuals are exactly the same on both platforms", they'll parrot it and throw in a few "fluid"s and "smooth"s of their own.