While playing Star Fox 64 on my Wii U for the past couple of days, I noticed the game seemed more difficult than I remembered it being on the N64. After doing some research, I learned that the emulation on the Wii U keeps the game running at the target framerate almost all the time, while the original N64 had a lot of slowdown. This makes the Virtual Console version play a lot faster, and it has taken some time to acclimate.
Up until the PS2/GC/Xbox generation, slowdown and framerate drops were essentially the same thing: if the framerate dropped, the game speed slowed down with it. This was certainly true of the 8-bit and 16-bit consoles. What caused, or enabled, developers to separate the game simulation from the rendering framerate?
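To illustrate the distinction I mean, here is a minimal sketch of the two update styles (the function names and the numbers are my own, purely for illustration):

```python
def frame_locked_update(state):
    # Old style: one fixed simulation step per rendered frame.
    # If rendering slows down, fewer steps run per second,
    # so the whole game goes into slow motion.
    state["x"] += 1.0  # move 1 unit per frame, whatever the framerate
    return state

def delta_time_update(state, dt):
    # Modern style: scale each step by the real time elapsed (dt),
    # so simulation speed is independent of the framerate.
    state["x"] += 60.0 * dt  # move 60 units per second of real time
    return state
```

With the delta-time version, running one second at 30 fps or at 60 fps moves the player the same distance; with the frame-locked version, halving the framerate halves the game speed.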
I sort of prefer the simulation being tied to the framerate. A chugging framerate is easier to manage when the game speed drops into slow motion along with it. The judder we experience these days, when games fail to maintain their target framerate, is more distracting in my opinion.
So I suppose I am really just curious as to how this came to be, and would be interested in knowing whether this was more of a hardware- or software-related evolution.