NFS: Hot Pursuit 2

BenSkywalker:
For SMW it is easy to explain why it couldn't run at 60FPS: the SNES and its carts did not have the technical capability of doing so. Unlike a 3D game, every frame had to have separate images already drawn and loaded onto the cart. All of these images had to be pulled from the cart and then displayed, and carts did not have the memory to store the kind of data that a game like SMW would require. The other issue for the SNES is the bandwidth and throughput of the video output system. I suppose it would be possible in a theoretical sense to draw something at 60FPS on the SNES, but not a game with multiple objects moving around. 30FPS is smooth on a console as long as the framerate doesn't fluctuate.

Hmm, wow... First of all, you basically don't draw anything on the SNES in standard 2D games like SMW. All you need to do to scroll the entire BG is write to two 16-bit registers, and the PPU takes care of what to show. No blitting is done to any framebuffer - the chip decides what color to transmit for every pixel during the scanning, based on the various register settings and the contents of the character RAM (tile data). The same goes for all sprite movement, transparencies, etc.

Basically there's no data transferred over the cartridge bus during actual gameplay, only code. The graphics data needed is usually DMA'd to the internal PPU VRAM before each level, with some additional tile animation and sprites DMA'd during VBlank when necessary. It is also during VBlank that you update background scrolling and sprite positions.
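
The per-frame flow described above can be sketched in Python for illustration. The register names match real SNES scroll registers (BG1HOFS at $210D, BG1VOFS at $210E), but this "PPU" is a toy stand-in with assumed behaviour, not an emulator:

```python
class ToyPPU:
    """Toy stand-in for the SNES PPU scroll registers (not an emulator)."""
    def __init__(self):
        self.bg1_hofs = 0   # BG1 horizontal offset, real register $210D
        self.bg1_vofs = 0   # BG1 vertical offset, real register $210E

def game_logic(state):
    # Runs during active display: only touches work RAM, never the
    # PPU. Here it just scrolls the background right 1 px per frame.
    state['scroll_x'] += 1

def vblank_handler(state, ppu):
    # Runs in the vertical blanking interval - the only time register
    # writes and DMA to VRAM are safe on the real hardware.
    ppu.bg1_hofs = state['scroll_x'] & 0x3FF  # scroll value is 10 bits

ppu = ToyPPU()
state = {'scroll_x': 0}
for frame in range(60):          # one second of 60 Hz gameplay
    game_logic(state)
    vblank_handler(state, ppu)   # fires once per frame at ~60 Hz

print(ppu.bg1_hofs)  # 60 -> the BG has moved 60 px in one second
```

The point of the split is exactly what the post says: the cartridge bus carries code, while all the visible motion is just two register writes per VBlank.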

Believe me, the whole SNES architecture (and practically every other machine up until the 3D era) is built to run in sync with the television raster scanning, where a vertical blanking occurs at a frequency of 60 or 50Hz depending on the region. Actually there are registers that read the current X and Y position of the cathode ray, and the current field if running in interlace mode (I believe only one SNES game ran interlaced though, the non-interlaced ("progressive" if you wish) 256*224@60Hz mode was used almost exclusively). You can even set an interrupt to go off at an exact position of the ray. Put some palette modifying code on the interrupt and you can see the color changing at that position on the screen. The basic principle of having everything run in perfect sync with the TV is the same on most consoles and home computers up to the early 90s; the SNES just happens to be a good example (and a system I've worked with). Another great example is the "copper" co-processor of the Amiga, which had the sole purpose of keeping track of the television scanning and executing related code.
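
The raster-interrupt palette trick mentioned above can be modeled very roughly. This is an assumption-laden toy (no real timing, simplified BGR555 values), just to show why the color changes at a fixed vertical position: the IRQ handler runs the moment the beam reaches the chosen scanline.

```python
IRQ_LINE = 100               # hypothetical scanline chosen for the IRQ
palette0 = 0x0000            # backdrop color, BGR555 black

def irq_handler():
    # Swap the backdrop color the instant the beam passes IRQ_LINE.
    global palette0
    palette0 = 0x7C00        # BGR555 blue

scanline_colors = []
for line in range(224):      # 224 visible NTSC scanlines per frame
    if line == IRQ_LINE:
        irq_handler()        # interrupt fires at this beam position
    scanline_colors.append(palette0)

# Everything above line 100 was drawn black, everything below is blue:
print(scanline_colors[50], scanline_colors[150])  # 0 31744
```

Because the display is generated line by line in real time, a mid-frame register change is visible as a horizontal split on the screen - the same principle the Amiga copper automates.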

And btw, 30fps is not smooth, *especially* not with 2D graphics. A basic scrolling 2D background is perhaps the best example of how big a difference there is between 30 and 60 fps movement.
 
A little info on the bit about movie frames being shown twice - this is done to avoid lighting flicker in a darkened room at the typical rate of 24 Hz. Each frame is shuttered twice, so the flicker rate becomes 48 Hz, which is much harder to detect (if at all). The effective image frame rate is still 24 fps. I, myself, have never noticed any flicker at a theater showing (implying that the 48 Hz shuttering is adequate for me), but the "discrete-ness" of motion for moderate to fast moving objects is certainly noticeable to me at the typical 24 fps presentation. Hell, I can see it on a regular TV showing a film source that was natively 24 fps but upconverted to 30 fps using 2:3 pulldown - which applies to pretty much all movies and a great deal of 1-hour series shows. Anything but a slow camera pan and I can observe the stuttered motion.
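
The 2:3 pulldown mentioned here is simple enough to sketch: 24 film frames per second are stretched to 60 NTSC fields per second by holding alternate frames for 2 and then 3 fields. A minimal illustration:

```python
def pulldown_2_3(frames):
    """Expand a list of 24fps film frames into a 60-field NTSC sequence."""
    fields = []
    for i, frame in enumerate(frames):
        repeat = 2 if i % 2 == 0 else 3   # 2, 3, 2, 3, ... per frame
        fields.extend([frame] * repeat)
    return fields

film = list(range(24))        # one second of film: frames 0..23
fields = pulldown_2_3(film)
print(len(fields))            # 60 -> one second of NTSC fields
print(fields[:5])             # [0, 0, 1, 1, 1]
```

Those unequal hold times (2 fields, then 3) are exactly why pulldown judder is visible on pans: successive frames stay on screen for different lengths of time.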

It's ironic that with the advent of HDTV, we are promised greater image quality in the form of higher resolution and progressive imaging, but the mainstay of home theater presentation (movies) will still be hampered by its relatively apathetic 24 fps (and to some extent, whatever lossy digital compression gets applied on the way to the consumer, whether it be cable broadcast or whatever new HD-DVD standard they figure out). Hollywood needs to facelift its technology, but that will be no small undertaking if it ever even happens.
 
Sorry Ben, but your SNES and SSBM claims (SSBM being the only 60FPS game this gen) are extremely questionable, to say the least.

I wish archie, faf, cyba, etc would come here and clear this up. As it is, I have no reason whatsoever to believe you and not my own eyes, which can CLEARLY and EASILY see the difference between 60 and 30FPS on an interlaced TV.
 
I don't have a problem seeing the difference between 60FPS and 100FPS on a PC (not refresh rate, which anyone should be able to see, but FPS); telling the difference on a console comes down to input latency.
I disagree, side by side, the motion difference is rather obvious.

Also, on interlaced TVs the trailing/combing artifact from overlapping fields of different frames is much more visible when updating at 30fps as opposed to 60 (to the point of 30fps having a double image effect, while 60 looks like a solid picture in motion).

SSBM is the only game I have from the current generation that runs at 60FPS (out of 22 for the Cube and Xbox).
Maybe it's time to get you a PS2 then 8)
 
I'll pick a couple more good, well-known games that run in 60fps from the 32-bit generation.

PlayStation: Bloody Roar 1-2, Castlevania: SotN, Ehrgeiz, Einhänder, Klonoa, Kula World, Marvel vs Capcom, Ridge Racer Hi-Spec, Tekken 1-3.

Saturn: Dead or Alive, DecAthlete, Final Fight Revenge, Radiant Silvergun, Soukyu Gurentai, Virtua Fighter 2, and pretty much every 2D game released (a rather hefty amount).

N64: F-Zero X, Super Smash Brothers

The only argument against it would be saying that the few of these games that run in interlaced mode are only outputting 30 full frames. While true in a sense, every one of those full frames is essentially "field rendered", to use a broadcasting term, and does indeed represent two discrete frames of animation that are shown sequentially on the screen. 60fps. Can't believe there are people who question it, really, since it's such a well-established fact in game development.
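
The distinction can be sketched in a few lines. In "frame rendering" both fields of an interlaced frame come from the same animation step; in "field rendering" every 1/60s field is a fresh step, which is what makes it 60fps motion:

```python
def shown_animation_steps(mode, fields=60):
    """Return which animation step each of 60 NTSC fields displays."""
    anim = 0
    shown = []
    for field in range(fields):      # 60 fields = one NTSC second
        shown.append(anim)
        if mode == 'field':          # field rendering: a new animation
            anim += 1                # step for every single field
        elif field % 2 == 1:         # frame rendering: both fields of a
            anim += 1                # frame share one step (30 steps/s)
    return shown

print(len(set(shown_animation_steps('field'))))  # 60 distinct steps/s
print(len(set(shown_animation_steps('frame'))))  # 30 distinct steps/s
```

Same 60 fields leave the console either way; the only difference is how many distinct moments of animation they carry.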
 
VNZ-

I think our disagreement on this one comes down to one particular issue-

I believe only one SNES game ran interlaced though, the non-interlaced ("progressive" if you wish) 256*224@60Hz mode was used almost exclusively

Which isn't even a half frame using standard resolution and below the NTSC 'normal' resolution limits. Add to that the amount of animation per character that the SNES's carts allowed and you honestly consider refreshing the image @60Hz to be a true 60FPS?

For your game list, do you have any from this generation? I own one of the games out of those you listed (FZX).

Faf-

I disagree, side by side, the motion difference is rather obvious.

And if you could run half the screen at 60FPS and half at 30FPS it would be even more obvious. When simply playing a title, the difference isn't nearly as clear.

Also, on interlaced TV's the trailing/combing artifact from overlapping fields of different frames is much more visible when updating at 30fps as opposed to 60.

Depends greatly on the speed of the gameplay. If you take an FPS and spin around then it is fairly obvious (in comparison). If you are playing a fast-paced racer then it can also show up quite easily. If you are playing a fighter or adventure game then the difference doesn't show up nearly as easily (even side by side). The flicker filter also makes a quite noticeable difference in terms of artifacts on an interlaced display.
 
and you honestly consider refreshing the image @60Hz to be a true 60FPS?

I do. Refresh rate has nothing to do with resolution. It makes as much sense as saying that one classic animation has fewer frames per second because its frames are drawn on smaller sheets of paper than the other one's. What about Quake 3 running at 320x200 at 100FPS on a monitor? Is that not considered 100FPS because the resolution is so low?


If anything, 60FPS would appear more choppy on an interlaced display than 30FPS (due to fade rate and alternating line updates).

So what happened to this ;) ? It's flat out wrong Ben, just as your comment that only a 'few' games this gen run at 60FPS... The flicker filter doesn't make any 'issues'. It sorts out flickering perfectly, as I've witnessed in too many games, and those games look much smoother than any 30FPS offering.
 
Marconell!-

I do. Refresh rate has nothing to do with resolution. It makes as much sense as saying that one classic animation has fewer frames per second because its frames are drawn on smaller sheets of paper than the other one's. What about Quake 3 running at 320x200 at 100FPS on a monitor? Is that not considered 100FPS because the resolution is so low?

If you were running @100Hz and drawing an actual new frame every other frame then it would be 50FPS.

So what happened to this ? It's flat out wrong Ben,

More frequent, less pronounced chop, or less frequent, more pronounced. Running @60FPS, every single frame is half disjointed from the prior one on any interlaced display.

The flicker filter doesn't make any 'issues'. It sorts out flickering perfectly, as I've witnessed in too many games, and those games look much smoother than any 30FPS offering.

Huh? If we use the GC's FF as an example, they employ a three-line blur filter when outputting. They render each frame at twice the resolution that will actually be displayed and blur it with the adjacent lines that will not be displayed on that frame update. The flicker filter certainly makes discerning between 30FPS and 60FPS more difficult - you are dealing with a blur filter. Whether it is enabled has nothing to do with the game running at 60FPS, 30FPS, 20FPS, 15FPS or 10FPS.
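
A three-line flicker filter of the kind described can be sketched as a vertical blend over the double-resolution framebuffer. The 1/4-1/2-1/4 weights here are an assumption for illustration; the real hardware coefficients may differ:

```python
def flicker_filter(lines):
    """Blend each line with its vertical neighbours (1/4, 1/2, 1/4)."""
    out = []
    for i in range(len(lines)):
        above = lines[max(i - 1, 0)]            # clamp at top edge
        below = lines[min(i + 1, len(lines) - 1)]  # clamp at bottom edge
        out.append((above + 2 * lines[i] + below) // 4)
    return out

# A 1-pixel-high bright line on black gets spread across three lines -
# which is precisely what suppresses single-line interlace flicker.
fb = [0, 0, 255, 0, 0]
print(flicker_filter(fb))  # [0, 63, 127, 63, 0]
```

This also shows why the filter softens the image regardless of framerate: the blend happens per output frame, no matter how often the frame contents change.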
 
This discussion is rather bizarre, but hey... For many people the difference between 60 and 30 frames/sec on a television or arcade monitor is so clear it's non-discussable. For others it's not. And now that interlaced output has become the norm the question has turned somewhat more complicated, since a TV by definition can only show 30 full interlaced frames per second. It is common knowledge, though, that using a technique called "field rendering" in broadcasting, you can depict 60 discrete frames of animation on the 30fps interlaced screen.

BenSkywalker, you need to realize that the television system - prior to these 'progressive scan' days - always supported non-interlaced resolutions, and that up until the Dreamcast almost every console game ran non-interlaced. The number of vertical scan lines is halved, and if you watch the screen closely you can see clearly that the scanlines are perfectly steady, as opposed to alternating slightly in vertical position between each pass. Yes, the resolution gets rather low, but that has absolutely nothing to do with the framerate, which will be 60fps - if the screen contents are updated at every vertical blanking interval.

Heck, even the old Pong machines ran in perfect 60fps. Actually they had to, since they had no VRAM/buffers or any other means to hold the on-screen contents - it was pretty much a question of turning the cathode ray on or off at the right moment in those days.


Some current-gen games running at 60fps:
Dead or Alive 3 (Xbox), Gran Turismo 3 (PS2), Rez (PS2), Crazy Taxi (DC), Ikaruga (DC), Burnout 2 (PS2), Rallisport Challenge (Xbox), Super Monkey Ball (GC).

Some running at 30fps:
Halo (Xbox), Rez (DC), Super Mario Sunshine (GC), Final Fantasy X (PS2).

Although they all show only one full 640*480 frame (usually a bit more, to fill the entire screen) every 1/30s, the difference is that games running at 60fps update the front buffer during the vertical blank between the two fields of every full frame. The result? Arcade-smooth movement! Can't notice a clear difference? Well, cheers, you will enjoy arcade-smooth movement even from games that don't actually deliver it! :)
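
The "arcade-smooth" effect comes down to step size. A quick illustration with assumed numbers: an object crossing the screen at 600 px/s advances in 10 px steps when the buffer is updated every field, but in 20 px jumps when it is only updated every full frame:

```python
def positions(speed_px_s, updates_per_s, duration_s=1):
    """On-screen positions of an object over one second of updates."""
    step = speed_px_s / updates_per_s
    return [round(step * i) for i in range(int(updates_per_s * duration_s))]

steps_60 = positions(600, 60)   # front buffer updated every field
steps_30 = positions(600, 30)   # front buffer updated every frame

print(steps_60[1] - steps_60[0])   # 10 px per screen update
print(steps_30[1] - steps_30[0])   # 20 px -> visibly coarser motion
```

Both versions cover the same distance per second; the 30fps one just does it in jumps twice as large, half as often, which is what the eye picks up on a scrolling background.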
 
If you were running @100Hz and drawing an actual new frame every other frame then it would be 50FPS.

You don't say!!!? :rolleyes:

Of course I know THAT, but you tried to somehow downplay those old machines' FPS capability stating that they don't output the maximum TV resolution.

As VNZ said, this discussion is becoming rather bizarre, IMO.
 
There are quite a few Dreamcast games that run at 60 frames per second. Most notably, of those that I own, Virtual On: Oratorio Tangram and Maken X both run at a smooth and constant 60 fps. It's very evident from the control, especially with VOOT. I believe that Fighting Vipers 2, Daytona 2001, Sonic Adventure 2, F355, and Outrigger run at 60 fps as well.

Someone can feel free to correct me if I'm wrong?
 
VNZ-

BenSkywalker, you need to realize that the television system - prior to these 'progressive scan' days - always supported non-interlaced resolutions, and that up until the Dreamcast almost every console game ran non-interlaced.

Which TVs supported non interlaced output?

Yes, the resolution gets rather low, but that has absolutely nothing to do with the framerate which will be 60fps - if the screen contents are updated at every vertical blanking interval.

The only way it is 60FPS is if there is different information being sent to the screen for each frame; other than that it is simply refreshing at 60Hz, which all the consoles do for every title (obviously). Do you consider SMB, the original NES title, to run at 60FPS?
 
Which TVs supported non interlaced output?
Every single one, to my knowledge. I never saw one that didn't.

The only way it is 60FPS is if there is different information being sent to the screen for each frame; other than that it is simply refreshing at 60Hz, which all the consoles do for every title (obviously). Do you consider SMB, the original NES title, to run at 60FPS?
Yes, I do. Why? It sends different information to the screen each frame, ie. 60 times per second.
 
Each field is coming from a distinct frame, not sharing a frame. You do see that, don't you Ben? True, it is not a "genuine" 60 frames/sec, but that is about as close as you will get with an interlaced TV setup. Motion-wise it is essentially 60 Hz motion. Obviously it isn't technically full resolution, but what is when talking about interlaced systems?
 
Ah, that's right. :) I checked out NFS:HP2 during the weekend and it's running at 30fps with frequent slowdown. Burnout 2 is by far the best arcade / street racing game on PS2 yet.
 
BenSkywalker said:
Phat-

Besides, each time you're watching Friends or the news on TV, the animation rate is 60Hz, and the display is interlaced, and no one's complaining. That's the norm.

All NTSC video feeds are 30FPS. The same frame is redrawn twice for a vid feed.

No. All NTSC video feeds are 60 fields per second, and each field can be (and is in many cases) from a different frame of animation. That's why

1) We can tell the difference between 60fps and 30fps games on NTSC.
2) We get nasty inter-field motion artifacts (feathering of moving edges) when we try to combine fields from live action video or 60fps games.
3) VCRs, even the digital ones with frame buffers, halve the vertical resolution when you pause them. There's just no sure-fire way to combine two fields of arbitrary video and guarantee no inter-field motion artifacts.
4) Some DVD players offer different pause modes (frame, field, auto). Frame mode is only useful for movies where the source material is only 24fps and as a result each frame is spread across 2 or 3 fields and can be recovered perfectly. Field mode is for programming shot on video, and when you pause, you only get a half-resolution image.

Also, if it were true that the same frame is redrawn twice, then imagine what the implications would have been for the poor analog engineers that created NTSC. They'd have to store a frame's worth of information in analog so that it can be transmitted twice, once with just the odd lines and again with the even. It's impractical to engineer temporary storage in analog that can hold a signal with a frame's worth of content.

Phat.
 
Sorry for bringing up this topic again, but I've just experienced something very interesting while watching a tv broadcast.

Let me first describe the equipment here and how it's set up. I have a DISHNETWORKS receiver hooked up to a regular non-HDTV rear-projection TV via S-Video.

Ok, as I was switching through the channels, I noticed a broadcast on HBO that sparked my interest. The broadcast seemed VERY smooth in terms of framerate. At the time I didn't know why the framerate was higher/smoother than shows on other channels, so after the movie I waited for the credits to scroll through to see if it was shot with special cameras. To my surprise, the cameras used were HD cameras.

So my question is, why was that broadcast smoother than the others when my projection TV doesn't have progressive or HD support? Aren't NTSC broadcasts on regular TVs 30 fps? How did I notice the higher framerate if that's the case?

I'm guessing that the answer is related to 30 fps and 60 fps games on a regular tv set? :-?
 