what are frames per second


Standard Member
I read all the time about games being 30 and 60 fps. What is meant by this? Super Monkey Ball 2 supposedly has backgrounds at 60 fps. This fps rate can't refer to the actual speed of the game, can it? Because I remember in Street Fighter 2 Turbo (by far the best fighting game EVER) on the SNES you could select the speed of the game and even make it ridiculously fast, but I wasn't changing the fps rate, was I? And is the fps the same on 50Hz PAL games and 60Hz NTSC games? :clown:


Distinguished Member
FPS (when not talking about First Person Shooters) is the number of frames the machine draws per second.

The human eye will usually find 25 frames per second adequate for animation; however, to get extra smooth gaming, developers tend to go for 60.

The FPS does generally indicate the speed of the game as far as you perceive it. Of course, when your game gets choppy and things start to skip, that is where FPS drops. The game itself is still running on the computer, yet you will only see it being updated, say, 10 times per second.

The higher your FPS, the smoother and more accurate your controls will be; obviously, the greater the lag between what you see and what is actually happening, the worse your responses will get.
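A toy way to see the link between frame rate and input lag: if the screen only updates N times per second, a button press can sit unshown for up to 1/N of a second. This little sketch is purely illustrative; the frame rates are example values, not taken from any particular game:

```python
# Illustrative sketch: worst-case delay between pressing a button and
# seeing the result on screen, at a given frame rate.

def worst_case_lag_ms(fps: float) -> float:
    """A press landing just after a frame waits a whole frame to show."""
    return 1000.0 / fps

for fps in (10, 25, 30, 60):
    print(f"{fps:>3} fps -> up to {worst_case_lag_ms(fps):.1f} ms before you see your input")
```

At 10 fps that worst case is a very noticeable 100 ms; at 60 fps it is under 17 ms, which is why the higher rate feels more responsive.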

I can't really answer your question on SF2 Turbo, as theoretically they could have just increased the framerate, or they may simply have run the software faster at the same frame rate. It may have required both to be upped, or a framerate increase may simply have been a consequence.

In the end though, the higher your framerate the better and smoother your experience will be.

IMO there is no excuse for any slowdown in a console game; developers know what hardware they are targeting, and any slowdown is bad design. On the PC it's par for the course :)


Standard Member
OK, this is about what I thought. But I still don't get how this stuff relates to the Street Fighter speed settings, and is this all the same on PAL and NTSC?


The links AccursedDelphi posted should help if you've read them.

As for the Street Fighter II Turbo thing... My guess is that when upping the speed, the CPU is just working harder to increase the speed (and probably its frames per second). You can see this happening quite often with old emulators on a top-spec PC, where a frame limiter is needed to keep the game playable. Quite good if this is the case, since the SNES has a rather slow CPU... 3.58MHz or something o_O It also has two or three other pre-set slower speeds, but this isn't the time to go into depth...
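For what it's worth, the frame limiter an old emulator needs is just a sleep that pads each frame out to its target length. A minimal sketch, assuming a 60 fps target; the function and timings are illustrative, not taken from any real emulator:

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def run_frames(n: int) -> None:
    """Run n dummy frames, sleeping away whatever time the 'emulation' left over."""
    next_deadline = time.perf_counter() + FRAME_TIME
    for _ in range(n):
        # ... emulate one frame here (a fast PC finishes this far too quickly) ...
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)  # cap the speed so the game stays playable
        next_deadline += FRAME_TIME
```

Without the sleep, a modern CPU would churn through frames as fast as it can, which is exactly the "everything runs stupidly fast" effect you see in an unlimited emulator.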

Sinzer was totally correct about how the naked eye notices FPS. Movie reels shown in cinemas run at only 24fps, for example, and thus need a conversion when pressed onto DVDs (a 4% speedup is also needed on PAL DVDs to keep things in sync).

PAL is the standard accepted in Europe and runs at 50Hz: 50 fields, or 25 full frames, a second.

NTSC is mainly used in Japan and America and runs at 60Hz: 60 fields, or 30 full frames, a second.

Both are interlaced and consist of two fields flickering/interlacing to create a solid picture. So, for example, the first field would display the odd lines (1, 3, 5, 7, 9, 11, etc.) and the second field would display the even lines (2, 4, 6, 8, 10, etc.).

1 - 3 - 5 - 7 - 9 - 11   (odd field)
- 2 - 4 - 6 - 8 - 10 -   (even field)



Hmm... crap examples... So much for Courier being identically spaced out >_<

Anyway, these two fields combined create the illusion of one solid image. On PAL TVs the scan rate is 50Hz, so the screen is filled with alternating odd and even fields 50 times per second. For NTSC the scan rate is 60Hz, and thus the screen is filled 60 times per second.
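The weave of the two fields can be shown in a few lines of code: take an odd field and an even field and interleave them into one full frame. The line numbering is 1-based to match the diagram above, and the "fields" here are just lists of strings for illustration:

```python
def weave(odd_field, even_field):
    """Interleave two fields into one frame: odd lines 1,3,5..., even lines 2,4,6..."""
    frame = []
    for odd_line, even_line in zip(odd_field, even_field):
        frame.append(odd_line)   # line 1, 3, 5, ...
        frame.append(even_line)  # line 2, 4, 6, ...
    return frame

odd = ["line 1", "line 3", "line 5"]
even = ["line 2", "line 4", "line 6"]
print(weave(odd, even))
# ['line 1', 'line 2', 'line 3', 'line 4', 'line 5', 'line 6']
```

On a real CRT, of course, the two fields are never on screen simultaneously like this; the weaving happens in your eye (and in the phosphor), as discussed further down the thread.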

More differences between the two display standards are colour and resolution. PAL has an extra 100 lines (625 against NTSC's 525; please correct me on this since I always forget), and the mismatch between the standards is responsible for the roughly 17% slower gameplay and the borders. With a 60Hz option the screen is all used up and games are played the way they should be.

So, to answer Keyser's question of whether SF2T would run the same on PAL and on NTSC consoles: no. At the fastest setting on both, the NTSC version would be faster by some margin.
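That "some margin" is easy to put a number on. Assuming the game logic steps once per field (which was typical of the era's unoptimised PAL conversions), a PAL game runs at 50/60 of NTSC speed:

```python
# Assumes game logic advances once per display field, as most 8/16-bit
# era games did; an optimised PAL port could compensate for this.
pal_rate = 50   # fields per second, PAL
ntsc_rate = 60  # fields per second, NTSC

slowdown = 1 - pal_rate / ntsc_rate
print(f"PAL runs {slowdown:.1%} slower than NTSC")
# PAL runs 16.7% slower than NTSC
```

So the same "Turbo" star setting simply plays about a sixth slower on an unpatched PAL console.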

Also to add... the magic of progressive scan. This signal produces one full frame at a time rather than an interlaced one. So instead of seeing one half of the image, then the other half on the next field (interlaced), you see the full image at any given time, much like pictures on a movie reel. The benefits of progressive scan are obvious, and Europe really needs to get its arse into gear and support it, full stop.

Anyway... that's enough ranting for now. I can't remember all the specs and details all the time, so feel free to correct me ^_^


Distinguished Member
To expand on NeoBlade's explanation, there is this web page


It is pretty techy, but gives you a good idea of the reasoning behind FPS.

The higher the Hz on your TV, the more frames you can display.

From looking into it more, I would guess that in normal mode the game was limited to run at a certain CPU speed.

For example, at the 5-star rating it would compute one button press every second. At the 10-star rating it would be allowed to run twice as fast, letting you get in two button presses every second.

Your actual FPS would not be increased, as the TVs of the time would only support a maximum of 30 (NTSC) / 25 (PAL) frames per second anyway. What would be increased is the number of times your character moved, creating the appearance that FPS had been increased. If anything the framerate would decrease, as you are asking the processor to compute twice as many inputs and produce twice as many outputs.
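The guess above, logic running twice as fast while the display rate stays put, is how a fixed-timestep game loop behaves when you scale only the logic rate. A schematic sketch; the tick counts are made up for illustration, not measured from SF2 Turbo:

```python
def simulate(seconds, render_fps, logic_hz):
    """Count how many logic ticks and rendered frames occur in a time span."""
    logic_ticks = int(seconds * logic_hz)
    frames = int(seconds * render_fps)
    return logic_ticks, frames

# Normal speed: logic and display both at 60Hz.
print(simulate(1, 60, 60))   # (60, 60)
# "Turbo": logic doubled, display unchanged -> the game feels twice as fast.
print(simulate(1, 60, 120))  # (120, 60)
```

Each displayed frame in "Turbo" mode simply covers two logic steps instead of one, so everything on screen moves twice as far per frame.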

If anyone does know, it would be quite interesting (only really because I am at work today and it is more interesting than IT support! :p )


Active Member
Originally posted by Sinzer
To expand on neoblades explanation there is this web page

Your actual FPS would not be increased, as the TVs of the time would only support a maximum of 30 (NTSC) / 25 (PAL) frames per second anyway.

This is not quite correct, as interlacing fits two separate video fields into the period of each frame. Since the fields are drawn successively on screen, you can display 60 or 50 genuine motion updates per second. Each field relates to the preceding field in pretty much the same way, whether or not it is theoretically part of the same frame. The term 'frame' is only meaningful when talking about the signal structure.

Forgive me for making a speech here, but the link made me angry. 'Michael Cranford' has apparently not really understood the theory himself, but only learnt to repeat it, like a parrot. That is why he dismisses some perfectly valid statements as 'nonsense' instead of filling them in with a proper explanation. These tech parrots often have plenty of information to give, but need to learn that definitions do not always properly reflect practice.

There is a common misconception that an even field within a frame blends only with the odd field of the same frame, and not with the odd field of the succeeding frame. How on earth would that be possible? A field never even fully shows in its entirety: as it is drawn from top to bottom, it also fades out from top to bottom. By the time the bottom is drawn, the top has already started fading. Some of the image lingers until the next scan because of phosphor persistence.

The slight timing difference between odd and even fields translates from a temporal into a spatial offset of the wave in which the image is drawn.

You can often see this filling-in and fading-out if you take a photo of a TV screen: there will be a dark band somewhere in it, a bit blurry near the top and sharper towards the bottom, with more or less of the image still showing within the band.

In TV sets and CRT monitors, the image is more like a wave travelling across the screen than like a film projector, which continuously displays each frame while the shutter is open (the movement of the shutter does in a way wipe through the image, but in doing so it removes the current frame from view almost instantaneously).

Finally, the older video game systems (up to and including the PlayStation) ran non-interlaced, so they gave a low-resolution progressive image that had none of the artifacts associated with interlacing.


I haven't read the articles listed in this thread so far... I've only read the ones listed in the Progressive Scan forum in the TV & Plasma section, and have a general feel for the technology. I'm certainly no expert at it, but it's nice to meet peeps that are ^_^

You can usually see the ugly side of interlacing more on LCD screens than on CRTs, which often "blur" the fields together due to the phosphor persistence Zacabeb mentioned. The Tosh ZP18 suffered from too much phosphor persistence, and there are many threads about that too. I say LCD screens because it seems to be a lot more obvious on them. I was looking at a 32" LCD screen at Dixons XL showing some skateboarding. The image was interlaced like hell: you could see feathering/combing (I think that's the term o_O) on the skater and on objects as the camera panned to keep him in shot. While this looked poor, I was interested to see how it would perform with a progressive scan image, since I'm sure it would produce a picture to rival a plasma.

As for the PSX giving out a low-res progressive scan image... if that were the case, wouldn't 99% of CRTs be unable to display it? Or was the progressive image then converted into an interlaced one to run on a CRT? Much like how DVD players display their images, since DVDs store data as 480p or something...

That didn't make sense ^_^;


Distinguished Member
Thanks Zacabeb,

not being an expert on TV hardware, I had assumed this guy had it correct.

I should really have thought about the refresh you can see when a computer monitor is shown on TV: a line travels down the screen, which is the picture being refreshed. I completely forgot that the field is not swapped in as a whole but is drawn in line by line.


Active Member
Originally posted by NeoBlade
As for the PSX giving out a low res progressive scan image... If that was the case then wouldn't 99% of CRTs be unable to display it? Or was the progressive scan image then converted into an interlaced image to run on a CRT?

No, it is actually output 'as is'.

As far as I understand it, it is the relative timing of the fields that produces the interlacing. The even fields are delayed by about half a scanline (sync pulses and all), so the horizontal retrace happens later in the vertical cycle, causing the image to shift down by half a line's height. This is how I have understood it to work, but I might be wrong.

In the case of a non-interlaced signal, the even fields would simply not be delayed, making all fields behave as if they were odd.
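The half-scanline point above can be made concrete with the NTSC numbers: an interlaced field is 262.5 lines long, so successive fields start half a line apart and land offset vertically, while a console's "240p" signal uses a whole number of lines per field so every field lands in the same place. The line rate below is the standard NTSC figure, used here purely for illustration:

```python
# Standard NTSC horizontal scan rate: ~15734 lines per second.
line_rate = 15734.0

interlaced_field = 525 / 2    # 262.5 lines per field -> fields offset by half a line
progressive_field = 262       # whole lines per field -> every field lands the same

print(interlaced_field)               # 262.5
print(line_rate / interlaced_field)   # ~59.94 fields per second (interlaced)
print(line_rate / progressive_field)  # ~60.05 -> slightly faster non-interlaced refresh
```

The half-line remainder is the whole trick: drop it, and the "even" fields stop being delayed, which is exactly the non-interlaced case described above.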

TVs usually accept non-interlaced signals, but some (digitised sets, I guess) actually have problems with them, causing the image to jitter. VHS decks have problems recording and playing back non-interlaced signals too, as you might have discovered if you've ever tried to record a PSX gaming session and view it in slo-mo.

BTW, the information on that page is not all wrong, but the guy (Michael Cranford) missed the point he should have been making and mocked some valid statements...


Ah right... I was just wondering, since I've got a Toshiba Strata that is capable of displaying 480p, why it would be so hard to implement this on other TVs...

In Europe we're so far behind the rest of the world when it comes to display technologies that it's rather annoying >_< The only way to be future-proof, it seems, is to buy a plasma that can cope with all the display formats, or a decent RPTV... but RPTVs and gaming IMO don't mix well.
