Why does a higher frame rate on a camera mean a slow-mo effect, but on monitors and other things like that it means a smoother experience instead, with the speed staying the same?

In: Technology

4 Answers

Anonymous 0 Comments

Cameras are *receiving* information at a higher framerate. Each frame captures a thinner slice of time, so if the footage is played back at a normal frame rate, it gets stretched out and appears in slow-mo.

However, a monitor is *outputting* information. The computer decides what actually goes into each frame, so it can give you the same thing slow-mo does (less time progression between frames) and a higher fps at the same time for a smoother picture, while keeping the game time running at normal speed.
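
A rough way to see the difference in the numbers, as a minimal sketch (the function names and values below are illustrative assumptions, not anything from a real camera or game engine):

```python
# Camera side: a fixed set of recorded frames gets stretched over playback time.
def slow_mo_factor(capture_fps: float, playback_fps: float) -> float:
    """How many times slower the footage appears when replayed."""
    return capture_fps / playback_fps

# Monitor/game side: a higher output fps just slices the same real second thinner.
def time_per_frame(output_fps: float) -> float:
    """Seconds of game time advanced between displayed frames."""
    return 1.0 / output_fps

print(slow_mo_factor(240, 60))   # 4.0 -> 240 fps footage played at 60 fps looks 4x slower
print(time_per_frame(60))        # ~0.0167 s between frames
print(time_per_frame(120))       # ~0.0083 s between frames, but the game speed is unchanged
```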

That said, I have seen retro games run at something like 200fps, and they really do run in super speed. The guy who insisted on playing that way was a little bit bizarre, hehe.

Anonymous 0 Comments

It’s because they’re not telling you everything. You need to know the relationship between capture speed and playback speed to know whether something is sped up or slowed down. They’re only telling you the capture speed for the camera, not the playback speed of whatever you’re playing it back on, and they’re only telling you the refresh rate for a monitor, which is related to neither. It’s apples and oranges.

Footage captured at 200 frames in a second will look normal speed on something displaying 200 frames in a second. Same for capturing and displaying 5 frames in a second.

But if you capture 200 frames a second and show 50 frames a second… in playback second one you see frames 1-50, in second two frames 51-100, etc. That one second of capture takes four seconds to play back, so it looks like it’s going slower.
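
Here is that arithmetic as a small sketch, using the numbers from the paragraph above (the variable names are just for illustration):

```python
capture_fps = 200   # frames recorded per real second
display_fps = 50    # frames shown per second of playback

# Which captured frames you see during each second of playback:
for playback_second in range(1, 5):
    first = (playback_second - 1) * display_fps + 1
    last = playback_second * display_fps
    print(f"Playback second {playback_second}: captured frames {first}-{last}")

# One second of capture takes capture_fps / display_fps seconds to play back.
print(capture_fps / display_fps)  # 4.0
```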

However, a monitor isn’t showing you a list of frames in the order received. A monitor’s refresh rate is how often it looks at the source and updates the screen to match. The speed of playback is already set by the recording frames per second and the playback frames per second, and is just churning away in the source. The monitor glances at that, then draws you a picture, then glances, then draws. The faster the monitor can do the glance-and-draw, the more what you see will look like watching the source directly at whatever speed it is actually playing back at, nice and smooth. But it won’t change how fast the playback is.
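
Here is a toy simulation of that "glance and draw" idea, assuming a simple model where the source advances frames at its own playback rate and the monitor just samples whichever frame is current (all names and numbers are made up for illustration):

```python
playback_fps = 30    # the source advances one frame every 1/30 s, no matter what
refresh_hz = 120     # the monitor glances at the source 120 times per second

duration = 0.1       # simulate a tenth of a second
ticks = int(duration * refresh_hz)

for tick in range(ticks):
    t = tick / refresh_hz                 # when the monitor glances
    source_frame = int(t * playback_fps)  # whatever frame the source is on right now
    print(f"t={t:.4f}s  monitor draws source frame {source_frame}")

# The monitor draws more often than the source changes, so some frames repeat,
# but the source still advances at 30 frames per second either way:
# the refresh rate changes how often the screen is redrawn, not the playback speed.
```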

Anonymous 0 Comments

If a camera records a scene at 240 frames per second, then 240 images have been captured for each second of real time. If I play those captured images back at a rate of 60 frames per second, then it will take four seconds (240 frames divided by 60 frames/sec) to play back that one second’s worth of footage, which gives us a slow-mo effect.

A monitor’s refresh rate, however, does not create a slow-mo effect, because the content you are viewing on a high-refresh-rate monitor is still being played back in real time. So if I watch a 60fps video of someone’s gameplay on YouTube on my 120Hz (120 screen refreshes in one second) monitor, then each frame from the video will be output on my monitor twice.
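
The same back-of-the-envelope math, with the numbers from the two paragraphs above (purely a sketch, nothing device-specific):

```python
# Slow motion: captured frames divided by the playback rate gives the playback duration.
captured_frames = 240          # one second of real time recorded at 240 fps
playback_fps = 60
print(captured_frames / playback_fps)   # 4.0 -> four seconds of slow-mo playback

# Real-time playback on a fast monitor: each video frame is simply repeated.
video_fps = 60
monitor_hz = 120
print(monitor_hz / video_fps)           # 2.0 -> each video frame is drawn twice
```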

High-refresh-rate monitors do not slow down time (that would be impossible); instead, they display each frame from slower footage multiple times.

_Rendered_ content, however, will run at whatever framerate the outputting device can handle. So if you have a PC that can render a game at 120 frames per second, then each frame will be displayed by your monitor exactly once, resulting in a smoother experience. The more frames that can be displayed in a given span of time, the less choppiness you will experience, as there is less of a delay between frames.
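
One quick way to see why more frames feels smoother is to look at the gap between frames; this is just an illustrative sketch:

```python
def frame_interval_ms(fps: float) -> float:
    """Milliseconds you wait between one displayed frame and the next."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps} fps -> {frame_interval_ms(fps):.1f} ms between frames")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms: smaller gaps, less choppiness.
```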

Anonymous 0 Comments

The thing tying all of this together is frames per second (FPS).

Movies, for instance, are typically played at 24 frames per second, meaning the screen switches to a different image once every 24th of a second. If the film was also recorded at 24fps (as it usually is), then the image will move at the “right” speed. However, if you want to play something that was recorded at 48fps at that same constant playback speed of 24 frames per second, you can do one of two things: **a.** skip every other frame, so that each frame you *do* play lands where it should relative to time, or **b.** show every frame. In the second case, the player can only change pictures 24 times a second (or it’s at least limiting itself to that rate), so it takes twice as long as it “should” to get through every frame, and that slow-down is what we call slow motion.
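
Here is a small sketch of those two options, using the 48fps footage and 24fps player from the example above (the variable names are made up and purely illustrative):

```python
recorded_fps = 48
player_fps = 24
frames = list(range(1, recorded_fps + 1))   # one second of footage recorded at 48 fps

# Option a: skip every other frame; playback takes one second, so motion stays at normal speed.
option_a = frames[::2]
print(len(option_a) / player_fps)           # 1.0 second of playback

# Option b: show every frame at 24 fps; playback takes twice as long, i.e. slow motion.
option_b = frames
print(len(option_b) / player_fps)           # 2.0 seconds of playback
```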

High refresh-rate monitors, on the other hand, are *not* locked at 24 frames per second (or 24 Hertz, when speaking about the hardware), and their high frame rate is more likely to come into play when rendering video games than when watching video. Video, as stated above, is recorded and saved at a specific frame rate (typically 30 or 60 fps), while the frames of a video game are rendered as rapidly as the hardware can draw them, with the game’s passage of time being calculated independently.

An interesting note about this is that some very old games (and maybe some very badly made recent ones) would impose their own slow motion because their hardware couldn’t go fast enough. Space Invaders, for instance, speeds up as you defeat more enemies because it no longer has to calculate the position or sprite for each defeated enemy, ergo the time taken to calculate each frame of the game gets shorter.
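
A rough sketch of the difference between tying game speed to the frame rate (as those old games did) and calculating it independently from elapsed time; everything here is an illustrative assumption, not real code from Space Invaders or any engine:

```python
def frame_locked_position(frames_rendered: int, step_per_frame: float) -> float:
    """Old style: the object moves a fixed amount every frame, so faster
    hardware (more frames in the same real time) means faster movement."""
    return frames_rendered * step_per_frame

def time_based_position(elapsed_seconds: float, speed_per_second: float) -> float:
    """Modern style: movement depends only on elapsed game time, so the
    frame rate changes smoothness but not speed."""
    return elapsed_seconds * speed_per_second

# One real second rendered at 30 fps vs 200 fps:
print(frame_locked_position(30, 1.0), frame_locked_position(200, 1.0))   # 30.0 200.0 -> super speed
print(time_based_position(1.0, 30.0), time_based_position(1.0, 30.0))    # 30.0 30.0  -> same speed
```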