Filter Guide: Video Stabilizer - Crowsinc/LiveVisionKit GitHub Wiki

An ultra-fast video stabilizer used to smooth out video footage in real time.

The video stabilizer analyzes the motion of the video and moves each frame to produce the illusion of smooth camera motion. The frame must be cropped so that it can move around without introducing any visible gaps in the video, so some content will be lost around the edges. Additionally, proper stabilization requires that the video content is suitable for camera motion estimation. This is detailed in the FAQ section.
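The crop-and-shift mechanism described above can be illustrated with a minimal sketch. This is not LiveVisionKit's actual implementation (which runs on the GPU via OpenCL); the function name, the symmetric-margin layout, and the clamping behavior are illustrative assumptions:

```python
import numpy as np

def stabilize_crop(frame: np.ndarray, offset_xy: tuple, crop_pct: float = 0.05) -> np.ndarray:
    """Shift the visible crop window opposite to the estimated camera motion,
    clamped so the window never leaves the frame (illustrative sketch only)."""
    h, w = frame.shape[:2]
    mx, my = int(w * crop_pct / 2), int(h * crop_pct / 2)  # margin per side
    dx = max(-mx, min(mx, offset_xy[0]))  # a shake larger than the margin
    dy = max(-my, min(my, offset_xy[1]))  # cannot be fully removed
    return frame[my + dy : h - my + dy, mx + dx : w - mx + dx]

frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
print(stabilize_crop(frame, (40, -10)).shape)  # (1026, 1824, 3) at a 5% crop
```

Note how the crop margin acts as the hard limit on corrective movement: once the offset is clamped, any remaining motion shows up as residual shake, which is why the crop percentage trades resolution against the size of stabilizable movements.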

Specifications

  • Filter Type: Video or effects filter
  • HDR Support: No
  • Performance Impact: High (>3ms)
  • Special Requirements:
    • Must have an OpenCL capable GPU.
    • Must be applied before any HUD elements (donations, subscriptions, etc.)
    • If used as an effects filter, you must manually delay any audio to match the stabilized footage.

Settings

  • Smoothing Radius: How many past and future frames to consider for stabilization. Unfortunately time travel doesn't exist yet, so we must settle for introducing a video delay proportional to the radius. Higher values generally lead to smoother motion that is less reactive to camera movements.

  • Stream Delay: Displays the video delay introduced by the stabilizer. The delay scales with the frame rate, so switching from 60FPS to 30FPS will double the delay. If the stabilizer is being used as an effects filter, you will need to manually delay the audio of the source by this amount. This is detailed in the FAQ section under Why is my audio out of sync?.

  • Subsystem: The underlying model used for camera motion estimation and smoothing. The choices are:

    • Homography: Uses a perspective transform with 8 degrees of freedom.

    • Vector Field (Experimental): Uses a 16x16 grid of motion vectors for accurate stabilization and handling of parallax effects. This system is unfinished and only provided for testing purposes - use Homography instead.

  • Quality Assurance: Previously known as Suppression Mode, this setting controls whether the stabilizer automatically skips frames whose content is considered unsuitable for stabilization. The choices are:

    • Strict: Only stabilizes above a predicted stabilization quality of 95%. This is the recommended option for IRL content, where the stream content is dynamic and sensitive to bad stabilization. It may also be suitable for first person games that focus on exploring a 3D world.

    • Relaxed: Only stabilizes above a predicted stabilization quality of 40%. This is the recommended option for most game/VR content that does not involve moving within a 3D world.

  • Independent X/Y Crop: If enabled, the Crop X and Crop Y sliders can be used to apply independent crop values to the width and height of the frame respectively. Otherwise, the Crop X value is used for both.

  • Crop X: The percentage crop to apply to the video (only applies to the width if Independent X/Y Crop is chosen). The crop region provides a hard limit on how much a frame can move for stabilization. Choosing a crop requires compromising between how much video resolution is lost and how large of a camera movement can be stabilized.

  • Crop Y: The percentage crop to apply to the height of the video if Independent X/Y Crop is chosen. Does nothing otherwise.

  • Auto-Apply Crop: Whether the output of the stabilizer should be automatically scaled into the chosen crop region. Scaling is performed using AMD FSR 1.0 technology.

  • Background Color: The color displayed around the frame as it moves around the crop region. This is only visible if Auto-Apply Crop is not selected.

  • Disable Stabilization: Deactivates stabilization while retaining the specified crop and video delay. Used for seamlessly enabling and disabling stabilization without distracting viewers with the sudden application of the crop or pausing the video to reach the delay. Also useful for comparing the video with/without stabilization.

  • Test Mode: Aids in testing by drawing the following information to the screen:

    • Frame Time: The amount of time required to stabilize a single frame on your computer. To avoid stream lag, this number should be well below 16ms for a 60FPS stream and 33ms for a 30FPS stream.

    • Crop Rectangle: Displays the specified crop region.

    • Tracking Markers: Shows the tracking markers used for motion estimation.

    • Warp Grid: Shows the vector grid used to warp the frame for stabilization. Only relevant for the Vector Field subsystem.
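The relationship between the Smoothing Radius, frame rate, and Stream Delay settings above can be reasoned about with a simple sketch. The exact delay is whatever the filter reports; the one-frame-time-per-buffered-frame formula below is an assumption based on the stated facts that the delay is proportional to the radius and doubles when the frame rate is halved:

```python
def approx_stream_delay_ms(smoothing_radius: int, fps: float) -> float:
    """Rough estimate of the video delay from buffering `smoothing_radius`
    future frames: one frame time (1000/fps ms) per buffered frame."""
    return smoothing_radius * 1000.0 / fps

# Halving the frame rate doubles the delay for the same radius:
print(approx_stream_delay_ms(10, 60))  # 166.66... ms at 60FPS
print(approx_stream_delay_ms(10, 30))  # 333.33... ms at 30FPS
```

This is also the number to keep in mind when manually offsetting audio for an effects filter: the audio sync offset should match the reported stream delay, and must be revisited if the frame rate or smoothing radius changes.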

Frequently Asked Questions

  • What video content is suitable for stabilization? The stabilizer estimates the motion of the camera by analyzing the motion of the scenery that dominates each frame. For the camera to be stabilized, the landscape (which moves roughly opposite to the camera) needs to be easily visible and must make up the large majority of the frame. Additionally, the camera movement should be continuous and the range of depth within the frame should be fairly uniform. For example, if 50% of the content is very close to the camera while the rest is very far away, the stabilizer won't know whether to stabilize the nearby objects, which appear to move faster relative to the camera, or the far-away objects. In terms of games, you will want to avoid any gameplay that has HUD elements, as these will be warped and moved around by the stabilizer. The Quality Assurance modes can help automatically disable stabilization when the video content is detected to be unsuitable. You may also use the Disable Stabilization setting to manually stop stabilization when the content is unsuitable, while retaining the specified crop and video delay.

  • How do I pick which crop to use? The choice of crop depends on both your video resolution and the expected motion within the video. It is best chosen experimentally by running the stabilizer on old video content, and making a compromise between how large of a motion can be stabilized and how much resolution will be lost. At 1920x1080, a crop of around 3-4% is enough for removing most vibrations, while a larger crop of 8-12% may be necessary to handle aggressive swaying motions caused by walking or running. Motions which cannot be stabilized with the given crop will appear as sudden jerks in a previously smooth video.

  • I'm using a shoulder-mounted camera, why does my body become shaky? When the camera is mounted on your shoulder, it shakes in synchrony with your body movement. This means you are perceived as stable relative to the camera, while the landscape is seen as shaky. When stabilization is applied, the landscape is forced to become still by counter-moving it based on its perceived shake. You are assumed to be part of the landscape, so the counter-movement forces you to become shaky while the landscape becomes stable.

  • Why does my stabilization look jittery or like jello? Jittery stabilization is often the result of an inappropriate motion model; try setting the Subsystem to Homography. The jello effect is most often caused by your camera's rolling shutter distortions, which become accentuated once stabilization is applied. This is best fixed at the source by configuring your camera to minimize rolling shutter distortions.

  • Why is my audio out of sync? If you are using the video stabilizer as an effects filter, you will need to manually delay your audio by the amount shown in the Stream Delay setting. One way of doing this is by adjusting the sync offset of the source via the Advanced Audio Properties screen. This can be accessed by clicking one of the cog buttons in the OBS Audio Mixer, then clicking the 'Advanced Audio Properties' button.

  • My video won't stabilize or has distortions no matter what settings I use, what can I do? If no combination of settings is able to produce a stabilized video, then your video most likely cannot be stabilized by the LiveVisionKit stabilizer in its current state. You may want to wait for future updates or try other existing stabilization solutions. However, consider that this method of stabilizing video in 'post-production' is heavily restricted by the amount of reliable information that can be extracted from the video content. There is currently no perfect solution for doing this, and certainly not one that also runs at over 60FPS. The best stabilization will always be achieved by fixing the problem at the source!
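The crop trade-off discussed in the FAQ is easy to quantify. As a rough sketch (the helper name is hypothetical, and it assumes the same symmetric percentage crop on both axes, i.e. Independent X/Y Crop disabled):

```python
def resolution_after_crop(width: int, height: int, crop_pct: float) -> tuple:
    """Pixels remaining after applying a percentage crop to both axes."""
    return (round(width * (1 - crop_pct)), round(height * (1 - crop_pct)))

# At 1920x1080, the crop ranges suggested above cost roughly:
print(resolution_after_crop(1920, 1080, 0.04))  # (1843, 1037) - vibration removal
print(resolution_after_crop(1920, 1080, 0.10))  # (1728, 972)  - walking/running sway
```

Running this for a few candidate percentages against your own resolution makes the compromise concrete: each extra percent of crop buys headroom for larger camera movements at the price of a proportional slice of output resolution.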