Camera Object
From the standpoint of the app, the Camera object delivers frames through a callback or other signaling system. The Camera independently chooses video sources from:
- the cameras visible to the browser
- built in test video files (like AUMI Sings)
- external video files and streams accessed via URL; we may detect formats automatically, or support only a few of them
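As one possible sketch (all names hypothetical), the three source kinds could be modeled as a small tagged union, classified from a user-supplied spec string:

```javascript
// Hypothetical sketch: classify a source spec into one of the three
// source kinds the Camera object can draw from.
const BUILT_IN_VIDEOS = ["AUMI Sings"]; // built-in test clips (names assumed)

function classifySource(spec) {
  if (BUILT_IN_VIDEOS.includes(spec)) {
    return { kind: "builtin", name: spec };
  }
  if (/^https?:\/\//.test(spec)) {
    return { kind: "url", url: spec }; // external video file or stream
  }
  // Otherwise treat it as a camera deviceId, e.g. one reported by
  // navigator.mediaDevices.enumerateDevices() in the browser.
  return { kind: "camera", deviceId: spec };
}
```

The Camera object can then switch on `kind` when opening the source.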
Certain sources (the cameras) may offer properties to choose from: resolution, frame rate, image brightness, contrast, gamma, f-stop, etc. We will request specific property values and hope the camera can comply closely, downgrading gracefully on a mismatch.
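A minimal sketch of the "request and downgrade" idea, assuming the device reports min/max ranges for each property (much like the browser's MediaTrackCapabilities): clamp each requested value into the supported range, and skip properties the device does not expose at all.

```javascript
// Hypothetical sketch: clamp requested camera settings into the ranges
// the device actually supports, downgrading on a mismatch.
function negotiateSettings(requested, capabilities) {
  const settled = {};
  for (const [prop, want] of Object.entries(requested)) {
    const cap = capabilities[prop];
    if (!cap) continue; // device doesn't expose this property
    settled[prop] = Math.min(Math.max(want, cap.min), cap.max);
  }
  return settled;
}
```

For example, requesting 60 fps from a camera whose capability range tops out at 30 fps would settle on 30.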
The source image is affine transformed and possibly clipped (zoomed) before further processing, to make that processing cheaper.
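One way to express that transform step (a hypothetical sketch): a 2×3 affine matrix maps each output pixel back to a source-image coordinate, and a centered zoom/clip is then just a uniform scale plus a translation.

```javascript
// Hypothetical sketch: a 2x3 affine matrix [a, b, tx, c, d, ty] maps an
// output pixel (x, y) back to a source-image coordinate.
function applyAffine(m, x, y) {
  const [a, b, tx, c, d, ty] = m;
  return [a * x + b * y + tx, c * x + d * y + ty];
}

// A centered zoom samples the middle (1/zoom) portion of the source:
// a uniform scale of 1/zoom plus a translation to re-center it.
function centeredZoom(zoom, srcW, srcH) {
  const s = 1 / zoom;
  return [s, 0, (srcW - srcW * s) / 2, 0, s, (srcH - srcH * s) / 2];
}
```

With a 2x zoom of a 640x480 source, output pixel (0, 0) samples source (160, 120), so later stages only ever touch the clipped region.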
Sources that lack these image-tweaking properties may implement some of them in software, as video processing inside the Camera object. Video files may also support changing looping ranges and playback speeds.
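The software fallback for brightness, contrast, and gamma could look like this per-channel function (a sketch with assumed conventions: brightness as an offset, contrast scaled about mid-grey, gamma as a power curve on 0-255 values):

```javascript
// Hypothetical sketch: software fallback for brightness/contrast/gamma
// on a single 0-255 channel value, for sources lacking these controls.
function adjustChannel(v, { brightness = 0, contrast = 1, gamma = 1 }) {
  let x = v / 255;
  x = (x - 0.5) * contrast + 0.5 + brightness; // contrast about mid-grey, then offset
  x = Math.min(1, Math.max(0, x));             // clamp to the valid range
  x = Math.pow(x, 1 / gamma);                  // standard gamma correction
  return Math.round(x * 255);
}
```

In practice this would run over every pixel of the frame (or, cheaper, through a 256-entry lookup table built once per settings change).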
There may be a split between the frame presented to the user for UI purposes and the frame presented to the tracker; for now, they can be the same. Trackers will further process the frame for their own needs.
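One way to keep the door open for that split (hypothetical names): deliver each frame through two callback slots that currently receive the same object, so the two paths can diverge later without changing callers.

```javascript
// Hypothetical sketch: fan a frame out to UI and tracker consumers.
// For now both receive the same frame; later the tracker path could
// receive a differently processed copy.
class FrameBus {
  constructor() {
    this.uiSinks = [];
    this.trackerSinks = [];
  }
  onUiFrame(cb) { this.uiSinks.push(cb); }
  onTrackerFrame(cb) { this.trackerSinks.push(cb); }
  publish(frame) {
    for (const cb of this.uiSinks) cb(frame);
    const trackerFrame = frame; // identical for now; may diverge later
    for (const cb of this.trackerSinks) cb(trackerFrame);
  }
}
```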
The Camera object's frame rate may be artificially set lower than what the source provides, but the full processed frames should still be delivered in a timely fashion. The Camera should monitor its properties for changes, or wait for an explicit update call, rather than constantly reconfiguring.
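The artificial frame-rate cap amounts to dropping frames between deliveries. A minimal sketch (hypothetical names), where the Camera calls `shouldDeliver` with each incoming frame's timestamp and only forwards the frame when it returns true:

```javascript
// Hypothetical sketch: drop frames to honour an artificially lowered
// frame rate. shouldDeliver(nowMs) returns true only when at least one
// target interval has elapsed since the last delivered frame.
function makeThrottle(targetFps) {
  const intervalMs = 1000 / targetFps;
  let lastDelivered = -Infinity;
  return function shouldDeliver(nowMs) {
    if (nowMs - lastDelivered < intervalMs) return false;
    lastDelivered = nowMs;
    return true;
  };
}
```

At a 10 fps target, frames arriving at 0, 50, 100, 150, and 210 ms would be delivered, dropped, delivered, dropped, and delivered.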