Camera calibration
The camera calibration involves determining a set of quantities that describe various properties of the sensor, the optical system and the orientation of the camera. These can be roughly divided into the electronic, photometric and geometric calibrations described below. To achieve accurate meteor triangulation it is essential to calibrate accurately both the projective geometry of the camera, which relates positions in the image to directions in the three-dimensional coordinate frame of the camera, and the point spread function, which is used to precisely measure the positions of point sources in an image. There are also several calibrations of secondary importance that should be performed to fully optimise the system.
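For example, the position of a star or meteor point in a single image might be measured by fitting the point spread function or, more simply, by an intensity-weighted centroid. The sketch below (Python/NumPy, illustrative only; not the project's actual code) uses the simpler centroid approach as a stand-in for a full PSF fit.

```python
import numpy as np

def centroid(image, x0, y0, half_width=5):
    """Measure the position of a point source by an intensity-weighted
    centroid in a small window around an initial guess (x0, y0). This
    stands in for a full PSF fit, purely as an illustration."""
    ys = np.arange(y0 - half_width, y0 + half_width + 1)
    xs = np.arange(x0 - half_width, x0 + half_width + 1)
    window = image[np.ix_(ys, xs)].astype(float)
    window -= np.median(window)             # crude background subtraction
    window = np.clip(window, 0.0, None)
    total = window.sum()
    yc = (window.sum(axis=1) * ys).sum() / total
    xc = (window.sum(axis=0) * xs).sum() / total
    return xc, yc

# Synthetic example: a single bright pixel on a noisy background.
rng = np.random.default_rng(1)
img = rng.normal(100.0, 3.0, size=(64, 64))
img[30, 40] += 500.0                        # synthetic point source
print(centroid(img, x0=40, y0=30))
```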
Calibration methodology
The calibration is performed by taking multiple images (61 in the baseline configuration) during a stable period with no meteors or other moving objects present. It is important that the scene remains as stationary and unchanging as possible for the duration of the calibration run. The multiple images allow the noise properties of the sensor to be characterised, and also enable a reduced-noise median image to be computed for use in the geometric and photometric calibrations.
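As a sketch of the stacking step, the following Python/NumPy fragment (illustrative only; not the project's actual code) computes a per-pixel median image and a robust per-pixel scatter estimate from a stack of calibration frames.

```python
import numpy as np

def median_stack(frames):
    """Compute a reduced-noise median image and a robust per-pixel
    scatter estimate from a stack of calibration frames.

    frames : array of shape (n_frames, height, width)
    """
    stack = np.asarray(frames, dtype=np.float64)
    median_image = np.median(stack, axis=0)
    # Robust scatter estimate: median absolute deviation scaled to
    # approximate a Gaussian standard deviation.
    mad = np.median(np.abs(stack - median_image), axis=0)
    sigma_image = 1.4826 * mad
    return median_image, sigma_image

# Synthetic frames standing in for the 61 calibration images.
rng = np.random.default_rng(0)
frames = 100.0 + rng.normal(0.0, 5.0, size=(61, 480, 640))
median_image, sigma_image = median_stack(frames)
```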
Electronic calibration
The electronic calibration is relatively limited in scope. At the present time this only covers the readout noise in the image.
Readout noise
Knowledge of the readout noise is useful for setting appropriate thresholds for the detection of moving objects; essentially it helps one to know whether the change in the value of a pixel from one frame to the next is due to statistical fluctuations or indicates that the scene is changing (e.g. due to a meteor).
In scientific applications the readout noise is generally modelled as a Gaussian process. The main noise sources are assumed to be readout noise and shot noise; the latter is caused by statistical fluctuations in the number of photons received from a source and is modelled as a Poisson process. However, initial investigations with two cameras (a Watec 902H-Ultimate and a Chicony Electronics webcam) suggest that these devices have unspecified noise reduction algorithms built in at the hardware level, making it difficult to apply a theoretical model of the noise. In practice an empirical model may be more useful.
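One way to build such an empirical model is to bin pixels by their median signal level and measure the observed scatter in each bin, making no assumption about the Gaussian-plus-Poisson form. The sketch below (Python/NumPy, illustrative only) follows that approach.

```python
import numpy as np

def empirical_noise_model(frames, n_bins=32):
    """Bin pixels by their median signal level and measure the
    empirical scatter in each bin, giving noise as a function of
    signal without assuming a Gaussian + Poisson model.

    frames : array of shape (n_frames, height, width)
    """
    stack = np.asarray(frames, dtype=np.float64)
    signal = np.median(stack, axis=0).ravel()
    scatter = np.std(stack, axis=0, ddof=1).ravel()

    edges = np.linspace(signal.min(), signal.max(), n_bins + 1)
    bin_index = np.clip(np.digitize(signal, edges) - 1, 0, n_bins - 1)

    levels, noise = [], []
    for b in range(n_bins):
        mask = bin_index == b
        if np.any(mask):
            levels.append(signal[mask].mean())
            noise.append(np.median(scatter[mask]))
    return np.array(levels), np.array(noise)
```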
Geometric calibration
The geometric calibration of the camera refers to the problem of determining the relation between points in the image frame (2D pixel coordinates) and directions in the camera frame (3D spatial coordinates), which is a three-dimensional reference frame centred on the camera, with one axis along the boresight and the other two perpendicular to it. This calibration is necessary to determine the direction towards a meteor in the Earth-centred reference frame in order to perform the triangulation. The calibration is performed using observations of reference stars in the field of view.
Intrinsic parameters [focal length, principal point, radial distortion]
The intrinsic parameters of the camera calibration are those associated with the projective optics of the camera lens system. The parameter set depends on the particular camera model adopted, which in turn depends on the type of lens used. It's likely that only two models will be required: that of a fisheye lens (in the case of all-sky cameras), and that of a pinhole camera with radial distortion. The latter model is suitable for the fast lenses used by NEMETODE systems and will be the baseline. The parameters of this model are the focal length, the principal point (the location of the camera projection centre in the image) and the coefficients of a low-order polynomial used to characterise the radial distortion.
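A minimal sketch of the baseline model is given below (Python, illustrative only; the parameter values are hypothetical): a 3D direction in the camera frame is projected to pixel coordinates through the focal length, principal point and a two-coefficient radial distortion polynomial.

```python
def project(direction, fx, fy, px, py, k1, k2):
    """Project a 3D direction in the camera frame to pixel coordinates
    using a pinhole model with a low-order radial distortion polynomial.

    direction : (x, y, z) with z along the boresight
    fx, fy    : focal length in pixels along each image axis
    px, py    : principal point (pixels)
    k1, k2    : radial distortion coefficients
    """
    x, y, z = direction
    # Ideal (undistorted) normalised image coordinates.
    xn, yn = x / z, y / z
    # Radial distortion: scale the normalised coordinates by a
    # polynomial in the squared radial distance from the optical axis.
    r2 = xn * xn + yn * yn
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    xd, yd = xn * scale, yn * scale
    # Apply focal length and principal point to get pixel coordinates.
    return fx * xd + px, fy * yd + py

# Hypothetical parameter values, for illustration only.
i, j = project((0.1, -0.05, 1.0), fx=950.0, fy=950.0,
               px=320.0, py=240.0, k1=-0.12, k2=0.01)
print(i, j)
```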
The camera model should be modular and easily swapped out for other types in order to support different types of camera.
Extrinsic parameters [location, orientation]
The orientation of the camera can be constrained using observations of reference stars, provided a reasonable initial guess is available. The location in the Earth-centred reference frame cannot be constrained from single-site observations, and must be supplied by the user.
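Once the orientation is known, a direction measured in the camera frame can be rotated into the Earth-centred frame. The sketch below (Python/NumPy, illustrative only; the rotation matrix values are hypothetical) shows this step with the orientation represented as a rotation matrix.

```python
import numpy as np

def camera_to_earth(direction_cam, R_cam_to_earth):
    """Rotate a unit direction vector from the camera frame into the
    Earth-centred frame using the camera orientation (extrinsic) matrix."""
    d = np.asarray(direction_cam, dtype=float)
    d = d / np.linalg.norm(d)
    return R_cam_to_earth @ d

# Hypothetical orientation: boresight tilted 30 degrees about the
# Earth-frame x axis, for illustration only.
theta = np.radians(30.0)
R = np.array([[1.0, 0.0, 0.0],
              [0.0, np.cos(theta), -np.sin(theta)],
              [0.0, np.sin(theta),  np.cos(theta)]])
print(camera_to_earth([0.0, 0.0, 1.0], R))
```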
Note that when merging observations from several sites and solving for the triangulation of the meteor trajectory, it may be possible to additionally solve for the precise relative locations of the different cameras, perhaps giving a more accurate location than can be obtained from GPS etc. It might also be worth performing simultaneous solutions for multiple meteor events to better constrain the camera locations in the network.
Photometric calibration
The photometric calibration of the camera refers to the problem of converting the levels recorded by each pixel (in digital units) to estimates of the incident flux (for example in photons per second). This again will be determined using observations of reference stars of known magnitude. For triangulation purposes this calibration is of lesser importance than the geometric calibration.
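A minimal sketch of this step, assuming the usual magnitude relation m = ZP - 2.5 log10(counts), is to estimate the zero point ZP from reference stars with catalogue magnitudes and measured instrumental counts (Python/NumPy, illustrative only; the measurement values are hypothetical).

```python
import numpy as np

def fit_zero_point(counts, catalogue_mags):
    """Estimate the photometric zero point ZP in the relation
    m = ZP - 2.5 * log10(counts), given measured counts and
    catalogue magnitudes for a set of reference stars."""
    counts = np.asarray(counts, dtype=float)
    catalogue_mags = np.asarray(catalogue_mags, dtype=float)
    instrumental = -2.5 * np.log10(counts)
    # Median is robust to a few misidentified or saturated stars.
    return np.median(catalogue_mags - instrumental)

# Hypothetical measurements, for illustration only.
zp = fit_zero_point([12000.0, 4800.0, 30000.0], [3.1, 4.1, 2.1])
expected_counts = 10 ** (-0.4 * (2.5 - zp))   # counts expected for a mag 2.5 star
```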
Reference star catalogue
The reference star catalogue used in the calibration will probably be based on Hipparcos and Tycho-2. It needs to contain bright stars that are observable at high signal-to-noise in stacks of ten or so frames. Some trial and error will probably be needed to obtain a catalogue dense enough to provide sufficient stars in a single field of view to fully constrain the geometric model, but not so dense that incorrect reference star associations are made.
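The kind of trade-off involved can be sketched as below (Python, illustrative only; the catalogue format shown is hypothetical): filter the catalogue to a magnitude limit and estimate the expected number of reference stars per field of view, assuming a roughly uniform sky density.

```python
def select_reference_stars(catalogue, mag_limit, fov_sq_degrees):
    """Filter a star catalogue (list of dicts with 'ra', 'dec', 'mag';
    a hypothetical format) down to a magnitude limit, and report the
    expected number of stars per field of view assuming a roughly
    uniform sky density."""
    selected = [s for s in catalogue if s['mag'] <= mag_limit]
    whole_sky_sq_degrees = 41253.0
    expected_per_fov = len(selected) * fov_sq_degrees / whole_sky_sq_degrees
    return selected, expected_per_fov

# Hypothetical toy catalogue, for illustration only.
catalogue = [{'ra': 88.8,  'dec': 7.4,  'mag': 0.5},
             {'ra': 279.2, 'dec': 38.8, 'mag': 0.03},
             {'ra': 114.8, 'dec': 5.2,  'mag': 0.4}]
stars, per_fov = select_reference_stars(catalogue, mag_limit=4.0,
                                        fov_sq_degrees=30.0 * 22.0)
```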
Calibration process
It's likely that a running calibration will be used to allow for gradual evolution of the calibration: every ten minutes or so a selection of images will be used to solve for the camera calibration parameters, and the result will be merged into a running solution in a Kalman-filter-like process, so that individual noisy or poorly constrained solutions are filtered out and the current best-guess solution is always available.
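A minimal sketch of the Kalman-filter-like merge, applied to a single scalar calibration parameter (Python, illustrative only; the values are hypothetical), is shown below.

```python
def update_running_solution(mean, variance, new_value, new_variance,
                            process_variance=0.0):
    """Merge a new single-epoch calibration estimate into a running
    solution, one scalar parameter at a time, in a Kalman-filter-like
    way: inflate the running variance to allow slow drift, then take
    a variance-weighted average with the new measurement."""
    # Allow the calibration to drift slowly between updates.
    variance = variance + process_variance
    # Kalman gain: how much weight the new, possibly noisy, estimate gets.
    gain = variance / (variance + new_variance)
    mean = mean + gain * (new_value - mean)
    variance = (1.0 - gain) * variance
    return mean, variance

# Example: running estimate of the focal length (values hypothetical).
mean, var = 950.0, 4.0
mean, var = update_running_solution(mean, var, new_value=953.0,
                                    new_variance=25.0, process_variance=0.5)
```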