Understanding the Gamma Correction Algorithm: Theory and Applications
1. Introduction
Gamma correction, often referred to simply as "gamma," is a nonlinear operation used to encode and decode luminance values in computer vision. Its original primary purpose was to compensate for the nonlinear response to applied voltage of older display devices, such as cathode ray tube (CRT) and liquid crystal display (LCD) monitors, ensuring that an image's color and brightness appear consistent across different display devices and natural to the human eye. This compensation also aligns with the human eye's nonlinear sensitivity to brightness, which distinguishes darker tones more finely than lighter ones.
Gamma correction addresses the nonlinear sensitivity of human vision to brightness. The human eye perceives differences in darker tones more distinctly than in lighter ones, necessitating a nonlinear adjustment to ensure visual accuracy in digital images. This perceptual alignment is critical for realistic rendering of scenes in applications ranging from photography to computer graphics. Even though modern display technologies such as light-emitting diode (LED) displays do not share the nonlinear behavior of CRT and LCD monitors, gamma correction is still used in display devices today. The image signal processor (ISP) in an LED monitor emulates the nonlinear behavior of CRT and LCD monitors to maintain visual consistency across different display devices, ensuring backward compatibility with older displays.
Furthermore, gamma correction has become integral to modern digital imaging standards, such as the sRGB color space. These standards ensure consistent visual output across devices, from mobile screens to professional displays, by encoding images with a predefined gamma value. Without gamma correction, modern devices would struggle to maintain compatibility with legacy systems or achieve visually accurate color and brightness representation.
2. Mathematical Definition
Gamma correction is defined by a power-law relationship between the normalized input pixel value and the normalized output pixel value:

V_out = (V_in)^gamma

Input pixel values are typically normalized to the range 0 to 1. Note that the larger the value, the brighter the pixel, and vice versa.
The power-law function of gamma correction ensures that pixel intensity values are adjusted to match human perception, in which smaller changes in dark regions are more noticeable than similar changes in brighter regions. The gamma correction function modifies brightness in a way that aligns with this perceptual characteristic.
The only variable in this equation is gamma. If gamma equals one, V_out and V_in have the same value; in other words, the relationship is linear. When gamma is larger than one, the output pixels become darker. In contrast, when gamma is smaller than one, the output pixels become brighter.
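The behavior described above can be sketched in a few lines of Python. This is a minimal illustration of the power-law V_out = (V_in)^gamma on normalized values; the mid-gray value 0.5 is an arbitrary choice for demonstration.

```python
def gamma_correct(v_in: float, gamma: float) -> float:
    """Apply the power-law V_out = V_in ** gamma to a normalized pixel value."""
    return v_in ** gamma

mid_gray = 0.5
print(gamma_correct(mid_gray, 1.0))      # gamma = 1: output equals input (linear)
print(gamma_correct(mid_gray, 2.2))      # gamma > 1: output is smaller, i.e. darker
print(gamma_correct(mid_gray, 1 / 2.2))  # gamma < 1: output is larger, i.e. brighter
```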
3. Mechanism in Display Technologies
As discussed previously, in older display technologies such as CRT and LCD, the relationship between input voltage and emitted light intensity was nonlinear. These displays were found to follow a power-law function with a certain exponent (the gamma variable). This nonlinearity implies that if an image were displayed as-is, without any correction (in other words, linearly), it would appear darker than intended. Gamma correction fixes this through a two-step process: "encoding" (pre-correcting) the image and "decoding" (post-correcting) it for display. Define gamma_encode = 1 / gamma_decode, where gamma_decode = 2.2 (the standard gamma correction value); gamma_decode is larger than one, while gamma_encode, its inverse, is smaller than one.
- Gamma Encoding (Pre-Correction): Image data is gamma-encoded by applying correction to its pixel values with gamma_encode = 1/2.2 ≈ 0.4545. This encoding compresses the dynamic range of the image, emphasizing darker tones to align with human visual sensitivity. The encoding step is especially important for compressing the dynamic range of high-bit-depth images, ensuring that darker tones remain visible without significantly altering midtones and highlights.
- Gamma Decoding (Post-Correction): Upon display, the encoded image data is gamma-decoded by applying gamma correction with gamma_decode = 2.2. This expands the dynamic range and restores the intended original brightness for accurate visual representation across platforms. Decoding also plays a crucial role in maintaining backward compatibility with legacy content created for CRT displays, ensuring visual fidelity even on modern LED or OLED screens.
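The two-step pipeline above can be sketched as follows. This is a minimal illustration assuming normalized pixel values in [0, 1] and the standard gamma_decode = 2.2; the function names are illustrative.

```python
GAMMA_DECODE = 2.2
GAMMA_ENCODE = 1 / GAMMA_DECODE  # ~0.4545

def encode(v: float) -> float:
    """Pre-correction: compress dynamic range, lifting darker tones."""
    return v ** GAMMA_ENCODE

def decode(v: float) -> float:
    """Post-correction: the display expands the range back."""
    return v ** GAMMA_DECODE

pixel = 0.2                  # a dark normalized pixel value
stored = encode(pixel)       # brighter encoded value stored/transmitted
displayed = decode(stored)   # display restores the intended brightness
print(round(displayed, 6))   # round-trip recovers ~0.2
```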
Modern display systems integrate gamma correction as a hardware feature, often managed by GPUs or image signal processors (ISPs). This ensures real-time processing during image rendering and display. In addition, software tools like Adobe Photoshop and Lightroom allow users to manually apply gamma correction to images for creative and corrective purposes.
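In hardware, gamma correction over 8-bit pixels is commonly implemented as a precomputed lookup table (LUT) rather than a per-pixel power evaluation. The sketch below is a hypothetical illustration of that idea; the 256-entry table and the gamma value 2.2 follow the text, while the function name is an assumption.

```python
def build_gamma_lut(gamma: float) -> list[int]:
    """Map each 8-bit level through (v/255) ** (1/gamma), rescaled to 0-255."""
    inv = 1.0 / gamma
    return [round(((level / 255.0) ** inv) * 255.0) for level in range(256)]

lut = build_gamma_lut(2.2)

# Applying the correction is then a cheap per-pixel table lookup:
pixels = [0, 64, 128, 255]
corrected = [lut[p] for p in pixels]
print(corrected)  # endpoints (0, 255) are preserved; midtones are lifted
```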
4. Standard Value of Gamma Correction
The standard gamma correction value is 2.2 (decoding) or its inverse, 0.45 (encoding), which traces back to the behavior of CRT displays. As discussed before, the relationship between input voltage and emitted light intensity in a CRT follows a power law with an exponent (gamma) of around 2.5. As display technology evolved into LCD and LED displays, gamma correction was eventually standardized at the value known today, 2.2, which is applied in the sRGB color space to ensure uniformity across display systems. The value 2.2 is not only a technical standard but also a perceptual optimization, designed to closely mimic the way human vision processes brightness. This standard is embedded in the sRGB color space, which dominates digital imaging workflows across the web, mobile devices, and desktop applications.
5. Wider Applications Beyond Display Technologies
The original primary purpose of gamma correction was to compensate for the nonlinear voltage response of older display technologies such as CRT and LCD. Today, gamma correction is applied in much broader imaging applications with the same goal: maintaining consistent image color and brightness across different platforms:
- Photography: Film in analog cameras also has a nonlinear response, similar to CRT and LCD displays, so gamma correction is used to compensate for this nonlinearity and keep captured images consistent. Similarly, in digital cameras, gamma correction adjusts the raw sensor data from the photosensors so that images appear natural and reflect the true scene's brightness and color.
- Image compression technology: Gamma correction ensures that rendered images have consistent brightness and color across display devices (typically at gamma = 2.2), so that images do not appear too bright or too dark while maintaining color integrity.
- Medical Imaging: Gamma correction is used to enhance the visibility of medical images such as X-rays and MRIs, adjusting its contrast and brightness to highlight specific features.
- Television and Broadcasting: Broadcasting standards use gamma correction to ensure that video content appears consistent across different televisions.
- Printing and Publishing: Gamma correction adjusts digital images to account for the nonlinear response of printers and inks, ensuring the printed output matches the intended design's color and brightness.
- Low-Light Image Enhancement: Gamma correction can be used as a low-light enhancement algorithm, rivaling neural-network-based approaches.
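As a concrete illustration of the last application, a gamma value below one brightens underexposed regions. The sketch below assumes an 8-bit grayscale image stored as a list of lists; the gamma value 0.5 and the function name are illustrative choices, not standards from the text.

```python
def enhance_low_light(image: list[list[int]], gamma: float = 0.5) -> list[list[int]]:
    """Brighten dark regions by applying gamma < 1 to normalized 8-bit pixels."""
    return [
        [round(((p / 255.0) ** gamma) * 255.0) for p in row]
        for row in image
    ]

dark_image = [[10, 20], [40, 80]]     # mostly underexposed pixel values
print(enhance_low_light(dark_image))  # dark pixels are lifted substantially
```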
Source(s):
- https://paroj.github.io/gltut/Illumination/Tut12%20Monitors%20and%20Gamma.html
- https://www.baeldung.com/cs/gamma-correction-brightness
- https://pyimagesearch.com/2015/10/05/opencv-gamma-correction/
- https://link.springer.com/chapter/10.1007/978-981-15-4466-8_3
Note: Only images are taken from Wikipedia