What the heck is grayscale gamma?

If you’ve ever explored the menu options on your TV, chances are that you’ve seen something called “gamma” or “grayscale gamma.” “Grayscale” itself is pretty self-explanatory (it’s the scale of shades running from black through gray to white), but what does “gamma” have to do with anything?

Let’s take a trip through time and look back to an era when televisions were huge, heavy boxes that housed cathode ray tubes (CRTs). In those days, the picture on your TV was actually the result of an electron gun shooting electrons down a vacuum tube to strike tiny chemical dots (called “phosphors”) painted on the back of the glass TV screen. These dots were meant to light up in three colors: red, green, and blue. These red, green, and blue phosphors were the predecessors of the pixel, which is what makes up a modern-day LCD TV screen.

[Image: a cathode ray tube (CRT) television]
Credit: Getty Images users Thampapon and Pillon

In the early days of TV, manufacturers ran into a problem. Because the first televisions rendered images only in black and white, the grayscale was very important (and, as we know, it remains important even now that pictures are shown in color). They wanted to be able to show images with depth and shadow, but without having every image look too dark or too washed out.

Back in the CRT days, the amount of light visible on the screen was related to the voltage applied. It’s like a dimmer switch for the lights in your home: as you slide the dimmer up, more voltage is applied, and the light bulb gets brighter. For ease of discussion, the luminance (or brightness) of the image was divided into increments of the total luminance, where 100% represented the maximum brightness possible for that TV and 0% represented total darkness. The levels in between were usually described in increments of 5% or 10% (also referred to as increments of 5 or 10 IRE), and the TV grayscale was born.

Okay, you think, this all sounds perfectly logical. So what’s the problem? Bear with me for a little bit. It was eventually discovered that the relationship between the applied voltage and the resulting brightness was NOT linear; that is, increasing the voltage by 10% would not increase the brightness by 10%, but by some amount more or less than 10%. It turns out that the input voltage and brightness of a CRT TV are related by a gamma law, which takes the form of...

$$L_{out} = A_1 \, V_{in}^{\gamma}$$

where Vin is the input voltage, A1 is a constant, γ (gamma, a Greek symbol) is a constant exponent, and Lout is the output light level. Here's what it looks like on a chart:

[Chart: output brightness vs. input voltage for a gamma of 2.5]
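If you want to see those numbers for yourself, here’s a minimal sketch in Python of the gamma law above, assuming A1 = 1 and a gamma of 2.5 (values chosen to match the chart and the examples later in this article). Equal steps in voltage clearly do not produce equal steps in brightness.

```python
# A minimal sketch of the CRT gamma law L_out = A1 * V_in^gamma,
# assuming A1 = 1 and gamma = 2.5 (illustrative values).

GAMMA = 2.5

def crt_luminance(v_in: float, a1: float = 1.0, gamma: float = GAMMA) -> float:
    """Relative light output (0.0-1.0) for a relative input voltage (0.0-1.0)."""
    return a1 * (v_in ** gamma)

# Step the input voltage in equal 10% increments and print the resulting brightness.
for pct in range(0, 101, 10):
    v_in = pct / 100
    print(f"{pct:3d}% voltage -> {crt_luminance(v_in) * 100:5.1f}% brightness")
```

Running it shows, for example, that 50% voltage lands at only about 18% brightness, which is the same mismatch the worked example later in this article walks through.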

As a result, if you divide that non-linear curve into equal 10% increments (as though it were linear), from off (total darkness) to fully on (full brightness), this is what you get:

[Image: the “linear” grayscale, with shades spaced in equal brightness increments]

This is the so-called “linear” grayscale. If you look at it, this grayscale has a lot of different shades of light and fewer shades of black. The problem is that the human eye has much more trouble differentiating between lighter shades than between darker shades; humans would perceive the “linear” grayscale on a TV as being too bright and washed out, with not enough subtle shading and detail.

To address this problem, cameras built to record television content were designed to apply a “gamma correction”:

$$V_{out} = A_2 \, L_{in}^{1/\gamma}$$

Here, Lin is the light level recorded by the camera, A2 is a constant that may or may not be equal to A1, Vout is the voltage level output by the camera, and γ is identical to the gamma value in the first equation.

Let’s combine these equations. The output voltage of the television camera is equal to the input voltage received by the television, so we can substitute one into the other:

$$L_{out} = A_1 \, V_{in}^{\gamma} = A_1 \left( A_2 \, L_{in}^{1/\gamma} \right)^{\gamma} = A_1 A_2^{\gamma} \, L_{in} = A_3 \, L_{in}$$

A3 is a new constant that equals A1·A2^γ. We now see from the last equation that, with the gamma correction, the input (recorded) light levels and the output (TV) light levels are linearly related to one another.
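Here’s a small sketch of that cancellation in Python, assuming A1 = A2 = 1 and a gamma of 2.5 (the same illustrative values used in the example below). Encoding with the exponent 1/γ at the camera and decoding with the exponent γ at the TV returns the original light level.

```python
# Sketch of the camera-to-CRT chain, assuming A1 = A2 = 1 and gamma = 2.5.
GAMMA = 2.5

def camera_encode(l_in: float, a2: float = 1.0, gamma: float = GAMMA) -> float:
    """Gamma-corrected output voltage for a recorded light level (both 0.0-1.0)."""
    return a2 * (l_in ** (1.0 / gamma))

def crt_decode(v_in: float, a1: float = 1.0, gamma: float = GAMMA) -> float:
    """Light produced by the CRT for a given input voltage (both 0.0-1.0)."""
    return a1 * (v_in ** gamma)

# The two exponents cancel, so the displayed level tracks the recorded level linearly.
for l_in in (0.1, 0.25, 0.5, 0.9):
    l_out = crt_decode(camera_encode(l_in))
    print(f"recorded {l_in:.2f} -> displayed {l_out:.2f}")
```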

Let’s try an example: Say that you’re receiving a signal from a TV camera that does not have the gamma correction applied, and instead, it generates an output voltage that exactly matches the input light level. In this case, a 50% brightness level would correspond to 50% of the total possible applied voltage. If you take that 50% total applied voltage and put it into the electron gun of your CRT TV (assuming a gamma value of 2.5 and that A1=1),

$$L_{out} = A_1 \, V_{in}^{\gamma} = (1)(0.50)^{2.5} \approx 0.177$$

that would create an image that has about 18% of the total brightness, which is not at all what you were recording on the camera.

[Chart: the gamma 2.5 curve, with the 50% voltage / ~18% brightness point marked]

Now assume that a TV camera is designed to employ the gamma correction. Assume that A2=A1=1, and γ again equals 2.5. In this case, an input light level of ~17.7% yields an output voltage of 50%:

$$V_{out} = A_2 \, L_{in}^{1/\gamma} = (1)(0.177)^{1/2.5} \approx 0.50$$

[Chart: the inverse (1/2.5 exponent) gamma-correction curve, with the ~17.7% light / 50% voltage point marked]

And that output voltage, now applied to the TV as the input voltage:

$$L_{out} = A_1 \, V_{in}^{\gamma} = (1)(0.50)^{2.5} \approx 0.177$$

creates a brightness of ~17.7%, the same level that was recorded by the TV camera. The effect of using the camera’s gamma correction in conjunction with the TV’s gamma response is that the input and output light levels are linearly related to one another.
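If you want to double-check those numbers, the whole worked example boils down to three lines of Python (again assuming A1 = A2 = 1 and a gamma of 2.5):

```python
# Quick numerical check of the worked example (gamma = 2.5, A1 = A2 = 1).
print(0.50 ** 2.5)                  # ~0.177: uncorrected 50% voltage -> ~17.7% brightness
print(0.177 ** (1 / 2.5))           # ~0.50:  the camera correction maps ~17.7% light -> 50% voltage
print((0.177 ** (1 / 2.5)) ** 2.5)  # ~0.177: the round trip restores the recorded light level
```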

Once the gamma correction is applied, a linear relationship exists between the recorded and displayed light level, as you can see below.

[Chart: displayed vs. recorded light level for the full camera-plus-TV chain, a straight line with an effective gamma of 1]

Using the gamma correction creates a gamma grayscale, which is shown below. As you can see, there’s much more emphasis on the darker tones.

[Image: the gamma grayscale, with more shades devoted to darker tones]

Throughout this example, you may have been wondering why I used a gamma value of 2.5. Both in the past and in the present, gamma values for televisions typically range from 2.0 to 2.5, and can often be adjusted based on the viewer’s preference or viewing environment.

[Chart: brightness vs. input signal for several gamma values in the 2.0–2.5 range]

With the same input voltage, lower gamma values generate brighter shades, meaning that lower gamma values are best for rooms with more ambient lighting; the TV’s brightness has to be able to compete with the surrounding light, or the picture will appear washed out. Higher gamma values generate a wider variety of darker shades (as we saw), and so are best suited for dark viewing rooms, where your eyes can adjust and more readily discern the differences.
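To put some numbers on that, here’s a quick sketch comparing how the same 50% input renders at a few gamma settings in that 2.0–2.5 range (the specific values are just illustrative):

```python
# How a 50% input signal renders at different gamma settings (illustrative values).
mid_gray = 0.5

for gamma in (2.0, 2.2, 2.4, 2.5):
    brightness = mid_gray ** gamma
    print(f"gamma {gamma}: 50% input -> {brightness * 100:4.1f}% brightness")
```

Lower gamma lifts the mid-tones (25% brightness at gamma 2.0 versus roughly 18% at gamma 2.5), which is exactly why it holds up better in a bright room.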

The next question you may have is the following: the gamma correction is all well and good for CRT TVs, but we haven’t had CRT TVs for decades, so how is the gamma correction still relevant?

Here’s the thing: so much television content was recorded with the gamma correction applied that we need to fake it to make it look good on a newer, digital TV.

[Image: the effect of different gamma values on a displayed TV picture]
Credit: Wikimedia Commons user Ricardo Cancho Niemietz

So what did digital television manufacturers do? They duplicated the gamma response of CRT TVs in digital TVs. Only, instead of brightness being determined by an input voltage, a given brightness level is the result of an input “code” or “color” value, which is basically a binary or hexadecimal representation of a particular shade of gray. Between the long-standing precedent of the gamma correction and the fact that early digital TVs did not have the capacity to correctly interpret the full gray- and color-scale, the need for a gamma correction persisted long after CRT TVs stopped being relevant.
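As a rough sketch of what that looks like on a digital set, here’s a simple power-law model that maps an 8-bit gray code value to a brightness level. The gamma of 2.4 and the 100-nit peak are assumptions for illustration; real TVs follow broadcast standards and their own internal processing rather than this exact formula.

```python
# Simplified model: map an 8-bit gray code value to luminance by reproducing
# a CRT-style gamma response. Gamma = 2.4 and a 100-nit peak are assumptions.

DISPLAY_GAMMA = 2.4
PEAK_NITS = 100.0

def code_to_nits(code: int, bit_depth: int = 8) -> float:
    """Approximate luminance (in nits) for a gray code value, per a power-law model."""
    max_code = (1 << bit_depth) - 1
    return PEAK_NITS * (code / max_code) ** DISPLAY_GAMMA

for code in (16, 64, 128, 192, 255):
    print(f"code {code:3d} -> {code_to_nits(code):6.2f} nits")
```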

It is only recently, with the development of cameras and TVs that can record and display in High Dynamic Range (HDR), that the CRT version of gamma is becoming a thing of the past.

One aspect of these new HDR TVs is that the range of possible luminance levels has expanded greatly compared to their non-HDR counterparts. Typical peak brightnesses for new HDR TVs are on the order of 500-1000 nits (compared to ~100 nits for regular digital TVs), and black levels are on the order of 0.005 nits (compared to ~0.5 nits for regular digital TVs). The increased light capabilities, in conjunction with more efficient and sophisticated data storage and image/video processing, have made it possible for HDR TVs to handle the metadata required to interpret and display the entirety of the grayscale visible to the human eye.
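Just to put those figures side by side, dividing peak brightness by black level gives a rough sense of how much the range expands (these are the ballpark numbers quoted above, not measurements of any particular set):

```python
# Rough range comparison using the ballpark peak/black numbers quoted above.
sdr_range = 100 / 0.5       # regular digital TV: ~200:1
hdr_range = 1000 / 0.005    # HDR TV:             ~200,000:1
print(f"regular TV: ~{sdr_range:,.0f}:1")
print(f"HDR TV:     ~{hdr_range:,.0f}:1")
```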

So what’s next? In 2016, the latest TV standard was published, documenting the specifics of the new grayscale gamma curve that TV manufacturers will aim for to achieve HDR. This curve is called the PQ-EOTF (Perceptual Quantizer Electro-Optical Transfer Function) curve, and it is basically designed to optimize visibility of the entire grayscale, based on the ability of the human eye to perceive different shades of light at any given light level.
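For the curious, here’s a sketch of that curve in Python. The constants come from the published PQ standard (SMPTE ST 2084, also carried into ITU-R BT.2100); the function itself is a simplified scalar version for illustration, not production video-processing code.

```python
# Sketch of the PQ EOTF: non-linear signal value (0.0-1.0) -> absolute luminance in nits.
# Constants are taken from the published PQ standard (SMPTE ST 2084 / ITU-R BT.2100).

M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # ~78.84
C1 = 3424 / 4096         # ~0.8359
C2 = 2413 / 4096 * 32    # ~18.85
C3 = 2392 / 4096 * 32    # ~18.69

def pq_eotf(signal: float) -> float:
    """Convert a PQ-encoded signal level to luminance in nits (cd/m^2)."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

for s in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"PQ signal {s:.2f} -> {pq_eotf(s):8.1f} nits")
```

Note that the output is in absolute nits rather than a percentage, which is one of the big conceptual shifts from the relative (0-100%) gamma grayscale described earlier.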

With the growing popularity of 4K content, broadcasters are slowly migrating towards higher-tech cameras; these cameras will also include one or more corrections that are complementary to the new PQ-EOTF curve so that this new content will look perfect on the new HDR TV sets.

As for the old content that still has the gamma correction applied, well, HDR TVs have to be placed into HDR mode before they’ll display HDR content with the new PQ-EOTF curve; otherwise, their default mode is regular standard or high definition, which is perfect for viewing old reruns of Star Trek: The Original Series.

Once TVs and cameras make the switch to recording and displaying in HDR full-time, older content can be remastered (from the original film strips) to be correctly formatted to optimize viewing in HDR. This process isn’t very difficult, as broadcast cameras have been recording more data than a TV could render for decades, and it’s only now that TV panels are capable of displaying the full breadth of color and grayscale information that these cameras were capturing all along.

In the meantime, enjoy the gamma grayscale while it’s still around, and be ready to embrace the awesomeness of the HDR PQ-EOTF once it fully arrives.

