If you’ve ever explored the menu options on your TV, chances are that you’ve seen something called “gamma” or “grayscale gamma.” “Grayscale” itself is pretty self-explanatory (it’s the difference in scale from black, to gray, to white), but what does “gamma” have to do with anything?
Let’s take a trip through time and look back to an era when televisions were huge, heavy boxes that housed cathode ray tubes (CRTs). In those days, the picture on your TV was actually the result of an electron gun shooting electrons down a vacuum tube to eventually strike tiny chemical dots (called “phosphors”) that were painted on the back of the glass TV screen. These dots were meant to light up in three colors: red, green, and blue. These red, green, and blue phosphors were the predecessors of the pixels that make up a modern-day LCD TV screen.
In the early days of TV, television manufacturers ran into a problem. Because the first televisions had images rendered only in black and white, the grayscale was very important (and, as we know, it remains important, even when pictures are shown in color). They wanted to be able to show images with depth and shadow, but without having every image look too dark or too washed out.
Back in the CRT days, the amount of light visible on the screen was related to the voltage applied. It’s like a dimmer switch for the lights in your home; as you slide the dimmer up, more voltage is applied, and the light bulb gets brighter. For ease of discussion, the luminance (or brightness) of the image was divided into increments of the total luminance, where 100% represented maximum brightness possible for that TV, and 0% represented total darkness. The rest of the light was usually described in increments of 5% or 10% (also referred to as increments of 5 or 10 IRE), and the TV grayscale was born.
Okay, you think, this all sounds perfectly logical. So what’s the problem? Bear with me for a little bit. It was eventually discovered that the relationship between the voltage applied and the light brightness was NOT linear; that is, increasing the voltage by 10% would not increase the brightness by 10%, but by an amount more or less than 10%. It turns out that the relationship between input voltage and brightness for a CRT TV is governed by a gamma law, which takes the form of:

Lout = A1 × Vin^γ
where Vin is the input voltage, A1 is a constant, γ (gamma, a Greek symbol) is a constant exponent, and Lout is the output light level. Here's what it looks like on a chart:
As a result, if you were to attempt to divide that non-linear curve into equal 10% increments (as though it were linear), from off (total darkness) to fully on (full brightness), this is what it looks like:
This is the so-called “linear” grayscale. If you look at it, this grayscale has a lot of different shades of light, and fewer shades of black. The problem is that the human eye has much more trouble differentiating between lighter shades than it does differentiating between darker shades; humans would perceive the “linear” grayscale on a TV as being too bright and washed out, with not enough subtle shading and detail.
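To make that non-linearity concrete, here's a minimal Python sketch of the gamma law above (the γ = 2.5 and A1 = 1 values are illustrative, not universal):

```python
def display_luminance(v_in, gamma=2.5, a1=1.0):
    """CRT gamma law: L_out = A1 * V_in ** gamma (illustrative constants)."""
    return a1 * v_in ** gamma

# Equal 10% steps in voltage do NOT produce equal steps in brightness:
for step in range(11):
    v = step / 10
    print(f"voltage {v:4.0%} -> brightness {display_luminance(v):6.1%}")
```

Run it and you'll see the bottom half of the voltage range gets squeezed into a tiny sliver of the brightness range, while the top half of the voltage range covers most of the brightness range.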
To address this problem, cameras built to record television content were designed to apply a “gamma correction”:

Vout = A2 × Lin^(1/γ)
Here, Lin is the light level recorded by the camera, A2 is a constant that may or may not be equal to A1, Vout is the voltage level output by the camera, and γ is identical to the gamma value in the first equation.
Let’s combine these equations. The output voltage of the television camera is equal to the input voltage received by the television, so:

Lout = A1 × (A2 × Lin^(1/γ))^γ = A1 × A2^γ × Lin = A3 × Lin
A3 is a new constant that equals A1 × A2^γ. We now see from the last equation that with the gamma correction, the input (recorded) light levels and the output (TV) light levels are linearly related to one another.
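Here's a small Python sketch of that cancellation, chaining the camera's gamma correction into the TV's gamma law (assuming A1 = A2 = 1 and γ = 2.5 for illustration):

```python
GAMMA = 2.5

def camera_encode(l_in):
    # Gamma correction applied in the camera: V_out = L_in ** (1/gamma)
    return l_in ** (1.0 / GAMMA)

def tv_decode(v_in):
    # CRT gamma law in the TV: L_out = V_in ** gamma
    return v_in ** GAMMA

# Chaining the two gives back the recorded light level,
# i.e. the overall response is linear:
for light in (0.1, 0.25, 0.5, 0.9):
    assert abs(tv_decode(camera_encode(light)) - light) < 1e-9
```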
Let’s try an example: Say that you’re receiving a signal from a TV camera that does not have the gamma correction applied; instead, it generates an output voltage that exactly matches the input light level. In this case, a 50% brightness level would correspond to 50% of the total possible applied voltage. If you take that 50% applied voltage and put it into the electron gun of your CRT TV (assuming a gamma value of 2.5 and that A1 = 1), then Lout = (0.50)^2.5 ≈ 0.18. That would create an image with only about 18% of the total brightness, which is not at all what was recorded on the camera.
Now assume that a TV camera is designed to employ the gamma correction. Assume that A2 = A1 = 1, and γ again equals 2.5. In this case, an input light level of ~17.7% yields an output voltage of 50%:

Vout = (0.177)^(1/2.5) ≈ 0.50
And that output voltage, now applied to the TV as the input voltage (Lout = (0.50)^2.5 ≈ 0.177),
creates a brightness of ~17.7%, the same level that was recorded by the TV camera. The effect of using these two gamma corrections in conjunction with one another is that the input and output light levels are linearly related to one another.
Once the gamma correction is applied, a linear relationship exists between the recorded and displayed light level, as you can see below.
Using the gamma correction creates a gamma grayscale, which is shown below. As you can see, there’s much more emphasis on the darker tones.
Throughout this example, you may be wondering why I’ve used a gamma value equal to 2.5. Both in the past and in the present, gamma values for televisions typically range from 2.0 to 2.5, and can often be altered based on the viewer’s preference or viewing environment.
With the same input voltage, lower gamma values generate brighter shades, meaning that lower gamma values are best for areas with more ambient lighting; the TV’s brightness has to be able to compete with the surrounding light, or it will appear washed out. Higher gamma values generate a wider variety of darker shades of light (as we saw), and so are best suited for dark viewing rooms, where your eyes can adjust and more readily discern the differences.
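A quick Python sketch of that trade-off (again with A1 = 1; the two gamma values are just the endpoints of the typical range):

```python
def brightness(v_in, gamma):
    return v_in ** gamma  # gamma law with A1 = 1

v = 0.5  # the same 50% input signal
print(f"gamma 2.0 -> {brightness(v, 2.0):.1%}")  # 25.0%
print(f"gamma 2.5 -> {brightness(v, 2.5):.1%}")  # 17.7%

# Lower gamma renders the same mid-tone brighter:
assert brightness(v, 2.0) > brightness(v, 2.5)
```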
The next question you may have is the following: the gamma correction is all well and good for CRT TVs, but we haven’t had CRT TVs for decades, so how is the gamma correction still relevant?
Here’s the thing: so much television content was recorded with the gamma correction applied that we need to fake it to make it look good on a newer, digital TV.
So what did digital television manufacturers do? They duplicated the gamma response of CRT TVs in digital TVs. Only, instead of relating brightness to input voltage, a given brightness level is the result of an input “code” or “color” value, which is basically a binary or hexadecimal representation of a particular shade of gray. Between the long-standing precedent of the gamma correction and the fact that early digital TVs did not have the capacity to correctly interpret the full gray- and color-scale, the need for a gamma correction persisted long after CRT TVs stopped being relevant.
It is only recently, with the development of cameras and TVs that can record and display in High Dynamic Range (HDR), that the CRT version of gamma is becoming a thing of the past.
One aspect of these new HDR TVs is that the range of possible luminance levels has expanded greatly, compared to their non-HDR counterparts. Typical peak brightnesses for new HDR TVs are on the order of 500-1000 nits (compared to ~100 nits for regular digital TVs), and black levels are on the order of 0.005 nits (compared to ~0.5 nits for regular digital TVs). The increased light capabilities, in conjunction with more efficient and sophisticated data storage and image/video processing, have made it possible for HDR TVs to handle the metadata required to interpret and display the entirety of the grayscale visible to the human eye.
So what’s next? In 2016, the latest TV standard was published, documenting the specifics of the new grayscale gamma curve that TV manufacturers will aim for to achieve HDR. This curve is called a PQ-EOTF curve, and it is basically designed to optimize visibility of the entire grayscale, based on the ability of the human eye to perceive different shades of light, at any given light level.
With the growing popularity of 4K content, broadcasters are slowly migrating towards higher-tech cameras; these cameras will also include one or more corrections that are complementary to the new PQ-EOTF curve so that this new content will look perfect on the new HDR TV sets.
As for the old content that still has the gamma correction applied: HDR TVs have to be placed into HDR mode before they can accurately display HDR content with the new PQ-EOTF curve. Otherwise, their default mode is regular standard or high definition, which is perfect for viewing old reruns of Star Trek: The Original Series.
Once TVs and cameras make the switch to recording and displaying in HDR full-time, older content can be remastered (from the original film strips) to be correctly formatted to optimize viewing in HDR. This process isn’t very difficult, as broadcast cameras have been recording more data than a TV could render for decades, and it’s only now that TV panels are capable of displaying the full breadth of color and grayscale information that these cameras were capturing all along.
But in the meantime, enjoy the gamma grayscale while it’s still around, but be ready to embrace the awesomeness of the HDR PQ-EOTF once it fully arrives.
What's Your Take?
If you're a Reviewed.com reader, you know that we love to test things. A lot of our tests are adapted from international testing standards that are designed and updated by august bodies of engineers and scientists. But you know what there's no testing standard for? Judging how much someone is going to like using a refrigerator once it's in their home.
Recently, we have been systematically revamping the testing and scoring methods for each of the main product categories. One piece of feedback that kept popping up was that people wanted more information on usability – things like ease of use, comfort, and intuitiveness.
Those ideas broadly fall under the category of “the human factor”, and answer questions like, “What is this product like to use?”, “How comfortable is it to hold/wear/watch this product?”, and “How stylish will this product look in my kitchen/living room?”
We have always scored products on their usability, but the task has historically been a difficult one – while it’s easy to translate a number from a test into a score, it’s harder to give a score to a feeling or an impression.
This problem became even more pressing as we expanded our product testing to horizons beyond tech and home appliances. We’ve started edging into the massive product categories known as “parenting,” “smart home,” and “small appliances,” and testing everything from tablets for kids, to smart security cameras, to sous vide immersion circulators, and back again.
Now that we’re testing products that do not have an established testing methodology in place, one big sticking point remains: How do we actually quantify the results of the testing? In a lot of the products we’re assessing, numbers alone cannot help us to separate the wheat from the chaff.
For example, when we were testing oven thermometers, we measured the thermometer’s ability to accurately reflect the temperature of a toaster oven, but addressing the ease of use is less straightforward. When using an oven thermometer, you want to be able to read the temperature at a distance, through an oven window. But short of conducting an eye test, there is no quantity we can measure that will allow us to instantly classify a thermometer’s visibility as “good” or “bad”. So what do we do?
Posing the question as “Can you read the numbers on the thermometer?” is too general. How close are you to the window when you read the thermometer? Can you read the small numbers and the large numbers? Are the numbers obscured by something in the oven? These are all questions unanswered by a simple “yes” or “no” to that original question.
On the other hand, framing the question as a quantity to be measured, like “Record the maximum distance you can get from the oven and still read the numbers” is problematic on its own. Depending on the person doing the testing, the location of the oven, and the ambient lighting conditions, each answer could vary greatly.
After conversing with some experts and a lot (A LOT) of trial and error, we’ve settled on a method that makes score number-crunching relatively straightforward, while also giving us lots of useful information. The idea is to have the testers use the products the way actual people would, answer very specific multiple-choice questions about the usage/performance, and then leave room for testers to expound on their answers.
As an example, here’s the actual question that our tester answered, related to the number visibility on an oven thermometer that had been sitting in a toaster oven at counter height while the temperature increased:
How easy was it to read the display on this thermometer in the oven?
1. I can’t read anything at all - the font is way too small
2. The numbers and tick marks are very difficult to read
3. It is easy to read both the major ticks and numbers, but the minor ticks/numbers are more difficult
4. It is easy to read both the major and minor ticks and numbers
There was also the option to leave additional comments or to add another possible answer to the question.
This way, we have the benefit of a number that can be turned into a score, as well as information that can help us shape the editorial content surrounding that product.
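As a rough illustration (the mapping below is hypothetical, not our actual scoring formula), an answer on a scale like the one above can be converted into a score along these lines:

```python
def answer_to_score(answer, num_choices=4, max_score=10.0):
    """Map a 1..num_choices multiple-choice answer onto a 0..max_score scale.

    This simple linear mapping is illustrative only.
    """
    if not 1 <= answer <= num_choices:
        raise ValueError("answer out of range")
    return max_score * (answer - 1) / (num_choices - 1)

print(answer_to_score(4))  # best answer  -> 10.0
print(answer_to_score(1))  # worst answer -> 0.0
```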
Gathering data with this style of questioning really helps us to put the quantitative results in context. For example, if a pair of headphones is technically perfect but extremely uncomfortable to wear for more than 20 minutes, we’re not going to recommend that product to our readers. Conversely, if a pair of headphones doesn't sound great but is super comfortable, we might recommend them for a certain type of listener, like someone who travels a lot.
At the end of the day, it’s people using products, not robots, so we think that combining the squishier human factor in with our numerical assessments is the best of both worlds.
What's Your Take?
Have you listened to your own voice on a video, over a loudspeaker, or over the echo of a Skype chat? It probably sounds pretty different in all three circumstances. The reasons for this are twofold: signal distortion and frequency response (FR).
FR is one of the most important aspects of our testing, but signal distortion also has a profound effect on how your voice sounds when played on a given device. And like all distortion, it has to do with altering something from its original form. In this case, the things being altered are the digitally rendered sound waves that live inside your music player.
Each pure tone you hear in a piece of music corresponds to a frequency. For instance, Middle C is represented by a sine wave with a frequency of 261.6 Hz, or 261.6 cycles per second.
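In code, that tone is just a sine function of time. Here's a minimal Python sketch (unit amplitude, CD-quality sample rate):

```python
import math

FREQ = 261.6         # Middle C, in Hz
SAMPLE_RATE = 44100  # samples per second (CD quality)

def middle_c(t):
    """Amplitude of a unit-amplitude Middle C sine wave at time t (seconds)."""
    return math.sin(2 * math.pi * FREQ * t)

# One full cycle lasts 1/261.6 of a second, so a one-second clip
# contains 261.6 repetitions of the same wave shape:
samples = [middle_c(n / SAMPLE_RATE) for n in range(SAMPLE_RATE)]
```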
In a recording studio, an artist or musician plays Middle C. In the following video, you will be able to hear Middle C, as well as see its equation and shape through time on a plot. Note the different time scales in each video:
When a recording artist or musician plays that note, the sine wave is one signal that gets rolled in with all of the other signals representing other notes and sounds that are being played at the same time. The sine wave is saved inside your music device, and when you click “play”, the sine wave is summoned by your headphones.
However, it turns out that your headphones cannot perfectly reproduce the original sine wave that was recorded in the studio. This occurs for two reasons: The first is that we do not live in a perfect world where signals are perfectly transmitted all the time, and the second is that the omnipresence of electronics with non-linear loads (i.e. your music device/headphones) means that the digital signal response of a device, when it draws a current, can be flawed and unpredictable.
While your headphones cannot exactly replicate that original sine wave, they do their best to reproduce it.
For example, let’s say your headphones were aiming for Middle C:
But came out with this instead:
The two sine waves don’t look terribly different, but if you listen closely to the two sound clips, you can hear the difference.
Our headphone tests quantify how your headphones distort the original music signal at each individual frequency. In SoundCheck, our headphone testing software, we measure distortion with a frequency sweep: a series of pure tones at frequencies from 20 Hz (increase your volume to hear this one):
to 10 kHz (lower your volume for this one):
Knowing the original frequency and height (= amplitude) of the pure tone, we can analyze the sine wave that is actually output by the headphones and quantify how different the two sine waves are.
So how do we analyze the resulting digital wave produced by your headphones? SoundCheck uses a method called Fast Fourier Transform, or FFT. With FFT, a wave is broken down into a finite number of sine waves with different frequencies and amplitudes. The best way to understand FFT is to imagine the process in reverse. You can take a finite number of sine waves, with known frequencies and amplitudes, and add them together to create a new wave of any shape or size. This process is known as superposition.
At a given point in time, some of those sine waves will cancel one another out to some extent by having opposing high and low values (destructive interference). At other points, some sine waves will add together by having similarly high or low values (constructive interference).
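Here's a small Python sketch of superposition, summing a fundamental and two weaker harmonics into one composite wave (the amplitudes are made up for illustration):

```python
import math

def superpose(components, t):
    """Sum of sine waves at time t; components is a list of
    (amplitude, frequency_hz) pairs."""
    return sum(a * math.sin(2 * math.pi * f * t) for a, f in components)

# A 20 Hz fundamental plus two smaller harmonics at 40 Hz and 60 Hz:
wave = [(1.0, 20.0), (0.2, 40.0), (0.1, 60.0)]

# At t = 0.0125 s (a quarter period of the fundamental), the fundamental
# peaks at +1.0, the 40 Hz harmonic passes through zero, and the 60 Hz
# harmonic sits at -0.1, so the components partially cancel:
print(superpose(wave, 0.0125))
```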
With a Fast Fourier Transform, SoundCheck takes the final digital wave and reduces it into the component sine waves. In this particular case, SoundCheck is only looking for component sine waves that represent harmonics of the original (or “fundamental”) tone.
Harmonic waves are simply waves whose frequencies are integer multiples of the fundamental frequency. For instance, if the fundamental tone has a frequency of 20 Hz (=20*1), then the first harmonic tone would have a frequency of 40 Hz (=20*2), the second harmonic tone would have a frequency of 60 Hz (=20*3), and so on, all the way up to the nth harmonic (=20*(n+1)).
Total Harmonic Distortion
There are many different types of distortion that can be measured in headphones. Total Harmonic Distortion, or THD, is the most basic kind of distortion. We'll be talking about THD in this post, but if you're curious about other kinds of distortion, check out this link for more information.
After identifying the harmonic tones that add together to create the imperfect fundamental tone, we calculate the THD using the equation below, where the THD is presented as a percentage of the original amplitude of the fundamental sine wave, and Vn represents the amplitudes of the sine waves:

THD = 100% × sqrt(V2^2 + V3^2 + … + VN^2) / V1
For n > 1, Vn is the amplitude of the (n-1)th harmonic sine wave, and for n = 1, Vn is the amplitude of the fundamental sine wave.
If the amplitudes of the harmonic sine waves are very close to the amplitude of the fundamental sine wave (high THD percentage values), then the resulting wave will be significantly distorted from the original sine wave. If, on the other hand, the amplitudes of the harmonic sine waves are very small (low THD percentage values), then the fundamental sine wave is better preserved.
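That calculation is easy to sketch in Python. The amplitudes below are made-up values, but the formula is the standard THD calculation described above:

```python
import math

def thd_percent(amplitudes):
    """THD as a percentage of the fundamental's amplitude.

    amplitudes[0] is the fundamental (V1); amplitudes[1:] are the
    harmonic amplitudes (V2, V3, ...).
    """
    fundamental, harmonics = amplitudes[0], amplitudes[1:]
    return 100.0 * math.sqrt(sum(v * v for v in harmonics)) / fundamental

# A fundamental of amplitude 1.0 with two harmonics of 0.16 and 0.12:
# sqrt(0.16^2 + 0.12^2) / 1.0 = 0.2, i.e. 20% THD.
print(thd_percent([1.0, 0.16, 0.12]))
```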
Let’s take another look at our distorted wave. In this video, you will first see the shape of the distorted version of Middle C, then the three waveforms that make up that distorted wave: the original fundamental frequency sine wave and two harmonic frequency sine waves. This waveform has a THD of 20%.
Here is a second wave that has also been distorted from the original Middle C fundamental tone. This distorted wave has a THD of 40%.
Distortion at different frequencies
The human ear tends to have an easier time hearing higher frequency tones. Consequently, the human ear can more easily perceive distortion of higher frequency tones than of lower frequency ones.
For example, listen to a sine wave with a fundamental frequency of 40 Hz (increase your volume for this one):
And here is a version of that sine wave with a THD value of 20%:
By contrast, this is a sine wave with a fundamental frequency of 1000 Hz (=1 kHz) (lower your volume for this one):
Here is a version of that sine wave with a THD value of 20%:
The distorted version of the low frequency tone is practically indistinguishable from the original fundamental tone. With the high frequency tone, the difference is immediately obvious.
Ideally, headphones would not distort the original fundamental tone at any frequency. Headphone manufacturers know that our sensitivity to distortion increases with higher frequencies, so they do their best to minimize the perceivable distortion at higher frequencies, while being somewhat less strict when it comes to THD at lower frequencies. We take this into account when we run our distortion test on a pair of headphones.
Our headphone testing
When developing our testing, we built an empirical THD curve that mimics the human ear’s sensitivity to distortion. This empirical curve (the shaded blue curve shown below) was defined by previous distortion data from 70+ products we’ve tested over the past few years. (Note the logarithmic, rather than linear scale in the chart.)
In our distortion test, we do a sweep of original fundamental tones with frequencies from 20 Hz to 10,000 Hz. SoundCheck breaks the resulting distorted signals down into their component harmonic sine waves, and calculates the THD for each individual frequency. We plot that THD data (red line), and compare it to our empirical data curve (blue shaded area).
Any part of the product’s THD curve that lies above our empirical data curve results in a point deduction from a perfect score. The more a product’s THD curve resides above our empirical data curve, the more noticeable the distortion is to the listener, resulting in a lower overall distortion score.
Conversely, if a product’s THD curve resides entirely below our empirical data curve, then the distortion score for that product is a perfect ten.
To make a long story short, the less the red line goes above the blue line, the less noticeable any distortion will be to the listener, and the better the product will score.
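Here's a rough Python sketch of that scoring logic (the penalty weighting is made up for illustration; our actual deduction formula is more involved):

```python
def distortion_score(measured_thd, allowed_thd, max_score=10.0, penalty=2.0):
    """Deduct points wherever the measured THD (red line) pokes above the
    empirical limit curve (blue area).

    Both arguments are lists of THD percentages at matching frequencies;
    the penalty factor here is illustrative.
    """
    excess = sum(max(0.0, m - a) for m, a in zip(measured_thd, allowed_thd))
    return max(0.0, max_score - penalty * excess)

# Entirely below the limit curve -> a perfect ten:
print(distortion_score([0.5, 0.8, 1.0], [1.0, 1.0, 2.0]))
# One point pokes 1% above the limit -> points come off:
print(distortion_score([2.0, 0.8, 1.0], [1.0, 1.0, 2.0]))
```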
Our best scoring products have low distortion across the entire range of frequencies. To find out more about distortion in individual headphone products, be sure to check out our ever-updating library of headphone reviews. Click on the "Test Results" tabs in individual product reviews for the greatest detail.
What's Your Take?
Have you ever wondered why your music sounds different, depending on what pair of headphones you're wearing?
For example, listening to the iconic Jaws theme will be a very different experience on a pair of Beyerdynamic over-ear headphones, compared to a pair of Beats in-ear headphones.
The most dramatic difference between two pairs of headphones, other than the style and design, is which notes they choose to enhance or downplay. This behavior is called the frequency response, and it's the most important performance metric that we assess in our headphone testing.
All headphones can be divided into two camps, frequency-wise: those trying to most closely reproduce the sounds as they were recorded, and those trying to ensure that each note sounds equally loud to the listener. But before we jump into the difference between those two ideals, let's break down the definition of "frequency response".
The frequency of any wave is defined as the number of cycles per second (aka “hertz” or “Hz”). When I say “cycles”, I refer to the fact that some waves have a structure or wave shape that repeats over and over throughout the lifespan of the wave. Here are some examples of those types of waves:
As you can see, there are many different types of waves with repeating shapes, but we're going to focus on the so-called “sinusoidal” waves ("sine" in the image above)—waves that have peaks and valleys with curving characteristics.
Because these shapes are identical and repeat throughout the wave, we can isolate one wave shape and look at how long it lasts over a period of time. If a wave repeats many times in one second, that wave is said to be “high frequency.” If a wave repeats very few times per second, it is said to be “low frequency.”
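The relationship between a wave's period (how long one cycle lasts) and its frequency is a simple reciprocal, as this tiny Python sketch shows:

```python
def frequency_hz(period_seconds):
    """Frequency is the number of repeated cycles per second."""
    return 1.0 / period_seconds

# A cycle lasting 10 milliseconds repeats 100 times per second:
print(frequency_hz(0.010))
# Shorter periods mean higher frequencies:
assert frequency_hz(0.001) > frequency_hz(0.010)
```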
In music, we deal mostly (but not exclusively) with sinusoidal waves. The note middle C—whether it is played on the clarinet, the piano, or the saxophone—is made up of waves that always have the same frequency: 261.6 Hz, or 261.6 cycles of the repeated wave structure in one second.
The "response" of a given frequency refers to its corresponding decibel level. Decibels define the intensity (or the amount of energy transmitted over time over an area) of the frequency wave carrying a given sound.
Decibels (dB) are tricky to understand because they're logarithmic in nature. That means an increase of 10 dB (say, from 25 dB to 35 dB) represents a ten-fold increase in intensity. More confusingly, the intensity of a frequency wave doesn't directly translate into the "volume" that we hear in our ears. In order to change the volume setting on your radio from "1" to "2", the radio has to increase the intensity of that frequency wave by 6-10 dB.
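The decibel arithmetic is easier to see in code. Here's a quick Python sketch of the intensity-to-decibel relationship:

```python
import math

def db_change(intensity_ratio):
    """Decibel change corresponding to a given ratio of intensities."""
    return 10.0 * math.log10(intensity_ratio)

def intensity_ratio(db):
    """Intensity ratio corresponding to a given decibel change."""
    return 10.0 ** (db / 10.0)

print(db_change(10.0))       # a ten-fold intensity jump is +10 dB
print(intensity_ratio(6.0))  # +6 dB is roughly a 4x jump in intensity
```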
The strange relationship between numerical decibels and the loudness (i.e. the volume) was first quantified by Harvey Fletcher and Wilden A. Munson, in the early 1930s. They asked volunteers to listen to single notes at a range of frequencies and intensities. At each frequency, they adjusted the intensity of the wave until the listeners believed that the loudness matched that of a reference frequency wave with a set decibel level.
In this way, Fletcher and Munson built empirical curves that represented a constant loudness, as perceived by the human ear and defined by a range of frequencies and decibel values. These curves, which have since been revised and updated, are called "equal loudness" (EL) curves. Today, they're a key part of our frequency response testing.
Sounds that have lower pitches have low frequencies. The rumble of traffic or the thrum of an air-conditioning unit are sound waves with fewer cycles per second and lower dB values. They're difficult to perceive because, with fewer cycles, the wave shape (and the sound produced) change very gradually, making it more difficult to observe and recognize the wave.
Conversely, sounds with high pitches have high frequencies. Fire alarms and nails on a chalkboard create sounds that have high frequencies and high dB values. High frequency sounds are more easily distinguished by the human ear because many cycles occur in one second; our ears pick up on these rapidly changing waves more readily than on waves with lower frequencies.
With all of this background information, it is perhaps unsurprising that when we discuss the “lows,” “mids,” and “highs” of a piece of music, we are referring to the frequencies of the sound waves. A single note represents a sound wave with a specific frequency; a note with a low frequency of ~100 Hz may be produced by a bass drum, a note with a mid frequency of ~1,000 Hz may be made by an alto saxophone, and a note with a high frequency of ~10,000 Hz may be made by a piccolo.
Our Frequency Response Test
In our frequency response test, we use SoundCheck software to send a range of sound waves with known frequencies through headphones and measure the decibel level the headphones reproduce. When we look at the data for the frequency response test, we're plotting frequency vs. response in decibels.
In general, headphone manufacturers are trying to match one of two popular frequency profiles: the “consumer” response or the “studio” response.
The consumer response is defined by the equal loudness curves mentioned above, which attempt to achieve a constant loudness across a range of frequencies. Because it is easier for us to hear mid- and high-frequency notes than low-frequency notes, EL curves increase the intensity of lower-frequency waves and limit the intensity of mid- and high-frequency waves. With this type of frequency response, you'll hear the hum of an upright bass at the same loudness as the blare of a trumpet.
We call this the “consumer” frequency response because most headphones on the market aim for this frequency profile. It allows you to hear all parts of your music with equal emphasis on the different sounds, vocals, and instruments.
The studio frequency response is the opposite: the manufacturer seeks to reproduce all of the notes, both at high and low frequencies, at the same decibel level.
This doesn't correspond to equal loudness as perceived by the human ear; achieving that requires a curve, as you saw earlier. On a plot of frequency vs. response in decibels, the ideal studio response is a flat horizontal line at a given decibel level. This is appealing to listeners who want to hear the music as it was mixed in the studio: there's no artificial emphasis or de-emphasis on certain notes.
We call this the studio response because these headphones are often used in a studio setting, where producers can clearly hear the sound as it was recorded, and then choose to boost or reduce certain frequencies to achieve the exact sound they want.
The "consumer" and "studio" profiles represent two distinct philosophies, so when we test a pair of headphones, we look at the frequency response and judge it based on which of the two frequency profiles it was seeking to reproduce. As most headphones on the market have a consumer profile, it would be unfair to compare that rapidly changing sound to a studio profile. Similarly, a flat studio profile would be a poor match for an ideal EL curve. Some companies will aim to produce their own "signature sound," but those tend not to deviate very far from either of the two previously mentioned philosophies.
Using our methodology, each pair of headphones is scored on how well it actually mimics the frequency profile that it most closely resembles. The reviewers can then take this information and tell you categorically that, with a given pair of headphones, the violins in the shower scene of Psycho are so muted that all sense of suspense is lost, or that the roar of the waterfall at the dam in The Fugitive will be so loud that you'll feel like you're being pursued by Tommy Lee Jones.
Only you can decide which sound profile you prefer, and that desire has to be balanced with the price tag and the look of a given pair of headphones. For the products that top our list, be sure to check out the Best Right Now: Headphones article, or our constantly updated library of headphone reviews.
What's Your Take?
Ever wonder how Reviewed.com tests all those cameras? Find out in this video detailing the process. It stars TJ Donegan, Editor in Chief of Imaging and Mobile, along with yours truly. The process is really quite amazing. We collect tens of thousands of data points to cover all important aspects of each camera. Check out our methods and labs to see just how we do it.
What's Your Take?
The video team put together this great video about how we test refrigerators, starring Keith Barry, Editor-in-Chief of Home Appliances and yours truly. If you ever had a question about how we test refrigerators, I highly recommend checking it out!
What's Your Take?
Have you ever tried watching a VHS tape on your HD TV? How about a standard-def cable channel? If so, you've probably noticed just how awful video looks when it's not played back in its native resolution.
It's not a problem that will go away any time soon. Even the latest disc-based technology, Blu-ray, can only provide up to Full-HD resolution (1920 x 1080 pixels). That's only a quarter of the pixels the latest generation of 4K UHD televisions are capable of displaying. 4K TVs are likely to hit the mainstream before native 4K content is widely available, so what can you do to prevent your collection of full-HD content from looking like crap on your shiny new UHD set?
Well, you have two options. You can display the content at its native resolution (i.e., only using a quarter of the screen), or you can upscale the image from HD to UHD. No one in their right mind would choose the first option—it simply defeats the purpose of buying a big, high-res TV. Upscaling is the short-term future of UHD, so here's an in-depth look into the science behind it.
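The pixel math behind upscaling can be shown with the simplest possible method, nearest-neighbor scaling, where each HD pixel becomes a 2×2 block of identical UHD pixels. Real TV scalers use far more sophisticated interpolation, so this is only a minimal sketch of the concept; note that doubling each dimension is exactly why 1920×1080 is a quarter of 3840×2160.

```python
# Minimal sketch of 2x nearest-neighbor upscaling: every pixel in
# the source image is duplicated into a 2x2 block. Actual upscalers
# use bilinear, bicubic, or edge-directed interpolation, but the
# resolution arithmetic is the same either way.

def upscale_2x(image):
    """Duplicate every pixel into a 2x2 block (nearest neighbor)."""
    out = []
    for row in image:
        wide = [p for p in row for _ in (0, 1)]  # double each column
        out.append(wide)
        out.append(list(wide))                   # double each row
    return out

hd = [[10, 20],
      [30, 40]]
uhd = upscale_2x(hd)
# uhd is a 4x4 grid:
# [[10, 10, 20, 20],
#  [10, 10, 20, 20],
#  [30, 30, 40, 40],
#  [30, 30, 40, 40]]
```

Nearest-neighbor keeps edges sharp but blocky; smarter algorithms blend neighboring pixels to hide the fact that three out of every four UHD pixels are invented.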
OLED is the latest and, some might argue, greatest technology to hit televisions since the introduction of LED-backlit LCDs. In fact, it appears that OLED technology is poised to do to LED televisions what LED did to LCD: eclipse them entirely.
But how different are OLED panels from LED? As it turns out, the only thing these two technologies have in common is three letters.
In a previous article, our man Tyler investigated the efficacy of various DIY laundry detergents. What he found was that none of them performed as well as the store-brand or our standardized testing detergent. But that was to be expected, since they lacked one very important component: enzymes. In this Science Blog post, we rectify that oversight by making our own enzymatic laundry detergent, and find out whether it's worth the effort.
Though the days of cooking, mountains of dishes, and hours of food coma gave you doubts, you made it through Thanksgiving. If your Thanksgiving is like mine, your family of vultures only made a tiny dent in the bounty, leaving you with the best part of all: leftovers. Some things, like chili, pasta sauce, and fine wine, are known to taste better as they age, but how does turkey fare? To answer that question, we delve into the science to see how cooked poultry spoils, and how long you can safely brown-bag those homemade turkey sandwiches.
Cooking may have been our first great leap forward as humans. Besides killing off harmful bacteria, cooking our food made nutrients easier to absorb, which allowed our brains—our number one consumer of energy—to considerably increase in size. Besides the safety and nutritional aspects, cooking has one more major advantage: It makes food taste much better. Most foods, particularly meat, undergo a series of chemical reactions when exposed to certain temperature ranges, resulting in the synthesis of countless compounds that make life taste sweeter.
Look at any dish detergent, and it'll probably boast its ability to "cut through grease." To dispose of grease and oil, detergents use surfactants—chemicals that make oil and water miscible, so they can be rinsed out. But surfactants are at best half the story in modern washing machine and dishwasher detergents, and definitely the less interesting half at that. To break down organic stains, most modern detergents recruit nature’s nano-machines to help: enzymes.
You've probably never had blood, sweat, cocoa, wine, and motor oil on your clothes at the same time—unless you happened to be present at the 1988 Nakatomi Christmas party. However, it's certainly not unreasonable to expect five of the toughest and most persistent stains to occasionally show up in your hamper. After all, we do encounter stains like these all the time. Since each of these substances represents a different genre of stain, we use fabric strips with each stain to evaluate washer performance.
Picture a crystal. Unless you're a television manufacturer, you're probably imagining something like the image above (maybe we did introduce a little bias there). Generally, we think of crystals as clear, stony, possibly valuable, and—most of all—solid. So solid, in fact, that crystals can be some of the hardest substances on earth. While most people understand the meaning behind the LCD acronym—liquid crystal display—and the words therein, surprisingly few notice the apparent paradox: How can something be liquid and crystal at the same time?
NASA has fielded a lot of criticism in recent years. With no Space Race to transfix our nation (with all our distractions today, could it if there were one?), a significant portion of the American public doesn't know what NASA's contemporary activities are, criticizing the agency as an extremely expensive toy for America's scientists. But while NASA's activities can seem pretty obscure and far removed from the general public, they probably do more for you than you realize.
Since televisions became commercially available in the ‘20s, picture quality has steadily improved. But as flat-CRT, Plasma, LCD, and LED technology sharpened our displays, there has always been technological progress to make. Color, black levels, brightness, contrast, sharpness, resolution, viewing angle, motion—TVs have a lot to do, and they can’t seem to do it all. But last week, our evaluation of the new Samsung OLED technology showed something we never expected: perfect black levels. This is a conquered test, and it has some awesome implications.
There's an overwhelming amount of scientific evidence that tumble-drying does irreparable damage to clothes. Such is the price of convenience and a lack of clotheslines. But how exactly does a dryer hurt your clothes? There are three distinct ways a dryer can hurt fabric: shrinkage, color transfer, and actual tears to the material. To fully understand how a dryer can negatively affect clothing, we researched several scientific studies detailing these effects of tumble-drying.
If you can recall your high school physics or chemistry lessons, you may remember learning about the three phases of matter: solid, liquid, and gas. But there's actually a fourth fundamental phase of matter, plasma—a word you might recognize from your TV. Though it might sound like a silly marketing term, plasma TVs actually use this obscure fourth phase of matter to produce display panels with incredible contrast and color, panels that often outperform their LCD and LED counterparts.
People have been blaming the government for poor washer performance for quite a while now. In the ‘90s, top-loading washers reigned supreme as relatively mature technology until the Department of Energy’s efficiency standards took them to task. Though they were popular with consumers, the top-loaders guzzled water like Hummers and Suburbans went through gas. To meet these new standards, the washer industry began to focus on front-loaders, which generally use less water since they don’t need to fill up the entire cavity to clean. The golden age of top-loaders was over.
In the audio community, you sometimes hear about people with a set of golden ears who can judge frequencies and sound quality with machine-like precision. But here at Reviewed.com, we think it's sub-optimal to hope for machine-like precision when you can have actual machine precision. It's always a good idea to calibrate the scoring and criteria using expert opinions, but for the actual testing? We'd rather avoid opinions and instead have a system with accuracy and repeatability. So instead of our own ears, we use a special pair on "HATS."
Everyone knows how fast technology evolves. In the '60s, Gordon Moore coined "Moore's Law," declaring that processing power would double every year. He thought that pace would last ten years, but it has shown no signs of letting up. While speedy evolution is great, it can make it a challenge to review products in a context that makes sense. After all, that "context" is a moving target. And though it's not always that tricky to identify the criteria for scoring or the lab tests, navigating scores in a mercurial climate presents certain challenges. But fortunately, some elegant math has given us a fantastic tool to deal with these scoring challenges: the s-curve.
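The s-curve idea can be sketched with a logistic function. The parameters below are illustrative, not Reviewed.com's actual constants: raw measurements near the midpoint land on the steep part of the curve, where small differences matter, while the flat tails keep record-setting (or dismal) results from blowing out a bounded 0–10 scale. As the state of the art advances, the midpoint can simply be re-centered.

```python
import math

# Hypothetical s-curve scoring sketch. A logistic curve maps an
# unbounded raw measurement onto a bounded score: scores rise
# steeply around the chosen midpoint and flatten at the extremes,
# so a freakishly good result can't dominate the scale.

def s_curve_score(raw, midpoint, steepness, max_score=10.0):
    """Map a raw measurement to a bounded score via a logistic curve."""
    return max_score / (1.0 + math.exp(-steepness * (raw - midpoint)))

# Example: contrast ratios scored with an assumed midpoint of 3000:1
for contrast in (1000, 3000, 10000):
    print(round(s_curve_score(contrast, midpoint=3000, steepness=0.001), 2))
```

A product exactly at the midpoint scores 5.0; tripling its contrast pushes it toward, but never past, 10.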
You’re probably familiar with the light-emitting diode (LED) as the light source that isn’t a flame, Edison bulb, fluorescent tube, or firefly (yes, there are more, of course). Besides being responsible for so many inventions we love (displays, remote controls, high-speed internet, and green traffic lights), the LED has been getting increased press thanks to new uses in household lighting, bendable screens, and extremely thin televisions. As its name suggests, the LED is a special type of diode. So how does it work?
A washer's job may not be easy, but its primary goal is simple: It just has to remove stains. So to test any given washer's performance, we have to run stained fabric through the machine and see how much of it is removed. That may seem like a straightforward process, but the scientific techniques involved are quite complex.
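One common way to quantify "how much of a stain is removed" is a reflectance-recovery index. This is a hedged sketch, not necessarily Reviewed.com's exact formula: compare how much of the reflectance lost to the stain is recovered after washing, relative to clean fabric.

```python
# Hypothetical stain-removal index. Reflectance values are assumed
# 0-100 readings (spectrophotometer-style): clean fabric reflects
# the most, a fresh stain darkens the swatch, and washing recovers
# some fraction of the lost reflectance.

def stain_removal_index(clean, stained, washed):
    """Percent of reflectance lost to the stain that the wash recovered."""
    if clean == stained:
        return 100.0  # nothing was lost to begin with
    return 100.0 * (washed - stained) / (clean - stained)

# A wine swatch: clean fabric reads 90, freshly stained 30, washed 75
print(stain_removal_index(90, 30, 75))  # 75.0
```

An index of 100 would mean the swatch washed back to like-new; 0 would mean the cycle accomplished nothing at all.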
Human beings may be the only animals capable of reasoning, but we still suffer from the occasional gap in our logic. Since our job at Reviewed.com is to critically evaluate products, it’s necessary to constantly remind ourselves of some widespread errors in human thinking so we can avoid these pitfalls. There’s nothing like a long list of common cognitive biases, fallacies, and misconceptions to crush some misplaced hubris in your decision making or opinions. Even without having to navigate a sea of marketing propaganda, the conscientious consumer still has some hard decisions to make. So we’ve decided to illuminate some common errors in thinking to help you make the big moves.
Every July, around twenty teams of nine skinny men take to the roads of France for professional cycling's most prestigious Grand Tour. During the 21-stage Tour de France, riders ascend 70,000 feet of massive Alps and Pyrenees and tame around 2,000 miles of pavement in their quest for Yellow.
As you might imagine, it takes a lot of effort and energy to get through those three weeks. These guys work hard. But how much energy are they putting out exactly? How long could they power an average home? Using 2012 champion Sir Bradley Wiggins's estimated average power numbers and a lower average for the rest of the peloton, we've done a few calculations.
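The back-of-the-envelope version of that calculation is just energy = power × time. The wattage and saddle time below are illustrative assumptions, not the article's exact figures: roughly 90 hours of riding at an average output in the neighborhood of 300 W for a top rider.

```python
# Rough Tour-de-France energy math with assumed (not measured) inputs.
AVG_POWER_W = 300       # assumed average rider output, watts
RIDING_HOURS = 90       # assumed total saddle time over 21 stages

energy_kwh = AVG_POWER_W * RIDING_HOURS / 1000  # watt-hours -> kWh
print(energy_kwh)  # 27.0

# A typical US home draws on the order of 30 kWh per day, so three
# weeks of race-winning effort powers a house for roughly one day.
HOME_KWH_PER_DAY = 30
print(round(energy_kwh / HOME_KWH_PER_DAY, 2))  # 0.9
```

The humbling punchline: an entire Grand Tour's worth of elite human output barely runs a household for a day.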
When Reviewed.com tests digital cameras, it's straightforward enough to count frames per second, measure color accuracy, examine noise, and grade sharpness. In our controlled lab environment, direct measurements and the images on the card tell most of those stories. But for the camera's vaguer characteristics, like image stabilization, we had to get a bit creative to put them to the test.