Tech sourced from robot vacuums may have solved mobile photography's biggest problem.
If you've ever used a smartphone to shoot late-night photos, you've probably experienced the frustration of checking your shots the next day only to find a blurry mess. Your memory may be a little fuzzy, but that doesn't mean your photos have to be.
The new LG G3 aims to fix that, the way any five-year-old would: with freakin' lasers. To be more precise, LG has included the same kind of laser rangefinding tech found in police lidar speed guns.
LG first explored using lasers to judge distance with its Roomba-style robot vacuums, which used the beams to figure out where your filth ends and the walls begin. Though the tech never made it into a production vacuum cleaner, LG's mobile division saw its potential and eagerly snapped it up.
Autofocus is a relatively recent invention, but the tech behind it is really quite simple. The basic idea is this: A camera doesn't know what you're trying to take a photo of, so it can't tell on its own whether the subject is sharp. An autofocus system picks out likely subjects in your scene and tries to figure out how far away they are. Once the camera has that data, it can tell the lens to focus on the most likely one.
Cameras do that in one of two ways. Traditional DSLRs and high-end mirrorless cameras use a thing called "phase-detect" autofocus, which takes the incoming image and splits it in two. If the two images the camera sees are different, it knows the shot is out of focus. The big benefit of this method is that the camera knows exactly how different the two versions are, and can shift focus to the correct point in one swift movement.
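You can sketch that offset measurement in a few lines of Python. This is a toy illustration, not how any camera's firmware actually works: we treat the two split images as 1-D intensity profiles and cross-correlate them to find the shift between them. The function name and test signal are our own inventions.

```python
import numpy as np

def phase_offset(left: np.ndarray, right: np.ndarray) -> int:
    """Estimate how far `right` is shifted relative to `left`.

    Zero means the subject is in focus; the sign and size of the offset
    tell the camera which way, and how far, to drive the lens.
    """
    # Cross-correlate the two 1-D intensity profiles at every lag.
    corr = np.correlate(right - right.mean(), left - left.mean(), mode="full")
    # The peak's distance from zero lag is the phase offset.
    return int(np.argmax(corr)) - (len(left) - 1)

# A toy edge profile, and the same profile shifted right by 5 pixels.
scene = np.zeros(100)
scene[40:60] = 1.0
shifted = np.roll(scene, 5)

print(phase_offset(scene, shifted))  # 5
```

The key point is that one correlation gives both direction and magnitude, which is why phase-detect systems can snap to focus in a single lens movement.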
Unfortunately, this method requires either a big, bulky mirror or a cutting-edge (expensive) image sensor, so only one smartphone thus far has used it: Samsung's Galaxy S5.
Most point-and-shoots and smartphones use "contrast-detect" autofocus. This method relies on a simple fact: as a subject comes into focus, its contrast—the difference between light and dark areas—increases. The approach works extremely well in bright light and with stationary subjects, but it's slower by design. The camera has to move the lens around to check contrast levels, and even once it knows it's headed in the right direction, it still has to overshoot the point of focus before backtracking to the point of peak contrast. And in low light, there's far less contrast to track in the first place.
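That hunt-and-backtrack routine is just hill climbing, and a short Python sketch shows why it takes so many lens moves. Everything here is made up for illustration—the contrast function is a stand-in for real image measurements, and real firmware is far more sophisticated:

```python
def contrast(lens_pos: float, subject_pos: float = 4.2) -> float:
    """Stand-in contrast metric that peaks when the lens is on the subject.
    A real camera would measure contrast from the live image instead."""
    return 1.0 / (1.0 + (lens_pos - subject_pos) ** 2)

def contrast_detect_af(start: float = 0.0, step: float = 0.5) -> float:
    """Hill-climb: step the lens until contrast falls, then back up one step."""
    pos, best = start, contrast(start)
    if contrast(pos + step) < best:  # probe one step to pick a direction
        step = -step
    while True:
        nxt = contrast(pos + step)
        if nxt < best:    # overshot the peak...
            return pos    # ...so backtrack to the best position found
        pos, best = pos + step, nxt

print(contrast_detect_af())  # 4.0
```

Note that the lens settles at 4.0 rather than the true subject position of 4.2—the step size limits precision, which is why real cameras finish with smaller refinement steps. Every step requires moving physical glass and reading out a frame, which is where the sluggishness comes from.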
LG's method will allow the G3's 13-megapixel camera to instantly know exactly how far away objects are.
It does this by emitting a laser beam that passes through a prism, outputting a cone of laser light that bounces off of objects and back towards the camera. The camera can figure out how far away subjects are by how long it takes the light to return to the phone. With that information, the camera can focus more or less instantly, with LG claiming focus speeds as fast as 276 milliseconds.
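The distance math itself is trivial. Assuming a straightforward time-of-flight reading of LG's description (the company hasn't published the module's internals), it's one line: light travels out and back, so distance is the speed of light times the round-trip time, halved.

```python
C = 299_792_458  # speed of light in m/s

def distance_m(round_trip_s: float) -> float:
    """The pulse travels out and back, so halve the round trip."""
    return C * round_trip_s / 2.0

# A subject 3 m away returns the pulse in roughly 20 nanoseconds.
t = 2 * 3.0 / C
print(distance_m(t))  # 3.0
```

Those nanosecond-scale round trips are the hard part—the engineering challenge is in the timing circuitry, not the arithmetic.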
Of course, there are still drawbacks. Objects at the edge of the frame, highly reflective or transparent surfaces, and faraway landscapes are all likely to defeat the laser. For that reason, we'd hazard a guess that the G3's camera also falls back on contrast readings to fine-tune focus in a pinch.
We'll know more once we get a G3 into our labs for a full test, but in the meantime, the five-year-old in all of us is celebrating.
Via: The Verge