Samsung's new flagship sports some unusual features that could make it the best smartphone camera yet.
One of the main points of comparison for shoppers looking at new smartphones is camera quality. So it's no surprise that Samsung has loaded its new flagship, the Galaxy S5, with tons of new features—some of them borrowed from high-end cameras.
Though we haven’t gotten a Galaxy S5 into our own labs, serial tech eviscerators Chipworks have already torn one to bits in the name of in-depth analysis. What they’ve found could put the Galaxy S5 well ahead of the competition when it comes to image quality.
The Galaxy S5 uses a 16-megapixel, 1/2.6-inch image sensor that's bigger than the ones found in some other smartphones—including the iPhone 5S—but slightly smaller than the 1/2.3-inch sensors we see in most point-and-shoot cameras.
But the most intriguing thing they've found so far is phase-detection autofocus points baked right into the sensor. On-sensor phase detection isn’t anything new in digital photography, but this is the first time we've ever seen it in a smartphone.
Phase-detection autofocus (PDAF) works by splitting the incoming image, in this case using pairs of neighboring pixels on the sensor. Comparing the two views lets the camera figure out exactly how out of focus a subject is and which way the lens needs to move, helping it home in on subjects and track movement almost instantly. When it comes to focusing on objects in motion, it's vastly superior to the contrast-detection autofocus used by virtually every other phone. The S5 almost certainly uses a blend of both techniques for fast, accurate focus acquisition.
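To see why a single measurement is enough, here's a toy sketch in Python. This is our own illustration, not Samsung's actual algorithm: the two views of an out-of-focus subject (here, a simple one-dimensional pulse) are displaced relative to each other, and the size and sign of that displacement tell the lens how far to move and in which direction.

```python
def pulse(center, n=40, half_width=8):
    """A simple 1-D triangular 'subject' used as a stand-in for image data."""
    return [max(0.0, half_width - abs(i - center)) for i in range(n)]

def best_shift(left, right, max_shift=10):
    """Find the displacement that best aligns the two views,
    by picking the shift with the highest cross-correlation."""
    best_s, best_score = 0, float("-inf")
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        score = 0.0
        for i in range(n):
            j = i + s
            if 0 <= j < n:
                score += left[i] * right[j]
        if score > best_score:
            best_score, best_s = score, s
    return best_s

# Simulate defocus: the two half-aperture views see the same subject,
# displaced by 4 pixels (a hypothetical amount chosen for illustration).
left_view = pulse(center=20)
right_view = pulse(center=24)

shift = best_shift(left_view, right_view)
print(shift)  # sign says which way to drive the lens; magnitude says how far
```

Contrast detection, by comparison, has no such measurement: it must nudge the lens repeatedly and check whether sharpness improved each time, which is why it hunts on moving subjects.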
Though Samsung mentioned its PDAF-enabled sensor in the February product announcement, the company made a much bigger deal out of the camera's ISOCELL technology. ISOCELL is a new way of structuring the electronics of a digital image sensor to produce less noise while improving low-light sensitivity.
In any digital camera, light passes through the lens, hits the sensor, and is recorded by individual pixels. The area in each pixel where light is captured is usually referred to as a well.
Picture a giant football stadium with a retractable roof over the field. If you put a bucket on every square foot of the field and opened the roof during a rainstorm, the buckets would fill up. The same thing happens in a camera when you open the shutter: light enters and fills up the "buckets."
The problem is that, like a leaky bucket, conventional sensors tend to let light bleed from one pixel to another. ISOCELL uses a combination of technologies—vertical transfer gates (VTG) and front deep trench isolation (F-DTI), which are a little more complicated than we can explain here—to plug the leaks.
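To make the leaky-bucket idea concrete, here's a toy numerical sketch in Python. This is our own simplified model, not Samsung's actual sensor physics: each well leaks a fixed fraction of its charge into its immediate neighbors, so a highlight that should register in one pixel smears into the pixels beside it. Set the leak to zero, which is what isolation aims to approximate, and the reading stays sharp.

```python
def read_out(wells, leak_fraction):
    """Simulate readout of a 1-D row of pixel wells, where each well
    spills `leak_fraction` of its charge evenly to its two neighbors.
    (Edge wells keep the share that has nowhere to go.)"""
    n = len(wells)
    out = [0.0] * n
    for i, charge in enumerate(wells):
        spill = charge * leak_fraction / 2.0
        out[i] += charge - 2.0 * spill
        out[i - 1 if i > 0 else i] += spill      # leak to the left
        out[i + 1 if i < n - 1 else i] += spill  # leak to the right
    return out

scene = [0.0, 0.0, 0.0, 100.0, 0.0, 0.0, 0.0]  # one bright highlight

leaky = read_out(scene, leak_fraction=0.2)     # conventional, crosstalk-prone
isolated = read_out(scene, leak_fraction=0.0)  # idealized, fully isolated wells

print(leaky)     # the highlight bleeds into neighboring pixels
print(isolated)  # the highlight stays put
```

Note that the leaky readout loses no light overall; it just records it in the wrong pixels, which shows up as softness and color noise rather than darkness.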
According to Chipworks, which routinely does this kind of deep analysis, F-DTI was used in the original HTC One's "UltraPixel" sensor, while VTG technology is used in some point-and-shoots like Sony's Cyber-shot WX100.
Right now, we still don't know how well these technologies will perform in the real world. All we can do is take Samsung at its word and wait to see the Galaxy S5 for ourselves. With the launch coming up very soon, we'll have a full performance test as soon as possible. In the meantime, you can keep up to date on Chipworks' other findings via its continuously updated Galaxy S5 blog post.
Images: Samsung Mobile, Chipworks