Many hands make light work.
How many Apple engineers does it take to screw in a lightbulb? Who knows. But the camera in your iPhone is the product of a team of over 800 people, according to a report by CBS's 60 Minutes.
Why does it take so many engineers, developers, and assorted staff to perfect a seemingly simple smartphone camera? Because with every shot, the iPhone makes about 24 billion calculations to ensure things like the white balance, autofocus, optical stabilization, and autoexposure produce a usable—and sometimes even spectacular—photo.
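White balance is a good example of the kind of adjustment buried in those calculations. Apple's actual pipeline is proprietary and vastly more sophisticated, but the basic idea can be sketched with the classic gray-world assumption: scale each color channel so the image's average color comes out neutral. Everything below (the function name, the sample pixel values) is purely illustrative.

```python
def gray_world_balance(pixels):
    """Gray-world white balance: scale each channel so the
    average color of the image becomes a neutral gray.

    pixels: list of (r, g, b) tuples with float values.
    Returns a new list of corrected (r, g, b) tuples.
    """
    n = len(pixels)
    # Average intensity per channel
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]
    # Target neutral level: mean of the three channel averages
    gray = sum(avg) / 3
    # Per-channel gain that maps each average onto that neutral level
    gains = [gray / a if a else 1.0 for a in avg]
    return [tuple(p[c] * gains[c] for c in range(3)) for p in pixels]

# A tiny "image" with a warm (reddish) cast
warm = [(200.0, 150.0, 100.0), (180.0, 140.0, 90.0)]
balanced = gray_world_balance(warm)
```

After correction, the red, green, and blue averages are equal, so the overall cast is gone. A real phone camera does something far cleverer, weighting the estimate by scene content and sensor data, and it does it alongside dozens of other corrections on every single shot.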
Most people think hardware is what separates a good camera from a bad one, but when it comes to small-sensor shooters like smartphones, software is the secret sauce that makes taking a great photo as easy as pressing a button. In our own side-by-side comparisons we've been consistently impressed by just how good Apple's camera software is.
In many ways, that software actually props up the iPhone's hardware, which has lagged behind top rivals from Samsung, Sony, and LG in recent generations. Though those companies surely pour just as much money and effort into their own software, the iPhone has remained the most user-friendly and has continued to pump out top-tier photos. In essence, it's doing more with less.
It's a point that we emphasized in our review of the iPhone 6S, which we put through the same tests we use on pro-grade DSLRs. For instance, we simulate scenes with dim light or difficult lighting sources, such as the green-tinted fluorescents in your office or the warm orange glow of incandescent bulbs in your home.
Apple's cameras do exceptionally well in these circumstances—it's something the company's engineers clearly pay extra attention to. The 60 Minutes segment offers a look inside Apple's camera lab, where engineers use an LED light array that can simulate a wide range of color temperatures and light levels.
It's not unlike the lighting rig we use in our own camera testing lab. This kind of carefully controlled testing is what lets engineers pinpoint exactly where camera software falls short and, in Apple's case, fix the problems.