2017 is the year of dual-camera phones, but the best cameras are still single

The current craze for dual-camera smartphones was predictable as early as the spring of last year. At the time, only LG and Huawei had added a second lens and sensor to the rear of their phones, but it felt obvious even then that the technology was going to take over. The interesting thing I’m noticing this year is that even as dual-camera systems become more numerous, the phones with the best image quality still have a conventional single camera on the back. That’s liable to change with time, but for now the second camera’s benefits seem to come at the expense of outright image quality.

When I say there’s a dual-camera craze, I mean it’s harder to find a 2017 flagship phone without the extra lens than with. Andy Rubin’s Essential Phone has two cameras on its rear, and so do the Asus ZenFone 3 Zoom and upcoming ZenFone 4, Huawei P10 and P10 Plus, LG G6 and V20, and the OnePlus 5 and its close cousin the Oppo R11. Motorola’s brand new Z2 Force is joining all of the above with its own dual-camera setup, and Samsung will soon be a member of the club too with its upcoming Note 8. A second camera is an easy thing to sell to people, especially after Apple embraced the idea with its iPhone 7 Plus.

But the intrigue for me lies in the absentees from the dual-camera list, because those devices coincide exactly with my favorite phones for mobile photography.

The Google Pixel has been a revolutionary device for mobile imaging because of Google’s shockingly good image-processing algorithms. Where I previously thought that hardware like optics and a high-quality image sensor were the only things that could meaningfully advance picture quality, Google showed that a lot of clever math can result in sharpness and low-light performance leaps ahead of the competition. In another surprising twist, HTC took over from the Pixel this summer with its even better camera (in my judgment and that of DxOMark) on the HTC U11. Samsung simply iterated on its already excellent camera with the Galaxy S8 to take the third spot in my current ranking of best mobile cameras. None of those phones have a supplementary rear camera.
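To give a rough sense of what that “clever math” involves: Google’s approach is built around merging a burst of frames into a single photo. The toy Swift sketch below illustrates only the underlying principle, that averaging several noisy readings of the same scene suppresses noise; it is not Google’s actual pipeline, and the numbers and function names are invented for the example.

```swift
import Foundation

// Toy illustration of the core idea behind multi-frame computational
// photography: averaging several noisy readings of the same pixel reduces
// noise roughly in proportion to the square root of the frame count.
// This is NOT Google's HDR+ pipeline, just the principle behind it.

/// Simulate one noisy sensor reading of a pixel whose true value is `signal`.
func noisyReading(signal: Double, noiseAmplitude: Double) -> Double {
    return signal + Double.random(in: -noiseAmplitude...noiseAmplitude)
}

/// Merge a burst of frames by simple averaging. Real pipelines also align
/// and weight the frames, which is where the genuinely hard math lives.
func mergeBurst(signal: Double, frames: Int, noiseAmplitude: Double) -> Double {
    let readings = (0..<frames).map { _ in
        noisyReading(signal: signal, noiseAmplitude: noiseAmplitude)
    }
    return readings.reduce(0, +) / Double(frames)
}

let trueValue = 100.0
let singleShot = mergeBurst(signal: trueValue, frames: 1, noiseAmplitude: 20)
let burstOfNine = mergeBurst(signal: trueValue, frames: 9, noiseAmplitude: 20)

print("single-frame error: \(abs(singleShot - trueValue))")
print("9-frame merge error: \(abs(burstOfNine - trueValue))")
```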

An unusual aspect of my 2017 cameraphone ranking is that, for the first time in a long time, the iPhone doesn’t figure in the top three. I know it’s an unfair fight, given that the newest iPhone model is older than any of the competitors I rate higher, but this is a new phenomenon because the iPhone used to win unfair comparisons. The iPhone’s camera has been the standard-setter for most of this decade because Apple has made it a priority, invested heavily, and put a massive team of 800 people to work on it. But in 2016, the iPhone 7’s image quality improvements were negligible, and the big innovation from the iSight team was the addition of a second, telephoto camera on the 7 Plus and its associated portrait mode, which automatically blurs out the background for a simulated bokeh effect.


Photo by James Bareham / The Verge

Apple didn’t have its portrait mode ready in time for the iPhone 7 Plus’ release, mostly because that’s a very complex thing to get right in all circumstances. My question now is: did Apple sacrifice resources that would previously have gone toward extending its picture-quality lead in order to improve its dual-camera software? I suspect there’s at least an element of truth to that supposition, especially when looking at how mightily others like OnePlus have struggled to develop their own portrait mode algorithms. Making dual cameras work harmoniously is a hard engineering challenge, and overcoming it seems to be costing companies the opportunity to advance their imaging in pure quality terms.
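For a sense of what “simulated bokeh” means in practice, here is a deliberately simplified Swift sketch of the basic idea: blur the photo with a strength that scales with a depth-derived mask, so distant pixels smear while the subject stays sharp. It leans on Core Image’s CIMaskedVariableBlur filter, the inputs are hypothetical, and it is nowhere near the complexity of Apple’s shipping portrait mode, which is exactly the point about how much engineering the real thing requires.

```swift
import CoreImage

// Rough sketch of a depth-driven background blur. `photo` is the full image
// and `depthMask` is assumed to be a grayscale CIImage in which brighter
// pixels mean "farther from the camera" (hypothetical inputs, e.g. derived
// from a dual camera's disparity data). Apple's actual portrait mode also
// handles segmentation, edge refinement, highlight rendering, and more.
func simulatedBokeh(photo: CIImage, depthMask: CIImage, maxBlurRadius: Double) -> CIImage? {
    guard let blur = CIFilter(name: "CIMaskedVariableBlur") else { return nil }
    blur.setValue(photo, forKey: kCIInputImageKey)
    blur.setValue(depthMask, forKey: "inputMask")        // blur strength scales with mask brightness
    blur.setValue(maxBlurRadius, forKey: kCIInputRadiusKey)
    // Crop back to the original extent, since blurring can expand the image edges.
    return blur.outputImage?.cropped(to: photo.extent)
}
```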

The reason phone makers are willing to, at least temporarily, forgo the eternal race toward ever sharper and prettier images is their hope and belief that they can build entirely new uses and functions into their cameras. Discrete new functions are a more compelling reason to buy a new thing than single-percentage-point improvements in quality. LG’s dual-camera system, for instance, adds an extra wide-angle shooter that allows for more creative flexibility. LG is competing with cameras like the Pixel’s and the U11’s by offering something that both of them lack.

Apple’s camera on the next iPhone is sure to be revolutionary, even if its image quality doesn’t improve one iota. The ARKit software in iOS 11, the operating system with which the next iPhone will ship, has shown itself to be one of the most compelling, enticing, and easily programmable implementations of augmented reality yet. Accessing the experiences developers design with ARKit will make for a huge change, or at least expansion, in the way the iPhone camera is used. And Apple is also looking at other ways of expanding the functionality of cameras with things like its upcoming face-unlocking feature (which Samsung and others already offer).
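Part of what makes ARKit so approachable is how little code it takes to get a tracked AR scene on screen. The sketch below, based on the ARKit APIs Apple introduced with iOS 11, shows a minimal view controller that runs a world-tracking session; it omits everything a real app would add, such as content, error handling, and session-interruption recovery.

```swift
import ARKit
import UIKit

// A minimal sketch of ARKit setup on iOS 11: create an ARSCNView, run a
// world-tracking session, and ARKit takes care of camera capture, motion
// tracking, and registering virtual content against the real world.
class MinimalARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking uses the rear camera plus motion sensors to anchor
        // virtual objects in real space.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```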

The law of diminishing returns is making itself apparent in many areas of smartphone development these days, and numerous companies appear to be directing their imaging resources toward creating new experiences rather than finessing and refining existing ones. So even as I say that the best mobile pictures presently come from single-camera phones, I can definitely understand why others might conclude that the best overall mobile photography experience lies elsewhere.