Apple’s iPhone 12 and iPhone 12 Mini add significant new photography features, but camera hardware and computational photography software on the higher-end iPhone 12 Pro models really show how hard Apple is trying to attract photo and video enthusiasts.
Among the changes in the iPhone 12 Pro models are new abilities to fuse multiple frames into one superior shot and a lidar sensor for improved autofocus. And the iPhone 12 Pro Max gets a larger sensor for better low-light performance on the main camera, a telephoto camera that zooms in better on distant subjects, and better stabilization to counteract your shaky hands.
The iPhone 12, iPhone 12 Mini, iPhone 12 Pro and iPhone 12 Pro Max debuted at Apple’s iPhone 12 launch event Tuesday. The iPhone 12 (from $799, £799, AU$1,349) and 12 Mini (from $699, £699, AU$1,199) stick with last year’s design, with regular, ultrawide and selfie cameras. The bigger photography improvements come with the Pro (from $999, £999, AU$1,699) and Pro Max (from $1,099, £1,099, AU$1,849), which add a fourth, telephoto camera for more distant subjects; the Pro Max also gets the larger image sensor. The iPhone 12 Pro has the same 2X telephoto reach as earlier iPhones, a 52mm-equivalent focal length, while the Pro Max’s extends to 2.5X, or a 65mm-equivalent lens.
Along with processor and network speeds, cameras are among the most important features of a new smartphone. We snap photos and videos to document our lives, to share with friends and family, and for artistic expression.
The iPhone 12 and iPhone 12 Mini get significant improvements as well. They benefit from Night Mode, which now works on the ultrawide and selfie cameras, and from an improved HDR mode for challenging scenes with both bright and dark elements, Apple said.
Computational photography tricks
HDR stands for high dynamic range: the ability to capture shadow details without turning highlights into a washed-out mess. All the new iPhones bring third-generation HDR technology designed to better capture details like silhouetted faces, Apple said. The technology also uses machine learning to judge processing choices like boosting brightness in dim areas, the company said.
The iPhone 12 Pro models get another computational photography technique that Apple calls ProRaw. iPhones and Android phones have been able to shoot raw photos for years, an unprocessed alternative to JPEG that lets photographers decide how best to edit an image. Apple ProRaw blends Apple’s computational photography with a raw format, so photographers get the benefit of noise reduction and dynamic range along with the editing flexibility of raw images, Apple said. It’s similar to Google’s computational raw technology that arrived with the Pixel 3 in 2018.
Google pioneered work in the range of processing tricks called computational photography, helping erase the comfortable lead in image quality that Apple’s early iPhones held for years.
But with the iPhone 11, Apple employed its own versions of some Google techniques, like combining several low-exposure frames into one shot to capture shadow detail without turning skies into an overexposed whiteout. Google calls its version HDR+; Apple calls its version Smart HDR. A related Apple technology, Deep Fusion, blends frames for better detail and texture, particularly in low light.
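The core idea behind these multi-frame techniques can be sketched in a few lines. This is a simplified illustration of exposure stacking only, not Apple's Smart HDR or Google's HDR+ pipeline; real implementations also align the frames, weight them, and tone-map the merged result.

```python
import numpy as np

def merge_frames(frames):
    """Average several aligned low-exposure frames of the same scene.

    Averaging N frames cuts random sensor noise by roughly sqrt(N),
    which is why shadow detail survives without blowing out highlights.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate 8 noisy short-exposure captures of a dim, flat scene.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 40.0)   # true brightness of the scene
frames = [scene + rng.normal(0.0, 10.0, scene.shape) for _ in range(8)]
merged = merge_frames(frames)
# The merged frame sits much closer to the true scene than any single capture.
```

Short exposures avoid clipping bright areas, and the averaging recovers the shadow detail that a single short exposure would bury in noise.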
On the iPhone 12, Apple’s Deep Fusion technology exercises all the major parts of the A14 Bionic chip inside: the main CPU, the image signal processor, the graphics processor and the neural engine. That means Apple can apply Deep Fusion to all the cameras on all the iPhone 12 models, Apple said. And it means the iPhone’s portrait shots now work in Night Mode, matching an ability Google added with its Pixel 5.
Better iPhone camera hardware
The larger sensor on the iPhone 12 Pro Max, 47% bigger than the iPhone 11’s main camera sensor, increases pixel size. That engineering choice increases the sensor cost but lets it gather more light for better color, less noise and improved low-light performance.
The Pro Max also stabilizes images by shifting the sensor rather than the lens elements, which Apple said lets you take handheld shots with a surprisingly long 2-second exposure time.
All the iPhone 12 models also benefit from a wider f/1.6 aperture on the main camera for better light-gathering ability. And the ultrawide camera now gets optical image stabilization, too.
The phones get better video abilities, too, with 10-bit encoding that should capture color and brightness more faithfully, plus support for Dolby Vision HDR video. The iPhone 12 Pro models can shoot that HDR video at 60 frames per second, while the iPhone 12 and 12 Mini top out at 30fps.
Ordinary 4K and 1080p video can be shot at up to 60fps, and slow-motion 1080p can reach 240fps. Time-lapse videos are now stabilized.
What the iPhone doesn’t do
But Apple hasn’t gone as far as some in trying to grab photography headlines.
The iPhone 12 employs no pixel binning, for example, a technique that pairs much higher-resolution sensors with flexible processing. Pixel binning pools data from groups of four or nine neighboring pixels to yield the color information for a single pixel in the resulting photo. When there’s enough light, the phone can skip the binning and take a much higher-resolution photo instead, offering more detail or more room to crop into the important part of the scene.
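Conceptually, binning is a simple pooling operation. The sketch below assumes a monochrome sensor for clarity; real binned sensors must also account for the Bayer color filter pattern, which is why they use quad-pixel layouts.

```python
import numpy as np

def bin_pixels(sensor, factor=2):
    """Pool factor-by-factor blocks of sensor pixels into one output pixel.

    Each output pixel sums the light gathered by its block of neighbors,
    trading resolution for sensitivity in dim scenes.
    """
    h, w = sensor.shape
    blocks = sensor.reshape(h // factor, factor, w // factor, factor)
    return blocks.sum(axis=(1, 3))

sensor = np.arange(16, dtype=np.float64).reshape(4, 4)
binned = bin_pixels(sensor)   # 4x4 sensor readout -> 2x2 binned image
```

In bright light the phone simply skips this step and reads out the full grid, which is where the extra cropping flexibility comes from.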
Another newer trick the iPhone skips is a telephoto camera with much higher magnification; the Huawei P40 Pro Plus has an impressive 10X optical zoom, for example. Long zooms are difficult because the laws of physics make telephoto optics physically large, but smartphone makers like Huawei and Samsung work around the problem with a mirror that bends the light path sideways into the interior of the phone.
But Apple could have tricks up its sleeve later. In 2017, Apple acquired image sensor startup InVisage, whose QuantumFilm technology held some promise for making image sensors smaller or improving image quality.
And it’s done plenty with computational photography on its own, notably a portrait mode that simulates the blurred-background “bokeh” of high-end cameras, along with lighting effects that can be applied to those portraits.