Today’s f-numbers are yesterday’s megapixels. Rather than incremental improvements in specifications, algorithms are now revolutionizing photography, and with the iPhone 8 Plus and iPhone X, Apple is visibly paving the way like no manufacturer before it. We’re taking a closer look at the camera.
Flashback: When the Google Pixel was released last year, the camera made jaws drop. Its integrated IMX378 Sony image sensor delivered remarkable photos, even under less than ideal conditions. But was it really the IMX378?
Not quite. For instance, the BlackBerry KeyOne used the same sensor, but fared far worse in image quality. The excellent images are due to the Pixel’s camera app, which Google does not share with other manufacturers, unlike the Android operating system. One of the Pixel app’s key techniques is combining a whole series of photos shot with various settings into a single optimized image.
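Google has not published the details of its pipeline, but the core idea behind burst stacking is easy to sketch: averaging several aligned frames of the same scene suppresses random sensor noise roughly by the square root of the frame count. A minimal NumPy illustration (my own toy model, not the Pixel’s actual algorithm):

```python
import numpy as np

# Toy illustration: average a burst of noisy frames of the same scene.
# Random noise shrinks by about sqrt(N) for N averaged frames.
rng = np.random.default_rng(42)
scene = np.full((64, 64), 0.5)        # the "true" gray scene
n_frames = 9
burst = [scene + rng.normal(0, 0.1, scene.shape) for _ in range(n_frames)]

single_noise = np.std(burst[0] - scene)   # noise of one frame, ~0.1
stacked = np.mean(burst, axis=0)          # the stacked result
stacked_noise = np.std(stacked - scene)   # ~0.1 / sqrt(9)

print(single_noise / stacked_noise)       # close to sqrt(9) = 3
```

Real pipelines additionally have to align the frames and handle motion, which is where the hard engineering lives; the averaging itself is the easy part.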
This massive influence of image processing algorithms is also known as computational photography. Having said that, let’s talk about Apple now.
The camera in the iPhone X
Even if the technical data is not the focus (pun intended, 😉) of this article, I will list it anyway for the sake of completeness. The iPhone X has two new image sensors with 12 MP each, and both camera modules are equipped with an optical image stabilizer. The wide-angle lens offers a 28-millimeter focal length and an aperture of f/1.8, while the telephoto lens is 56 millimeters at f/2.4. In addition, there is also a quad-LED flash.
The sensor is reportedly bigger and can thus capture 83 percent more light. Assuming a crop factor of 7 and a 1/2.9-inch sensor size on the iPhone 7, this would correspond to approximately a 1/2.0-inch format, which would be bigger than in other current smartphones. I am eager to see more detailed specifications.
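As a rough sanity check (my own back-of-the-envelope math, not Apple’s figures): light gathered scales with sensor area, so 83 percent more light implies the linear dimensions grow by √1.83 ≈ 1.35, which takes a 1/2.9-inch format to roughly 1/2.1 inch:

```python
# Light gathered scales with sensor area, so +83 % light means the
# linear dimensions grow by sqrt(1.83).
linear_gain = 1.83 ** 0.5               # ≈ 1.35

# Optical formats are named as fractions of an inch; scale the
# iPhone 7's assumed 1/2.9-inch format by that factor.
old_format = 1 / 2.9
new_format = old_format * linear_gain

print(round(1 / new_format, 2))         # ≈ 2.14, i.e. roughly 1/2.1 inch
```

That lands in the same ballpark as the approximate 1/2.0-inch estimate above; the exact figure depends on assumptions about the old sensor’s true dimensions.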
As with previous iPhone generations, the new iPhones also offer a portrait mode that blurs the background; new, however, is the portrait lighting function. It detects the photographed person’s face in detail and simulates several types of lighting; at least it looked very impressive in the presentation.
To process these elaborate effects, Apple has equipped its new A11 SoC with a standalone image signal processor. It will furthermore assist in focusing and noise reduction—also in video mode.
According to the presentation, Apple divides the captured image into a total of two million squares and analyzes their content. This compresses low-detail image areas more heavily while areas that are rich in detail are preserved as much as possible. Of course, this is nothing new—video encoders also work this way—but Apple wants to be exceptionally good at it. Apple uses HEVC for its video codec and, as usual, speaks confidently about having the best video quality of any smartphone. Of course, we will put that to the test.
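How such detail-aware compression can work in principle is easy to sketch. The following toy example (my own simplification, certainly not Apple’s actual analysis) splits an image into blocks and quantizes flat, low-variance blocks more coarsely than detailed ones:

```python
import numpy as np

def adaptive_quantize(img, block=8, flat_levels=8, detail_levels=64):
    """Quantize low-detail blocks coarsely, detailed blocks finely."""
    out = img.astype(float).copy()
    h, w = img.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = out[y:y+block, x:x+block]
            # High variance = rich detail -> keep more gray levels.
            levels = detail_levels if tile.var() > 50 else flat_levels
            step = 256 / levels
            out[y:y+block, x:x+block] = np.round(tile / step) * step
    return out.clip(0, 255).astype(np.uint8)

rng = np.random.default_rng(0)
flat = np.full((8, 8), 120, dtype=np.uint8)              # low-detail area
textured = rng.integers(0, 256, (8, 8), dtype=np.uint8)  # high-detail area
img = np.hstack([flat, textured])
result = adaptive_quantize(img)
```

The flat half collapses to a single coarse level while the textured half stays within a couple of gray levels of the original, which is exactly the trade-off a detail-aware encoder aims for.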
60 fps is possible at the maximum resolution, 4K. Furthermore, there are now slow-motion videos at a 1080p resolution and 240 images per second, which equals a slowdown by a factor of eight when played back at 30 fps. In comparison: The Samsung Galaxy Note 8 currently achieves only 1,280 x 720 pixels at 240 fps.
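The slowdown factor follows directly from the capture and playback rates:

```python
# Slow-motion factor = capture frame rate / playback frame rate.
capture_fps, playback_fps = 240, 30
slowdown = capture_fps / playback_fps
print(slowdown)  # 8.0 -> one real second stretches to eight on screen
```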
At the risk of beating a dead horse: This is once again a clear example of the benefits of having both hardware and software come from a single source. Of course, this is also possible for Android, but it simply requires significantly greater collaboration between many partners that are also simultaneously competing with each other.
The iPhone X furthermore sports a 7 MP front camera with a 3D scanner for the new Face ID feature, allowing portrait mode, including the bokeh effect and portrait lighting, to work with selfies as well.
Cameras in the iPhone 8 and iPhone 8 Plus
The back of the iPhone 8 Plus mostly sports the same camera hardware as the iPhone X. The notable differences, from what we know so far, start with the telephoto camera: instead of f/2.4 as on the iPhone X, it only offers an aperture of f/2.8, meaning the iPhone X’s telephoto lens gathers roughly 36 percent more light. Furthermore, as is already the case with the iPhone 7 Plus, there is no optical image stabilizer in the telephoto module. The front camera, finally, is standard fare: Apple foregoes the iPhone X’s 3D scanner here and integrates a conventional 7 MP camera.
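The 36 percent figure follows from the physics of f-numbers: the light a lens admits scales with the aperture area, i.e. with 1/N² for f-number N. A quick check:

```python
# Light admitted scales with 1/N^2 for f-number N, so the ratio
# between f/2.4 and f/2.8 is (2.8 / 2.4)^2.
light_ratio = (2.8 / 2.4) ** 2
print(round((light_ratio - 1) * 100))  # 36, i.e. ~36 % more light at f/2.4
```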
As in the previous generation, the iPhone 8 (not the Plus version) also foregoes the second lens on the rear. According to existing information, its wide-angle camera has the same specifications as in the iPhone X and iPhone 8 Plus.
Computational photography is king
With the new iPhones, Apple continues a trend we have already observed in many different Android smartphones and their cameras: Software is becoming increasingly important. Computational photography is king.
While all manufacturers can largely use the same hardware components and the Android operating system, image processing algorithms are a well-kept secret, even at Google. To stand out from the competition and from stock Android, brands need large software departments of the kind only the absolute top manufacturers can afford, and it will become even harder for smaller companies to keep up in the long term. Unlike with the operating system and its accompanying media and app ecosystem, Google does not make money from the camera app itself, so Mountain View has correspondingly little incentive to improve Android’s stock camera features.
Considerable progress can therefore only be expected from the largest manufacturers, or from Android in general should Google ever feel the entire platform is threatened by Apple’s camera advancements. As long as market conditions between iOS and Android do not change significantly, however, little is likely to happen on that front.