Like every iPhone release thus far, this year’s iPhone XS and iPhone XS Max sport upgraded cameras. But they’re not better in the ways you might expect.
On paper, the cameras in the iPhone XS and iPhone XS Max don’t seem significantly upgraded compared to last year’s flagship.
But they will still take better photos and videos than the iPhone X. Here’s why.
Hardware Upgrades For The iPhone XS
To be clear, Apple has still updated the actual camera hardware on the new iPhone XS and iPhone XS Max. That includes an update to the rear camera.
While it’s still a 12-megapixel sensor behind an optically stabilized f/1.8 lens, Apple has bumped up the size of the sensor and of its individual pixels. That means they can collect more light and, as a result, take better-looking photos.
- The front-facing camera packs the same 7-megapixel sensor with an f/2.2 lens. (Though the sensor is now “double the speed,” according to Apple.)
- The rear-facing telephoto camera still features a 12-megapixel sensor, an f/2.4 lens, and optical image stabilization.
- Apple also says it has incrementally updated the already great True Tone flash. That update should allow for more accurate color reproduction in low light.
Key Software Updates for iPhone XS Camera
In addition to the hardware updates, there are also various software-based updates. These updates, like new video stabilization algorithms, will generally bump up the quality of photos and videos.
But when it comes to significant changes on the iPhone XS and iPhone XS Max camera front, it mostly comes down to two new major features.
Apple’s Smart HDR For iPhone XS and XR
By far the biggest update to the camera system on the iPhone XS and XS Max has little to do with sensors and megapixels. Instead, it’s mostly software-based.
The new system is called Smart HDR. It’s a software system that leverages Apple’s high-performance A12 Bionic: photos you take are run through an updated image signal processor and the A12’s Neural Engine to make them look better.
The actual process is more complicated than that, of course. Smart HDR works like similar systems on other devices and cameras. It takes a series of pictures at different exposures and intelligently combines them to create a single image that’s vastly better looking.
As a result, your images will be high-quality even in low-light situations. Compared to devices that aren’t equipped with Smart HDR, they’ll also generally be brighter and have much more detail.
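For a sense of what multi-exposure capture looks like at the API level, here’s a minimal Swift sketch using AVFoundation’s exposure-bracketing API. The bias values and function name are illustrative; it requests three frames at different exposures, which is the kind of raw input an HDR merge works from. Smart HDR itself performs both the capture and the merge automatically on-device.

```swift
import AVFoundation

// Illustrative sketch: capture an exposure bracket, the raw material
// an HDR pipeline merges. The merge step itself is not shown; Apple
// does that on-device in the ISP and Neural Engine.
func captureExposureBracket(from output: AVCapturePhotoOutput,
                            delegate: AVCapturePhotoCaptureDelegate) {
    // Three frames: underexposed, metered, and overexposed.
    let biases: [Float] = [-2.0, 0.0, 2.0]
    let bracket = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }
    // A rawPixelFormatType of 0 means no RAW, processed frames only.
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.hevc],
        bracketedSettings: bracket)
    output.capturePhoto(with: settings, delegate: delegate)
}
```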
Aided By Computational Photography
Apple calls this technology “computational photography”: photography assisted by high-powered processing like the A12 Bionic’s. Suffice it to say, this type of system simply isn’t possible without top-tier silicon like the A12.
It’s actually pretty amazing how Smart HDR works. As soon as you open the Camera app, the iPhone XS or XR begins capturing a rolling “four-frame buffer” that allows for zero shutter lag.
Smart HDR then looks at these frames and decides whether it can improve a photo by adding detail. It also intelligently detects motion or faces within a shot and adapts the final result accordingly.
So, essentially, Apple’s A12 Bionic chip takes a photo and makes it look better in the very instant you snap it. That’s something even full-frame cameras can’t do, even though they might take higher-quality photos overall.
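To make the zero-shutter-lag idea concrete, here’s a toy Swift sketch of a rolling frame buffer; this is a simplified model, not Apple’s implementation. The newest frames are kept and the oldest evicted, so a frame from just before the shutter tap is always on hand.

```swift
// Toy model of a zero-shutter-lag buffer: keep only the most recent
// N frames, so the moment the shutter is tapped, a frame from just
// *before* the tap already exists.
struct FrameRingBuffer<Frame> {
    private var frames: [Frame] = []
    private let capacity: Int

    init(capacity: Int = 4) { self.capacity = capacity }

    // Add the newest frame, evicting the oldest once full.
    mutating func push(_ frame: Frame) {
        frames.append(frame)
        if frames.count > capacity {
            frames.removeFirst()
        }
    }

    // Frames available for selection and merging at shutter time.
    var latestFrames: [Frame] { frames }
}
```

In practice, this buffering happens deep in the camera pipeline, long before frames ever reach application code.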
Portrait Mode Update
Apple has made a similar software-based change to Portrait Mode, which is the proprietary feature that lets users take an image with a simulated “bokeh effect” akin to photos taken with a full-frame camera.
Bokeh is a term used to describe an image with a highlighted subject and a softly blurred background. It’s just one way that smartphone camera pictures can be made to look like something taken on a DSLR or full-frame, mirrorless camera.
A Mobile Device Game Changer: Manually Adjusting the Bokeh
But the iPhone XS, iPhone XS Max, and iPhone XR do something interesting: they let you adjust the background blur, or bokeh, after you’ve taken the image.
It works by detecting a face in an image and separating the background from the foreground. Apple’s software will also create a “depth map” of the scene, intelligently adding blur to mimic how the picture would look on a DSLR.
Apple has done this by modeling Portrait Mode’s blur on full-frame cameras with detachable lenses. Sure, the image quality isn’t directly comparable, but the iPhone XS (and XR) lineup does exceptionally well for a handset.
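Apple exposes a version of this depth-driven blur to developers through Core Image. The Swift sketch below uses the CIDepthBlurEffect filter; the filter itself is real, but the parameter keys and values here follow Apple’s depth-editing examples and should be treated as illustrative rather than a definitive recipe.

```swift
import CoreImage

// Illustrative sketch: re-render a photo with a depth-driven blur,
// mirroring the idea of adjusting bokeh after the shot.
func renderWithAdjustedBokeh(image: CIImage,
                             disparity: CIImage,
                             simulatedAperture: Double) -> CIImage? {
    guard let filter = CIFilter(name: "CIDepthBlurEffect") else { return nil }
    filter.setValue(image, forKey: kCIInputImageKey)
    // The disparity (inverse depth) map captured alongside the photo.
    filter.setValue(disparity, forKey: "inputDisparityImage")
    // A larger simulated aperture yields a stronger background blur.
    filter.setValue(simulatedAperture, forKey: "inputAperture")
    return filter.outputImage
}
```

Changing the depth-of-field slider in Photos amounts to re-running a render like this with a different aperture value.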
It’s worth noting that some other devices, including several from Samsung, offered similar features before Apple. But Apple’s depth control system will generally look better because of the heavier image processing happening behind the scenes.
Notably, the feature is also available on the front-facing camera, even though it only has a single sensor; depth data there comes from the TrueDepth sensor array used for Face ID.
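As a rough Swift sketch of how that works from a developer’s perspective (the configuration details are illustrative), an app can request depth delivery from the front TrueDepth camera like this:

```swift
import AVFoundation

// Illustrative sketch: configure a capture session that delivers depth
// data from the front TrueDepth camera, so a single-lens selfie can
// still carry a depth map.
func makeFrontDepthSession() -> AVCaptureSession? {
    guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                               for: .video,
                                               position: .front),
          let input = try? AVCaptureDeviceInput(device: device)
    else { return nil }

    let session = AVCaptureSession()
    session.beginConfiguration()
    session.sessionPreset = .photo
    guard session.canAddInput(input) else { return nil }
    session.addInput(input)

    let photoOutput = AVCapturePhotoOutput()
    guard session.canAddOutput(photoOutput) else { return nil }
    session.addOutput(photoOutput)

    // Depth delivery must be enabled on the output before capture.
    photoOutput.isDepthDataDeliveryEnabled =
        photoOutput.isDepthDataDeliverySupported

    session.commitConfiguration()
    return session
}
```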
Mike is a freelance journalist from San Diego, California.
While he primarily covers Apple and consumer technology, he has past experience writing about public safety, local government, and education for a variety of publications.
He’s worn quite a few hats in the journalism field, including writer, editor, and news designer.