I love the iPhone 16 Pro, but Apple’s cameras are aging next to Android


I’m comfortable switching between iOS and Android. I own a OnePlus 9 5G, and I’ve used other Android devices. My daily driver is an iPhone. Recently, I bought an iPhone 16 Pro and a refurbished Apple Watch Ultra 2. I would have picked up a Google Pixel 9 Pro if I could use an Apple Watch without an iPhone. I considered the Pixel because it feels like Apple is holding back its camera technology. You can take great photos with an iPhone 16 Pro, but I think I could do better work on the other side of the fence.





Raw camera specs aren’t everything, but they are something

A matter of flexibility

The Apple iPhone 16 showing the camera at 2x zoom

The iPhone 16 Pro isn’t a slouch. It has a 48-megapixel main (wide-angle) camera, a 48-megapixel ultra-wide, and a 5x telephoto, roughly equivalent to a 120mm DSLR lens. However, the telephoto is capped at 12 megapixels, unlike the 48-megapixel telephoto on the Google Pixel 9 Pro or the 50-megapixel one on the OnePlus 13. Realistically, you can’t preserve sharp details when you crop the photos you take with it. The iPhone 16 Pro’s “digital” zoom past the 5x mark is a blurry mess.
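To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python (my own arithmetic, not anything from Apple's or Google's imaging pipelines). Digital zoom is essentially a crop, so the pixels you keep fall off with the square of how far past the optical zoom you push.

# Sketch: digital zoom is a crop, so doubling the zoom keeps only a quarter of the pixels
def effective_megapixels(output_mp, optical_zoom, target_zoom):
    crop_fraction = (optical_zoom / target_zoom) ** 2
    return output_mp * crop_fraction

print(effective_megapixels(12, 5, 10))  # 12 MP telephoto pushed to 10x: ~3 MP left
print(effective_megapixels(48, 5, 10))  # 48 MP telephoto pushed to 10x: ~12 MP left

Three megapixels is roughly a 2,000 x 1,500-pixel image, which is why those 10x shots fall apart on anything bigger than a phone screen.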


Even Apple's main cameras are starting to feel weak when the Samsung Galaxy S24 Ultra ships a 200-megapixel sensor. That's more than anyone needs in 2024, and Samsung's shutter lag leaves something to be desired. Still, a trillion-dollar company like Apple should be able to deliver a 50- or 100-megapixel sensor that smashes the competition.


Traditionally, Apple has focused on sensor quality and processing rather than raw megapixel count. Yet at a certain point, pixel count matters: it determines how much room you have to crop and recompose a photo. I come from a wedding photography background and often crop photos to highlight the bride, groom, and family while removing unwanted distractions. I think smartphone owners are coming to expect the same flexibility.


I also feel like Apple could be doing better on zoom lengths. A 5x zoom is respectable and the norm on high-end smartphones. However, some Android models have been capable of 10x zoom for a long time. In pro photography work, a 120mm lens is something I’d use for short-range portraits. I want something I can also use to photograph scenery, wildlife, or concerts. Weak zoom can mean failing to capture important memories.
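As a rough guide to what those multipliers mean in lens terms, here's a small sketch that assumes the usual ~24mm-equivalent main camera (which is what Apple and most flagships use); phone zoom factors are quoted relative to that main lens.

MAIN_EQUIV_MM = 24  # assumption: ~24mm-equivalent main camera, typical of current flagships

for zoom in (2, 5, 10):
    print(f"{zoom}x is roughly {MAIN_EQUIV_MM * zoom}mm full-frame equivalent")
# 2x ~ 48mm (everyday framing), 5x ~ 120mm (portraits), 10x ~ 240mm (wildlife, concerts)

That 240mm-class reach is what I miss when I'm capped at 5x.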

Good photos show highs and lows in colors and shadows

Tone down the processing, please

To Apple's credit, you can do a lot of on-device work to make an iPhone 16's photos pop. Beyond the standard editing options, its Tone and Photographic Styles features have been part of iOS for a few years. Before any tweaking, though, I prefer the default processing on Google Pixel phones, and judging by MKBHD's blind camera tests, so do most people. Apple's default look can be bland, flattening shadows and saturation in the name of clarity. In the photo gallery above, the middle shot came from my iPhone with relatively neutral settings; everything else was taken on an Android device.


Shadows add interest to photos. They set a mood, define contours, and add mystery. The same often goes for colors. In both cases, flattening them made the shots less interesting and felt artificial, given how vivid the scenes looked in person.


I remember seeing a theory that Apple introduced Tone and Photographic Styles because it isn't sure how to improve its default processing. I don't know if that's true, but warning bells should be going off in Cupertino when a growing number of apps, like Halide, exist to reduce or eliminate an iPhone's default processing. I feel like even Samsung frequently delivers more exciting photos without any user tweaks.



What might be to blame for Apple’s stagnation in camera tech?

A view from the ballpark

The Pixel 9 and iPhone 16 in the hand.

I don't want to be too harsh on Apple since I like my iPhone 16 Pro. After getting things dialed in, its photos are excellent. Still, I sense Apple doesn't know how to preserve colors and shadows without losing detail, and its camera tech is hampered by a familiar management concern: profit margins. Apple executives are notorious for protecting those margins, which is why some iPhone components go unchanged for years. The company still hasn't changed the lens elements on the iPhone 16 lineup to fix the flare in nighttime photos and videos, a problem that surfaced with the iPhone 13 back in 2021.



The Android landscape is diverse and sometimes slapdash, and plenty of Android phones are worse at photography than any iPhone. Even so, the increased competition among Android phone makers keeps producing more ambitious camera tech at the high end. Apple must do better with the iPhone 17 to keep bragging rights.

What about the iPhone 17, anyway?

Rumors point to the iPhone 17 Pro Max getting a 48-megapixel telephoto camera and one or more models getting a mechanical aperture for controlling light and depth of field. I expect additional changes, but we'll have to wait to confirm anything since the next major iPhone event isn't due until fall 2025. Between now and then, I'll bet Android cameras raise the stakes higher still.
