iPhone 16 Pro again rumored to get Tetraprism zoom lens
The Tetraprism lens used in the iPhone 15 Pro Max will appear in more models in the iPhone 16 generation, according to a report that seemingly confirms increased orders for the component.
Render of the iPhone 16 Pro
The iPhone 15 Pro Max is the only one in the generation with a Tetraprism lens. The periscope lens system provides it with a 5x optical zoom, compared to the 3x zoom of the iPhone 15 Pro.
There have been numerous rumors that Apple will bring the lens system to more models in the iPhone 16 lineup. A Friday report now suggests that it is happening.
According to industry sources of DigiTimes Asia, Apple is preparing to expand the use of periscope lenses. As part of this, Apple is also expected to expand its supplier range, adding Genius Electronic Optical alongside existing supplier Largan Precision for the iPhone 16.
Repeated rumors point to the iPhone 16 Pro gaining the lens, with the Pro Max continuing to use it.
This would in theory make the extra zoom level a benefit available only to the Pro lineup, not the standard edition models.
Other rumors have claimed the Pro models will also gain an extra 48-megapixel camera sensor on the rear, replacing the 12-megapixel sensor on the ultra-wide angle camera. New coatings may also reduce lens flare, while thinner lenses could help reduce weight and make the bump smaller.
Rumor Score: Likely
Comments
That said, for all of the highly advanced computational photography that Apple claims, with machine learning based on analyzing hundreds of thousands of photos, blah, blah, blah, my 15 Pro proved totally inept while on vacation a couple of weeks ago at capturing that most basic of "tricky" photos: a backlit portrait shot. My wife and I were on a restaurant terrace, perched on a cliff, with stunning scenery behind us, but in significantly brighter light than us. Using all the computational trickery available to the 15 Pro, producing a photo that balanced the light between the foreground human subjects and the scenic backdrop should have been a piece of cake.

But no! Shot after shot--and we shot a bunch--had us well focused but nearly invisible in dark shadow. The scenic background, however--which the camera should "know" is not the subject--was perfectly exposed. Okay, I thought... let's use fill flash, so I set the camera's flash to auto and shot another set of pics. The flash never fired once--again, based on the point of focus, the camera should know it has human subjects in dimmer light than the background and compensate with flash. But all we got were more photos of us in deep shadow.

Okay, fine--I set the flash to "On" instead of "Auto" so it would definitely fire--and it did--but it produced only marginally better results. (And yes, we were shooting from close enough range that the flash should have had enough power to light us.) We came away with a couple of dozen photos in all, and not a single one of them was usable.
I would also say this about all the supposed "advances" in computational processing of iPhone photos: I've been a VERY avid photographer for decades, and even had my own home color darkroom back in the film days, and I would say my overall satisfaction with iPhone photos peaked with the 13 Pro and has slid ever since. There are just too many times when I'm looking at the lighting and/or color balance of the scene in front of my eyes, and the processed photo of it produced by iPhone has failed to capture it accurately. It's not egregiously wrong--it's not a "bad photo," per se--it's just "off" from what I saw, and not in a good way. It seems to me--and this is admittedly entirely anecdotal--that the more "advanced" Apple tries to get with its processing, the more it uses computational trickery to deliver the "perfect" photo, the further away it gets from delivering a photo that captures the scene as I saw it.
How are you shooting? Are you using RAW? I’ve found that doing so produces notably better images when you know how to process them in Lightroom. I’ve also found that with heavily backlit shots, tapping on the subject before taking the shot can make a big difference. But if the difference is extreme, even my Canon R5 can have problems, as will all cameras.