TECH DATA

Did you ever look at the three lenses on an iPhone Pro and wonder how that could possibly work?

The iPhone 14 Pro has three cameras, each with its own CMOS sensor behind one of its three lenses: a 48-megapixel wide-angle main camera, a 12-megapixel ultra-wide camera with a 120-degree field of view, and a 12-megapixel telephoto camera.

Let me say it again: the 14 Pro has a 48-megapixel main camera.

That’s a huge upgrade in iPhone image capture: more detail and better low-light photography.

Here’s the impressive part. All three cameras are aimed at the exact same point, and the phone switches seamlessly from one camera to another as you zoom in or out. When you capture an image, all three cameras capture the same scene. Then Deep Fusion, Apple’s neural image-processing technology, uses machine learning to merge frames taken at different exposures into a single image or video.
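To get a feel for the zoom hand-off, here is a minimal sketch of how a phone might pick which camera serves a requested zoom factor. The camera names and thresholds are illustrative assumptions, not Apple’s actual implementation (though 0.5x, 1x, and 3x match the 14 Pro’s advertised optical zoom levels):

```python
# Illustrative sketch: choose the camera whose native zoom sits at or
# below the requested zoom factor. Names and thresholds are assumptions.
CAMERAS = [
    ("ultra_wide", 0.5),   # 0.5x ultra-wide
    ("main",       1.0),   # 1x wide-angle main
    ("telephoto",  3.0),   # 3x telephoto
]

def pick_camera(zoom: float) -> str:
    """Return the camera that natively covers the requested zoom factor."""
    chosen = CAMERAS[0][0]
    for name, native_zoom in CAMERAS:
        if zoom >= native_zoom:
            chosen = name
    return chosen
```

So a request for 2x zoom would be served by cropping the main camera’s frame, while 3x and beyond hands off to the telephoto.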

If one camera didn’t capture part of the image properly, Deep Fusion replaces those pixels with pixels from another camera to produce the highest-quality image or video possible.
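The pixel-replacement idea can be sketched as a toy fusion of two grayscale frames: where the primary frame is clipped (blown out or crushed to black), substitute the pixel from a fallback frame. This is a deliberately simplified stand-in for what Deep Fusion does with far more sophistication; the thresholds and structure are assumptions:

```python
# Toy per-pixel fusion: keep the primary pixel unless it is clipped,
# in which case fall back to the second frame. Frames are lists of
# rows of 0-255 integers. Thresholds (8, 247) are arbitrary assumptions.
def fuse(primary, fallback, low=8, high=247):
    """Merge two same-sized grayscale frames pixel by pixel."""
    fused = []
    for row_p, row_f in zip(primary, fallback):
        fused.append([
            f if (p <= low or p >= high) else p  # clipped? use fallback
            for p, f in zip(row_p, row_f)
        ])
    return fused
```

For example, a blown-out 255 and a crushed 0 in the primary frame would be swapped for the fallback frame’s values, while well-exposed pixels pass through untouched.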

Plus, Apple seamlessly integrates image stabilization, high-dynamic-range exposure capture, low-light image capture, and shallow depth-of-field focusing effects into the iPhone Pro.