Recently, iPadOS and iOS were updated to version 13.1.2, and this morning Apple released the first developer beta of iOS 13.2. One of the most important features of this update is "Deep Fusion", which Apple is bringing to the iPhone 11 series.
What kind of technology is it?
Deep Fusion is a new AI-based photo-processing technology that combines neural networks with the A13 processor. Machine learning processes a photo pixel by pixel, optimizing texture, detail, and noise separately for each part of the image. This makes it possible to produce high-quality photographs indoors and in medium light.
When Deep Fusion is turned on (this happens automatically), iPhone 11-series smartphones take five photos:
- three photographs with normal exposure;
- one photo with emphasis on sharpness;
- one photo with a long exposure (with emphasis on dark colors).
After that, neural networks are activated: out of the five photos, they automatically select the appropriate composition according to the type of subject, and then optimize and adjust the photo's structure and color tone.
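Apple has not published how Deep Fusion's networks merge the frames, but the general idea of multi-frame fusion can be illustrated with a toy sketch. The code below (an assumption for illustration, not Apple's algorithm) blends five noisy "exposures" of the same scene pixel by pixel, weighting each frame by a simple local-detail measure so that the sharpest frame dominates in each region:

```python
import numpy as np

def local_detail(img, eps=1e-6):
    # Approximate per-pixel detail with the magnitude of a Laplacian:
    # pixels on strong edges/texture get higher weight.
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
           np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return np.abs(lap) + eps

def fuse(frames):
    """Per-pixel weighted blend: each frame contributes most where it is sharpest."""
    weights = np.stack([local_detail(f) for f in frames])
    weights /= weights.sum(axis=0, keepdims=True)  # normalize so weights sum to 1
    return (weights * np.stack(frames)).sum(axis=0)

# Toy example: five synthetic grayscale "exposures" of one scene with sensor noise.
rng = np.random.default_rng(0)
scene = rng.random((32, 32))
frames = [np.clip(scene + rng.normal(0, 0.05, scene.shape), 0, 1)
          for _ in range(5)]
result = fuse(frames)
print(result.shape)  # (32, 32)
```

A real pipeline would also align the frames and use learned, content-aware weights rather than a fixed Laplacian heuristic; this sketch only shows the per-pixel selection idea described above.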
Comparison photos taken with the iPhone XS and the iPhone 11 were published by blogger Tyler Stalman: