First iOS 13.2 Beta Includes Deep Fusion for the iPhone 11, 11 Pro, and 11 Pro Max [updated 5x: what's new]

There is no doubt that photographic power is the big draw of the iPhone 11 and 11 Pro (Max). Apple bet heavily on the new cameras, and the bet seems to have paid off: the leap over the iPhone XR and XS (Max) was so noticeable that several reviewers made the rare claim that, for those who take photography seriously, the upgrade is worth it even coming from a 2018 handset (whether our own review will corroborate that opinion, well, wait and see).

It is worth noting, however, that an important piece of the new iPhones' photographic power is not yet available: Deep Fusion, a technology created by Apple that makes photos taken by the handsets even better and more faithful to reality, and which Phil Schiller described during the keynote as “mad science”.

Example of a photo taken with Deep Fusion (click/tap to enlarge).

Until now, there was no sign of when Deep Fusion would arrive; Apple had only promised the feature for “later this year”, in a software update. But The Verge brings good news: the technology is already built into the first beta of iOS 13.2 (build 17B5059g), released to developers a little while ago. Deep Fusion, incidentally, has everything it takes to be the headline feature of this release.

But what exactly is Deep Fusion? Apple has already explained in basic terms how the technology works, but some questions about its operation remain.

First of all, it is important to make clear that Deep Fusion is exclusive to the iPhone 11 and 11 Pro (Max), because it depends on features of the A13 Bionic chip (and of its updated Neural Engine) that are only present in those devices. Also, the feature is not a specific mode that you can manually enable or disable; it simply works silently, kicking in whenever the device deems it necessary.

Example of a photo taken with Deep Fusion (click to enlarge).

More specifically, Deep Fusion “lives” alongside Smart HDR (which starts from the same premise, but is less advanced) and Night Mode; which of them gets activated depends on the amount of light in the scene and the lens in use. The ultra wide-angle camera, for example, always uses Smart HDR, since it does not support the other two technologies.
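Just to make that selection logic concrete, here is a purely illustrative toy sketch in Swift; the enum names, function, and lux thresholds are all made up, since the real decision is internal to iOS and not exposed to developers.

```swift
// Illustrative only: a toy decision function mirroring the behavior described above.
// The actual selection logic (and any thresholds) is internal to iOS and not public.

enum Camera { case ultraWide, wide, telephoto }
enum Processing { case smartHDR, deepFusion, nightMode }

func processing(for camera: Camera, sceneLux: Double) -> Processing {
    // The ultra wide-angle camera only supports Smart HDR.
    guard camera != .ultraWide else { return .smartHDR }
    switch sceneLux {
    case ..<10:    return .nightMode   // very dark scenes (threshold is invented)
    case ..<1_000: return .deepFusion  // dim to medium indoor light (threshold is invented)
    default:       return .smartHDR    // bright scenes
    }
}

print(processing(for: .wide, sceneLux: 250))      // deepFusion
print(processing(for: .ultraWide, sceneLux: 250)) // smartHDR
```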

The Verge also put together a walkthrough of what happens on your iPhone when you take a picture and Deep Fusion is active; it goes like this (a rough illustrative sketch in code follows the list):

  1. Before you press the shutter button, the iPhone is already capturing frames continuously; when you shoot, the device keeps the most recent of those frames and captures four extra images: three standard exposures and one long exposure for detail.
  2. The four extra images are combined into a single one, called a “synthetic long”.
  3. Deep Fusion picks the short-exposure image with the most detail and merges it with the synthetic long. This processing can remove imperfections, such as noise, more accurately than Smart HDR.
  4. The images then go through four detail-processing steps, pixel by pixel, each one analyzed and adjusted to enhance its characteristics. With this, the system knows how to combine the two images, taking detail from one and tone, color, and luminance from the other.
  5. The final image is generated.
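Below is a purely illustrative Swift sketch of that pipeline. None of the types, weights, or steps come from Apple's APIs; the real processing happens inside the camera pipeline and the A13's Neural Engine, so treat this only as a way to visualize the steps above.

```swift
// Toy model of the Deep Fusion pipeline described above (illustrative only).

struct Frame {
    var exposure: Double   // relative exposure time
    var pixels: [Double]   // stand-in for actual image data
}

// Steps 1–2: merge the three standard frames and the long exposure into a
// single "synthetic long". A plain average stands in for Apple's real math.
func makeSyntheticLong(standard: [Frame], longExposure: Frame) -> Frame {
    let all = standard + [longExposure]
    let count = Double(all.count)
    let mergedPixels = (0..<longExposure.pixels.count).map { i in
        all.reduce(0.0) { $0 + $1.pixels[i] } / count
    }
    let totalExposure = all.reduce(0.0) { $0 + $1.exposure }
    return Frame(exposure: totalExposure, pixels: mergedPixels)
}

// Steps 3–4: blend the sharpest short-exposure frame with the synthetic long,
// taking "detail" from one and "tone" from the other. A fixed weight stands in
// for the per-pixel, machine-learning-driven decision Apple describes.
func fuse(detail: Frame, tone: Frame, detailWeight: Double = 0.7) -> Frame {
    let fusedPixels = zip(detail.pixels, tone.pixels).map { pair in
        detailWeight * pair.0 + (1 - detailWeight) * pair.1
    }
    return Frame(exposure: tone.exposure, pixels: fusedPixels)
}

// Step 5: toy usage with made-up pixel values.
let buffered = (0..<4).map { _ in Frame(exposure: 0.01, pixels: [0.40, 0.50, 0.60]) }
let standard = (0..<3).map { _ in Frame(exposure: 0.03, pixels: [0.50, 0.60, 0.70]) }
let longShot = Frame(exposure: 0.30, pixels: [0.55, 0.65, 0.75])

let syntheticLong = makeSyntheticLong(standard: standard, longExposure: longShot)
let sharpest = buffered[0]  // pretend this is the buffered frame with the most detail
let finalImage = fuse(detail: sharpest, tone: syntheticLong)
print(finalImage.pixels)
```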

The whole process takes about half a second, which means that if you take a photo and immediately tap the thumbnail to view it, the iPhone will display a “simulated” version of the image during that interval, until the photo has been processed by Deep Fusion and can be displayed.

It is worth noting that Deep Fusion is not compatible with burst mode, since it requires individual processing of each image. Another important point: the technology will not kick in when the “Photos Capture Outside the Frame” option is turned on, because that feature uses the ultra wide-angle camera to capture “extra” information beyond the visible field, which is shown when you edit the photo in question.

Telephoto Limitations

As you may remember, ever since the first iPhones with a second (telephoto) lens, iOS has detected when there is too little light in the scene and automatically used the main wide-angle camera (which has a larger aperture) instead, applying a digital crop to the image.

Well, as John Gruber pointed out on Daring Fireball, the same thing happens when we use Night Mode or the upcoming Deep Fusion. That is, you can opt for the telephoto (2x) in both modes, but the lens actually used to capture the image may end up being the wide-angle.
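For context, this automatic switching is also visible to developers in AVFoundation, where the wide and telephoto cameras are exposed as a single “virtual” device whose physical constituents the system picks on its own. A minimal sketch, assuming an iOS 13 device with a rear dual camera:

```swift
import AVFoundation

// Inspecting the rear dual camera, which AVFoundation exposes as a "virtual device".
// The system (not the app) decides which physical camera actually captures the frame,
// which is why a "2x" shot in low light can come from the wide-angle camera plus a crop.
if let dualCamera = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back) {
    print("Is virtual device:", dualCamera.isVirtualDevice)
    print("Constituent cameras:", dualCamera.constituentDevices.map { $0.localizedName })
    // Zoom factors at which the virtual device may switch between its physical cameras.
    print("Switch-over zoom factors:", dualCamera.virtualDeviceSwitchOverVideoZoomFactors)
}
```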

The fact is that Deep Fusion is on its way and, under various conditions, should improve image processing on the iPhones; it remains to be seen how all of this translates into real-world benefits. Let's wait and see.

Siri and messaging apps

Earlier today, we also covered a change Apple is planning for Siri, easing its interaction with third-party messaging/calling apps.

Well, according to Bloomberg's Mark Gurman, that change is also already present in the iOS 13.2 beta.

Other betas released today

In addition to iOS 13.2, Apple also made available to developers today the iPadOS 13.2 beta (build 17B5059g), watchOS 6.1 beta 2 (175050e), the tvOS 13.2 beta (17K5059d), and the Xcode 11.2 beta (11B41).

If any relevant news about them comes up, we will update this post.

Update, by Rafael Fischmann, 10/02/2019 at 15:12

And at least one other thing Apple had previously promised has turned up in the iOS 13.2 beta: message announcements on AirPods.

As 9to5Mac reports, users will be able to hear and reply to messages completely hands-free, without having to invoke Siri manually. In addition, developers will be able to incorporate support for the feature into their own messaging apps.

At least for now, the feature will work only with AirPods.
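For third-party apps, that support presumably builds on SiriKit's existing messaging intents. Below is a minimal, hypothetical sketch of an Intents-extension handler; INSendMessageIntent and its handler protocol are real SiriKit APIs, but the class name and the commented-out service call are placeholders.

```swift
import Intents

// Hypothetical Intents-extension handler for a messaging app (illustrative).
final class SendMessageIntentHandler: NSObject, INSendMessageIntentHandling {

    // Make sure Siri knows who the message is for.
    func resolveRecipients(for intent: INSendMessageIntent,
                           with completion: @escaping ([INSendMessageRecipientResolutionResult]) -> Void) {
        guard let recipients = intent.recipients, !recipients.isEmpty else {
            completion([.needsValue()])
            return
        }
        completion(recipients.map { .success(with: $0) })
    }

    // Make sure there is actual message text to send.
    func resolveContent(for intent: INSendMessageIntent,
                        with completion: @escaping (INStringResolutionResult) -> Void) {
        if let text = intent.content, !text.isEmpty {
            completion(.success(with: text))
        } else {
            completion(.needsValue())
        }
    }

    // Perform the send and report success back to Siri.
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Hand the message off to the app's own messaging service (placeholder).
        // MyMessagingService.send(intent.content, to: intent.recipients)
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```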

Update II, by Rafael Fischmann, 10/02/2019 at 15:24

One more new feature spotted: support for Handoff to the HomePod. It's not working yet, though, as it certainly also requires an update to the HomePod's own software.

Update III, by Rafael Fischmann, 10/02/2019 at 15:43

The iOS 13.2 beta also fixes an annoying HomeKit issue by grouping related accessories into a single “stack”:

Update IV, by Rafael Fischmann, 10/02/2019 at 17:31

And the first tests of Deep Fusion have started popping up:

In my first tests, Deep Fusion offers very modest sharpness gains (and much larger files; my HEICs were ~2x larger).

These other tests don't strike me as entirely fair, because they compare photos from an iPhone 11 (with Deep Fusion) to photos from an iPhone XR; that is, even without the new feature, the iPhone 11's photos would tend to come out better anyway.

Even so:

First #DeepFusion tests on the #iPhone 11
And in another relevant test, you can see that the 11 is noticeably sharper than the XR even before Deep Fusion is applied.

Update V, by Rafael Fischmann, 10/02/2019 at 18:28

Here is the tone the AirPods emit when a message arrives: