In recent days, a small polemic has taken over the Apple-centric world: comparing selfies taken with the iPhone X against ones taken with the iPhone XS/XS Max, people speculated that Apple had quietly included in the new handsets a kind of "beautification mode", something Samsung and others ship (as an option, it should be noted) in their phones and that, quite frankly, makes everyone look like a wax figure.
Passionate reactions followed. On one side, people who saw nothing wrong with Apple beautifying our humanly imperfect faces; on the other, "Where is Apple's head at?", "The company won't even give us an option to turn the filter off?", "Will I be mistaken for a Samsung user on the internet? HELP!", and so on.
It turns out, according to an analysis by Sebastiaan de With, photography expert and developer of the camera app Halide, that the fuss has nothing to do with some filter Apple slipped in silently; the answer to the polemic lies in the very technology behind the cameras of the new iPhones.
The developer's article is full of technical details and is worth reading for anyone interested in the subject but, in essence, Apple has changed the way its software processes what the camera hardware on the new iPhones captures. The hardware itself has essentially not changed from the previous generation, yet the photos can change drastically, precisely because machine learning techniques now have a much greater influence on them.
One of the key camera innovations of the iPhone XS/XS Max is Smart HDR, which quickly takes several photos of the same scene (at different exposures) and, with a little help from artificial intelligence, combines them all into a single image with enhanced exposure and accurate color. The catch is that, in this process, the iPhone needs to raise the ISO significantly (that is, the sensor's sensitivity to light, which allows shorter exposure times) to capture more frames faster; and, as is well known in the world of photography, higher ISO means more noise.
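The merging step described above can be sketched very roughly. This is a toy illustration of exposure fusion, not Apple's actual Smart HDR pipeline (which is proprietary); the function name and the mid-gray weighting are my own assumptions, just to show how several exposures of one scene can combine into a single, better-exposed image:

```python
import numpy as np

def exposure_fusion(frames):
    """Toy HDR-style merge: weight each frame's pixels by how well
    exposed they are (close to mid-gray), then take the weighted
    average. A simplified sketch, not Apple's Smart HDR."""
    frames = [np.asarray(f, dtype=np.float64) for f in frames]
    # Pixels near 0.5 (mid-gray) get high weight; blown-out or
    # crushed pixels get low weight.
    weights = [np.exp(-((f - 0.5) ** 2) / (2 * 0.2 ** 2)) for f in frames]
    total = np.sum(weights, axis=0) + 1e-12
    return np.sum([w * f for w, f in zip(weights, frames)], axis=0) / total

# Three simulated exposures of the same scene (pixel values in 0..1):
under = np.array([[0.05, 0.10], [0.02, 0.40]])  # dark frame keeps highlight detail
mid   = np.array([[0.30, 0.55], [0.15, 0.95]])
over  = np.array([[0.70, 0.98], [0.45, 1.00]])  # bright frame keeps shadow detail
result = exposure_fusion([under, mid, over])
```

In the fused result, each pixel leans toward whichever frame exposed it best, which is the basic idea behind combining a burst of bracketed shots.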
This behavior of the new iPhones is most evident on their front-facing cameras, which have even smaller sensors and are therefore even more susceptible to noise when the ISO rises. As a result, Apple's noise reduction work leaves the photos somewhat soft, with a visible loss of detail.
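The trade-off described here, that smoothing away grain also smooths away real detail, can be shown with the simplest possible noise-reduction filter. This is a toy model (a naive box blur), not the sophisticated denoising Apple actually uses:

```python
import numpy as np

def box_blur(img, k=3):
    """Naive k-by-k box filter, the simplest form of noise reduction.
    Averaging neighbors lowers random grain, but it softens genuine
    detail too; a toy model of the trade-off, not Apple's pipeline."""
    pad = k // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img, dtype=np.float64)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + k, j:j + k].mean()
    return out

rng = np.random.default_rng(0)
flat = np.full((32, 32), 0.5)                     # a perfectly uniform patch
noisy = flat + rng.normal(0, 0.1, flat.shape)     # high-ISO-style grain
smoothed = box_blur(noisy)
print(noisy.std(), smoothed.std())                # grain drops after smoothing
```

Run on a uniform patch, the blur clearly reduces the grain; run on a detailed face, the same averaging would erase fine texture, which is exactly the "wax figure" effect people noticed.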
It gets even more complicated: according to de With, when capturing RAW images the new devices tend to blow out the photo, forcing the user to dial down the exposure before the shot. To work around all this, the developer has announced a new Halide feature called Smart RAW, which eliminates the effects of Smart HDR for more balanced and accurate shots.
To sum up the iPhone XS/XS Max story: in order not to deliver a grainy image, Apple is smoothing out your photos and raising the exposure of darker areas while lowering that of lighter ones, producing more homogeneous results. Those results look even more artificial on the front camera, whose sensor is less capable. And that is why our selfies have this supposed beautifying effect. "It's not a bug, it's a feature," some would say.
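The "raise the shadows, lower the highlights" behavior summarized above amounts to compressing the tonal range toward the middle. A minimal sketch of that idea, assuming a simple linear pull toward mid-gray (the function name and curve are illustrative, not Apple's actual tone mapping):

```python
import numpy as np

def tone_compress(img, strength=0.4):
    """Toy tone curve: pull every pixel toward mid-gray (0.5), so
    shadows rise and highlights fall. An illustration of the
    homogenizing effect described, not Apple's real curve."""
    img = np.asarray(img, dtype=np.float64)
    return 0.5 + (img - 0.5) * (1 - strength)

shadows = np.array([0.05, 0.10])      # dark pixel values (0..1 scale)
highlights = np.array([0.90, 0.98])   # bright pixel values
print(tone_compress(shadows))         # darker areas come out brighter
print(tone_compress(highlights))      # lighter areas come out darker
```

Pushing both ends toward the middle evens out the image, which is why skin looks flatter and more uniform, read: artificially "beautified", after this kind of processing.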
I, however, disagree. It is fine that Apple is applying tons of machine learning to its cameras, and the technology behind the software is almost palpable; but none of it makes sense if the photos the handsets produce are worse than before. And, in this writer's opinion (based on our own article comparing iPhone X and XS selfies), yes: the front camera of the new devices delivers worse, distinctly artificial results when shooting people.
We hope, therefore, that Apple will at least tweak its software over the next versions of iOS 12, not necessarily giving us an option to turn the processing on or off, but making the whole thing smarter and less aggressive toward the imperfections of our faces (which, after all, are also part of us). Better for everyone, no?