In the last few days, a small controversy has taken over the Apple-centric world: comparing selfies taken with the iPhone X to those from the iPhones XS/XS Max, it was speculated that Apple had included in the new devices a kind of “beautification mode” – something that Samsung and others include (as an option, it’s worth noting) in their devices and that, quite frankly, makes everyone’s face look like a wax doll.
Passionate reactions followed. On one side, people who see nothing wrong with Apple embellishing our imperfect human faces. On the other… “What is Apple thinking?”, “Won’t the company even give us the option to turn the filter off?”, “Will I be mistaken for a Samsung user on the internet? HELP!” and so on.
It turns out that, according to an analysis by Sebastiaan de With, photography specialist and developer of the app Halide, the novelty has nothing to do with some filter quietly added by Apple; the answer to the controversy lies in the technology behind the new iPhones’ cameras.
The developer’s article has a number of technical details and is worth reading for anyone interested in the subject, but, in basic terms, Apple has changed the way its software captures images – the camera hardware in the new iPhones has hardly changed compared to the previous generation, but the photos themselves can change drastically, precisely because machine learning techniques now have a much greater influence.
One of the main camera-related new features of the iPhones XS/XS Max is Smart HDR, which quickly takes several photos of the same scene (at different exposures) and, with a little help from artificial intelligence, combines them all into an image with improved color accuracy and exposure. The catch is that, in this process, the iPhone needs to significantly raise the ISO (the sensor’s sensitivity to light) to compensate for the shorter shutter speeds required to capture so many frames so quickly; as is well known in the world of photography, a higher ISO means more noise.
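The bracketing-and-merge idea behind Smart HDR can be sketched in a few lines. This is a toy exposure-fusion merge, not Apple’s actual (undocumented, learned) pipeline: the function name, the mid-gray weighting, and the synthetic “brackets” are all assumptions for illustration.

```python
import numpy as np

def merge_exposures(frames):
    """Toy exposure-fusion merge: weight each frame's pixels by how
    well exposed they are (closest to mid-gray), then average.
    A real Smart HDR pipeline also aligns frames and uses learned
    weights; this only illustrates the bracketing idea."""
    frames = np.asarray(frames, dtype=np.float64)  # (n, h, w), values in [0, 1]
    # Well-exposedness weight: Gaussian centered on 0.5 (mid-gray).
    weights = np.exp(-((frames - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights += 1e-8  # avoid division by zero
    return (weights * frames).sum(axis=0) / weights.sum(axis=0)

# Three synthetic "brackets" of the same 2x2 scene: under-, mid-, over-exposed.
scene = np.array([[0.1, 0.4], [0.6, 0.9]])
brackets = [np.clip(scene * gain, 0.0, 1.0) for gain in (0.5, 1.0, 2.0)]
merged = merge_exposures(brackets)
```

Each merged pixel ends up between its darkest and brightest bracketed value, favoring whichever frame exposed that region best – which is why the combined image keeps both shadow and highlight detail.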
This behavior is even more evident on the new iPhones’ front cameras, which have even smaller sensors and are therefore more susceptible to noise as the ISO rises. As a result, Apple’s noise reduction ends up making the photos a bit “smudged”, with an apparent loss of detail.
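The noise-versus-detail trade-off is easy to see with the simplest possible denoiser. This moving-average filter is only a stand-in for whatever Apple actually uses (real denoisers are far more sophisticated), but it shows why suppressing noise also softens edges:

```python
import numpy as np

def box_blur_1d(signal, radius=1):
    """Toy noise reduction: a moving-average (box) filter. It lowers
    noise but also smears fine detail -- the 'waxy' trade-off the
    text describes."""
    signal = np.asarray(signal, dtype=np.float64)
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(signal, kernel, mode="same")

# A sharp edge: after blurring, the transition is no longer abrupt.
edge = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
blurred = box_blur_1d(edge)
```

The sharp 0-to-1 step becomes a gradual ramp (the sample just before the edge rises from 0.0 to about 0.33) – exactly the kind of softening that, applied to skin texture, reads as “beautification”.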
It gets even more complicated, according to de With, when the iPhone captures images in RAW format – the new devices tend to blow out (overexpose) the photo, forcing the user to dial down the exposure before the shot. To work around these problems, the developer is also announcing a new Halide feature called Smart RAW, which will eliminate the effects of Smart HDR for more balanced and accurate photos.
Summing up the story of the iPhone XS/XS Max: in order not to deliver a “grainy” image, Apple smooths out its photos and brightens darker areas while darkening lighter ones, giving shots a more homogeneous look – a look that turns out more artificial on the less capable front camera. Hence the supposed beautifying effect on our selfies. “It’s not a bug, it’s a feature,” some would say.
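That “brighten the shadows, darken the highlights” step is a form of tone compression, and a minimal version fits in one line. The function and its `strength` knob are hypothetical, purely to illustrate the homogenizing effect described above (real tone mapping is local and content-aware):

```python
import numpy as np

def tone_compress(img, strength=0.5):
    """Toy global tone compression: pull every pixel toward mid-gray.
    Dark pixels get brighter, bright pixels get darker, and overall
    contrast drops -- the 'more homogeneous' look described above.
    `strength` (0 = no change, 1 = flat gray) is a made-up knob."""
    img = np.asarray(img, dtype=np.float64)  # values in [0, 1]
    return 0.5 + (img - 0.5) * (1.0 - strength)

pixels = np.array([0.1, 0.9])      # one dark pixel, one bright pixel
flattened = tone_compress(pixels)  # -> [0.3, 0.7]: both nudged toward mid-gray
```

Applied to a face, this kind of compression evens out shadows under the eyes and bright spots on the skin – which is why the result can read as an intentional “beauty filter” even when it is just dynamic-range management.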
I do, however, disagree. It’s fine that Apple is applying tons of machine learning to its cameras, and the technology behind the software is almost palpable; but none of it makes sense if the photos the devices produce are worse than before – and, in this writer’s opinion (based on our own article comparing iPhone X and XS selfies), yes, the new devices’ front camera delivers worse, distinctly artificial results when photographing people.
We hope, therefore, that Apple will at least tune its software over the next versions of iOS 12 – not necessarily by giving us an option to turn the feature on or off, but by making the whole thing smarter and less aggressive toward the imperfections of our faces (which, after all, are also part of us). It’s better for everyone, isn’t it?