Among the (few) new features brought by the new iPads Pro, the rear camera system was one of the most publicized, especially the LiDAR scanner included in the module, which captures 3D information from the environment to improve augmented reality applications (among other uses).
The fact is that, until now, beyond its basic operating principles we didn't know much about the LiDAR scanner. That's where Sebastiaan de With comes in: a member of the team behind the popular camera app Halide, he has already shared several other in-depth analyses of Apple hardware and software.
His article offers a complete analysis of the new iPad Pro's camera system, including a good look at the tablet's "common" lenses (wide-angle and ultra-wide), which we already know well from the iPhones 11 and 11 Pro (Max). The most interesting part, however, is the LiDAR scanner, about which de With uncovered some very interesting details.
For starters, yes, the scanner is based on technology that, on paper, is very similar to what Apple uses in its TrueDepth front camera (the one responsible for Face ID and front-facing Portrait Mode): a component emits points of infrared (and therefore invisible) light and measures the time each beam takes to bounce off a surface and return, building a three-dimensional map of the surroundings.
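The time-of-flight principle described above boils down to simple arithmetic: since the speed of light is known, the round-trip time of each infrared pulse gives the distance to the surface it hit. A minimal sketch (purely illustrative, not Apple's implementation):

```python
# Illustration of the time-of-flight principle behind a LiDAR scanner:
# distance = (speed of light × round-trip time) / 2
# The division by 2 accounts for the pulse traveling out AND back.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to a surface, given the round-trip time of a light pulse."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after roughly 33 nanoseconds hit a surface about 5 m away,
# well within the room-scale range attributed to the iPad Pro's scanner.
print(tof_distance(33.356e-9))
```

The tiny times involved (tens of nanoseconds for room-scale distances) hint at why this requires dedicated hardware rather than an ordinary camera sensor.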
There is, however, a fundamental difference between the two: the LiDAR scanner is designed to scan entire rooms, with a much greater range for its light beams, while the TrueDepth camera, although it captures far more detail, can only map objects a few tens of centimeters away.
iFixit ran an experiment comparing the two technologies. Note how the light points of the LiDAR sensor are much more widely spaced than those of Face ID:
That's why, contrary to some expectations, the rear cameras of the iPad Pro do not support Portrait Mode or Portrait Lighting: the scanner simply doesn't have enough resolution to map a face or a nearby object. On the other hand, according to de With, Apple could implement the mode on the new iPad Pro's rear cameras simply by comparing the spatial information captured by its two common lenses.
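The idea of deriving depth by comparing two lenses is classic stereo vision: the same point appears shifted (the "disparity") between the two cameras, and the closer the point, the larger the shift. A minimal sketch of the pinhole stereo model, with illustrative numbers (the focal length and lens spacing below are assumptions, not the iPad Pro's actual specifications):

```python
# Sketch of depth-from-disparity, the principle behind estimating depth by
# comparing two side-by-side lenses. All numeric values are hypothetical.
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo model: depth = focal length × baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (point seen by both lenses)")
    return focal_px * baseline_m / disparity_px

# With an assumed 1500 px focal length and lenses 12 mm apart,
# a 30 px disparity corresponds to a point 0.6 m from the camera.
print(stereo_depth(focal_px=1500.0, baseline_m=0.012, disparity_px=30.0))
```

Note how a larger disparity yields a smaller depth, which is why this approach works best for nearby subjects, exactly the regime where Portrait Mode operates.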
The developer also shared a proof of concept his team is working on: a new app called Esper, which captures entire environments so they can be reproduced later in 3D:
It is still not clear how Apple intends to take full advantage of the LiDAR scanner. We already have good uses of it in iOS's Measure app, but nothing, so far, that leaves people gaping or shows the tool's real value. Will that ever come? We'll have to wait a little longer to find out.