Developer shares more details about LiDAR scanner on new iPad Pro

Among the (few) new features of the new iPad Pro, its redesigned rear camera system was widely publicized as one of the flagship additions — particularly the LiDAR scanner included in the module, responsible for capturing 3D information from the environment to improve augmented reality applications (among other purposes).

The fact is that, until now, we didn't know much about the LiDAR scanner beyond its basic operating principles. That is exactly where Sebastiaan de With comes in: a member of the team behind the popular camera app Halide, he has already shared several other in-depth analyses of Apple's hardware and software.

His article offers a complete analysis of the new iPad Pro's camera system, including a good look at the "common" lenses (wide-angle and ultra-wide) of the tablets, which are already well known from the iPhones 11 and 11 Pro (Max). The main point of interest here, however, is the LiDAR scanner, about which de With uncovered some very interesting details.

For starters, yes, the scanner is based on a technology that, on paper, is very similar to the one Apple uses in its TrueDepth front camera system (the one responsible for Face ID and front-facing Portrait Mode): a component emits points of infrared (and therefore invisible) light and measures the time it takes for those beams to bounce off surfaces and return, building a three-dimensional map of the surroundings.
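The time-of-flight principle described above boils down to simple arithmetic: since the measured time covers the round trip of the light pulse, the distance to a surface is half the round-trip time multiplied by the speed of light. A minimal sketch (the function name and the example timing are illustrative, not from Apple's implementation):

```python
SPEED_OF_LIGHT = 299_792_458  # meters per second, in vacuum

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a surface from the round-trip time of a light pulse.

    The pulse travels to the surface and back, so the one-way
    distance is half the total path: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A surface ~5 m away returns the pulse in roughly 33.4 nanoseconds:
print(distance_from_round_trip(33.356e-9))  # ≈ 5.0 meters
```

The nanosecond scale of these intervals is why depth sensors of this kind need dedicated, extremely fast timing hardware rather than an ordinary camera sensor.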

There is, however, a fundamental difference between the LiDAR scanner and the TrueDepth camera: the former is designed to scan entire rooms, with a much greater range for its light beams, while the latter, although offering a much higher level of detail, can only capture information a few tens of centimeters away.

iFixit ran an experiment comparing the two technologies. Note how the light points of the LiDAR scanner are much more widely spaced than those of Face ID: