As we have noted several times, the TrueDepth system (present on the iPhone X, XS, XS Max, and XR, as well as the new iPad Pro) has opened up a world of possibilities not only for Apple itself, but also for the many developers who take advantage of the technology built into the devices' front-facing camera and sensors for some very interesting uses.
Along those lines, a new app called AirSynth takes advantage of Face ID's camera and depth sensor system to make sounds. The user simply moves their hands closer to or farther from the iPhone or iPad Pro screen to control the synthesizer (something like a pseudo-theremin).
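The theremin-like control scheme boils down to mapping the hand's distance from the sensor to a pitch. The sketch below is purely illustrative, not AirSynth's actual code; the function name, distance range, and frequency range are all assumptions made for the example.

```python
def distance_to_frequency(distance_m, near=0.1, far=0.5,
                          low_hz=220.0, high_hz=880.0):
    """Map a hand distance in [near, far] meters to a pitch in Hz.

    Closer hands produce higher pitches, like a theremin's pitch antenna.
    """
    # Clamp the reading to the usable range of the depth sensor.
    d = max(near, min(far, distance_m))
    # Normalize: 0.0 when the hand is far, 1.0 when it is near.
    t = (far - d) / (far - near)
    # Interpolate exponentially so equal hand movements correspond to
    # equal musical intervals (here spanning two octaves, A3 to A5).
    return low_hz * (high_hz / low_hz) ** t

print(distance_to_frequency(0.5))  # hand far away -> 220.0 Hz
print(distance_to_frequency(0.1))  # hand close    -> 880.0 Hz
```

The exponential interpolation is the key design choice: a linear mapping would make the upper octave feel compressed, since pitch perception is logarithmic.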
The software, created by New Zealand independent developer David Wood (of roboheadz), is a brief demonstration of the kind of data developers can extract from Apple's hardware. There are, of course, other apps of this style in the App Store, but they use the ordinary (2D) camera to "guess" how close the user's hands are to the screen.
AirSynth has a clean, straightforward interface that lets you configure the app's look and the style of sound you want to produce, with six models to choose from. And just as Face ID works perfectly in the dark, the app has no trouble tracking your hands in dimly lit places.
Not everything is perfect, though: unfortunately, the app has no built-in recording or editing controls, meaning you can't save that cool sound you made on your device (unless, perhaps, you use iOS's native screen recording feature).
Notably, GarageBand (Apple's music production app) for iOS already has a control that leverages the depth-sensing features of the TrueDepth system for sound creation, something similar to what AirSynth does. It seems likely that even more music apps will embrace this functionality in the future.