One of the features of iOS / iPadOS 13 most celebrated by advanced users is the system's support for pointing devices such as a mouse or trackpad. We have already covered the integration in detail in this article. The idea is that, with a more precise control accessory, the iPad potentially becomes better suited to professional use, coming even closer to the Mac / PC as a work tool.
Apple, however, sees things differently: mouse support, as the company has repeatedly reiterated, is an accessibility feature designed so that people with limited mobility can use their iPad (or iPhone) more easily and accurately. It is not even enabled by default and has to be turned on in very specific menus within the iOS / iPadOS 13 settings.
In a conversation with TechCrunch about the accessibility features of Apple's new systems, the company's director of global accessibility policy and initiatives, Sarah Herrlinger, reinforced this view: the iPad and iPhone will always be touch-controlled devices, and that will never change; mouse control is meant for only a portion of their users.
Herrlinger said she was aware that users without mobility difficulties would eventually adopt the feature as well, just as other Apple accessibility tools (such as "Type to Siri" or the Magnifier) are used by people who, in theory, do not depend on them to operate their devices. She stressed, however, that mouse support was conceived from the beginning (its design started a few years ago) as a resource for users with motor difficulties, not as an expansion of the system's capabilities.
In other words, just because iOS / iPadOS 13 can be controlled by a mouse does not mean we will begin to see more complex applications with long columns of tiny buttons inaccessible to touch, nor apps in the App Store carrying a "works best with a mouse" warning. Touch remains the primary input on mobile devices, and it will stay that way for the foreseeable future. Nothing could be fairer, right?
Voice Control
Herrlinger also talked about the new Voice Control feature coming to iOS / iPadOS 13 and macOS Catalina 10.15, giving a few more details on how it works.
In a short demonstration the executive gave to journalists, the tool proved to be very powerful and functional, interpreting and executing commands correctly even in a noisy environment. Voice Control accepts simple natural-language commands, such as "open Mail" or "send Happy Birthday to Mary on iMessage" (support for additional languages will be added later), but it goes beyond that: you can basically control the entire device with speech alone.
Some examples: on a Safari bookmarks screen with several pages whose titles are long or unpronounceable, you can say "show numbers" so that the feature assigns a number to each option displayed on the screen; from there, just say "click 21", for example. You can also ask the tool to display a grid over the screen so that you can perform precise click / tap, drag, or zoom commands.
According to Herrlinger, Voice Control will be constantly improved and is widely customizable by the user, so that even people with severe motor disabilities can control their devices.
Amazing, isn't it?