WWDC19 should bring news on Siri, augmented reality and more!

WWDC19 wallpapers on an iMac

Last week, we covered several features expected to show up in iOS 13 and macOS 10.15 – all reported by the Brazilian developer Guilherme Rambo at 9to5Mac.

Well, he is back today with more news, this time focusing on what Apple should present to developers at WWDC19.

Siri

Developers will get more options for integrating with Siri, including media playback, search, voice calling, event tickets, message attachments, and train, flight, airport gate, and seat information.
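For a sense of what this kind of integration looks like in code, here is a minimal sketch of a media playback handler. `INPlayMediaIntent` already exists in SiriKit; the rumor is essentially that more intent types like it will open up to third parties, so the handler body below is illustrative rather than confirmed behavior:

```swift
import Intents

// Minimal sketch of a SiriKit media playback handler.
// INPlayMediaIntent already exists; the expanded domains in the rumor
// would presumably follow this same intent/response pattern.
class MediaIntentHandler: NSObject, INPlayMediaIntentHandling {

    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // .handleInApp asks the system to launch the app in the
        // background so its own player can start the requested media.
        completion(INPlayMediaIntentResponse(code: .handleInApp,
                                             userActivity: nil))
    }
}
```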

Marzipan

Those who choose to port their iOS apps to macOS using UIKit will have access to new APIs that let applications integrate with Mac-specific features, such as the Touch Bar and the menu bar (including keyboard shortcuts). UIKit apps on the Mac will also be able to open multiple windows.
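Multi-window support would presumably require UIKit apps to describe each window's lifecycle separately. Purely as a speculative sketch (none of these names are confirmed ahead of WWDC), a scene-style API could look like this:

```swift
import UIKit

// Speculative sketch: one delegate per window "scene".
// Every name here is an assumption, not announced API.
class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene,
               willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        // Each new window gets its own UIWindow and root controller.
        let window = UIWindow(windowScene: windowScene)
        window.rootViewController = UIViewController()
        self.window = window
        window.makeKeyAndVisible()
    }
}

// Elsewhere in the app, a second window could then be requested:
// UIApplication.shared.requestSceneSessionActivation(nil,
//     userActivity: nil, options: nil, errorHandler: nil)
```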

Applications that support Split View on iOS will be resizable on the Mac by dragging the divider, and their split position can be reset by double-clicking the divider, just like in native macOS apps.

Best of all, making an iOS app compatible with macOS will be as simple as checking a box in the Xcode project settings.

Augmented reality

Here we will have significant news, according to Rambo, including a 100% Swift framework for AR and a companion app that will let developers create visual AR experiences. ARKit will also be able to detect human poses and, for game developers, the operating system will support touch controls and stereo headphones.
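If pose detection arrives as a new ARKit session configuration, using it could look something like the sketch below; the configuration and anchor names are our guesses, not announced API:

```swift
import ARKit

// Speculative sketch of human pose detection in ARKit.
// ARBodyTrackingConfiguration / ARBodyAnchor are assumed names.
class BodyTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            // A tracked body would expose joint transforms for the skeleton.
            print("Tracking \(body.skeleton.jointModelTransforms.count) joints")
        }
    }
}
```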

A new framework will give developers more control over the haptic motor (the Taptic Engine), which currently offers third-party developers only a very small set of feedback styles. We will also see news related to link previews (a small preview of the URL in question), similar to those that appear in iMessage conversations.
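To make the haptics rumor concrete: today, apps are limited to a handful of canned feedback styles. A richer framework would presumably let developers compose events with their own timing and strength. A hypothetical sketch, assuming parameters like intensity and sharpness (all names are illustrative):

```swift
import CoreHaptics

// Hypothetical sketch of fine-grained Taptic Engine control.
// Engine/event/pattern names are assumptions, not confirmed API.
func playCustomTap() throws {
    let engine = try CHHapticEngine()
    try engine.start()

    // A single sharp, fairly strong transient tap at time zero.
    let tap = CHHapticEvent(
        eventType: .hapticTransient,
        parameters: [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: 0.8),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
        ],
        relativeTime: 0)

    let pattern = try CHHapticPattern(events: [tap], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```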

NFC

NFC will gain the ability to read any ISO 7816, FeliCa, or MIFARE tag – currently, third-party applications can only read tags formatted as NDEF, as we discussed in this article.
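A speculative sketch of what reading a raw (non-NDEF) tag could look like, assuming Core NFC gains a tag-level reader session (names and signatures are our assumption):

```swift
import CoreNFC

// Speculative sketch of raw tag reading in Core NFC.
class RawTagReader: NSObject, NFCTagReaderSessionDelegate {
    var session: NFCTagReaderSession?

    func beginScanning() {
        // ISO 14443 polling covers ISO 7816 and MIFARE tags;
        // ISO 18092 polling covers FeliCa.
        session = NFCTagReaderSession(pollingOption: [.iso14443, .iso18092],
                                      delegate: self)
        session?.begin()
    }

    func tagReaderSession(_ session: NFCTagReaderSession,
                          didDetect tags: [NFCTag]) {
        guard let tag = tags.first else { return }
        session.connect(to: tag) { _ in
            if case let .miFare(miFareTag) = tag {
                print("MIFARE tag UID: \(miFareTag.identifier as NSData)")
            }
            session.invalidate()
        }
    }

    func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {}
    func tagReaderSession(_ session: NFCTagReaderSession,
                          didInvalidateWithError error: Error) {}
}
```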

CoreML

A new version of Core ML will allow developers to update their machine learning models on the device. Currently, as Rambo explains, models need to be trained in advance and remain static after deployment. This change will allow an app's behavior to adapt as its ML models learn from the user's actions. Apple is also adding a new API for developers to perform sound analysis with machine learning. And the Vision framework will gain a built-in image classifier, removing the need to bundle a machine learning model just to classify images into common categories.
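A hypothetical sketch of such an on-device update, assuming it is modeled as a task fed with a batch of new training examples (every name below is illustrative, not confirmed):

```swift
import CoreML

// Speculative sketch of on-device model personalization.
func personalizeModel(at modelURL: URL,
                      with examples: MLBatchProvider) throws {
    let task = try MLUpdateTask(forModelAt: modelURL,
                                trainingData: examples,
                                configuration: nil) { context in
        // Persist the retrained model so the new behavior
        // survives the next app launch.
        let destination = FileManager.default.temporaryDirectory
            .appendingPathComponent("Personalized.mlmodelc")
        try? context.model.write(to: destination)
    }
    task.resume()
}
```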

Scanning and more

The document scanning functionality available in some parts of iOS, such as the Notes app, will be opened up to third-party developers. A new API will also finally let applications capture photos from external devices, such as cameras and SD cards, without going through the Photos app.
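One plausible shape for the scanning API is a ready-made camera view controller with a delegate callback; the sketch below is speculative and every name in it is an assumption:

```swift
import UIKit
import VisionKit

// Speculative sketch of a system-provided document scanner.
class ScannerViewController: UIViewController,
                             VNDocumentCameraViewControllerDelegate {

    func presentScanner() {
        let camera = VNDocumentCameraViewController()
        camera.delegate = self
        present(camera, animated: true)
    }

    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        // One image per captured page, already cropped and deskewed.
        for page in 0..<scan.pageCount {
            let image = scan.imageOfPage(at: page)
            print("Scanned page \(page): \(image.size)")
        }
        controller.dismiss(animated: true)
    }
}
```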

On the Mac, apps will be able to offer file provider extensions, improving the way certain apps (Dropbox, for example) can integrate with the Finder. There will also be a new API that developers can use to write device drivers.

· • ·

WWDC19 will take place from June 3rd to 7th in San Jose, California (United States).