For developers: WWDC19 should bring news about Siri, augmented reality and more!

Last week we commented on several features expected to show up in iOS 13 and macOS 10.15, all disclosed by the Brazilian developer Guilherme Rambo at 9to5Mac.

Well, he came back today with more news, this time focusing on what Apple should present at WWDC19 for developers.

Siri

Developers will have more options for integrating with Siri, including media playback, search, voice calling, event tickets, message attachments, train trips, flights, and airport gate and seat information.
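For context, a media playback intent already exists in SiriKit today (iOS 12's INPlayMediaIntent), which hints at what these expanded integrations could look like. A minimal handler sketch, with an illustrative class name:

```swift
import Intents

// Minimal handler for a Siri media playback request.
// The class name is illustrative; INPlayMediaIntent ships with iOS 12.
class PlayMediaHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        // .handleInApp asks the system to launch the app in the
        // background so it can start playback itself.
        completion(INPlayMediaIntentResponse(code: .handleInApp,
                                             userActivity: nil))
    }
}
```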

Marzipan

Developers who choose to port their iOS apps to macOS using UIKit will have access to new APIs that allow their apps to integrate with Mac-specific features such as the Touch Bar and the menu bar (including keyboard shortcuts). UIKit apps on the Mac will also be able to open multiple windows.
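UIKit already offers a keyboard shortcut API on iOS (UIKeyCommand), so it is plausible that shortcuts declared this way are what would surface in the Mac menu bar. A minimal sketch; the view controller and action are illustrative:

```swift
import UIKit

class DocumentViewController: UIViewController {
    // Required so the responder chain delivers key commands here.
    override var canBecomeFirstResponder: Bool { return true }

    // Declares Command-N; under the rumored Mac support, shortcuts
    // like this could presumably also appear as menu bar items.
    override var keyCommands: [UIKeyCommand]? {
        return [UIKeyCommand(input: "n",
                             modifierFlags: .command,
                             action: #selector(newDocument))]
    }

    @objc func newDocument() {
        // Create a new document here.
    }
}
```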

Applications that support Split View on iOS will be resizable by dragging the divider, and their position can be reset by double-clicking it, just like in native macOS apps.

Best of all, making an iOS app compatible with macOS will be as simple as ticking a checkbox in Xcode.

Augmented Reality

Here we will have significant news, according to Rambo, including a 100% Swift framework for AR and a companion app that lets developers create AR experiences visually. ARKit will also be able to detect human poses and, for game developers, the operating system will support touch controllers and stereo AR headsets.
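How the pose detection will be exposed is anyone's guess at this point. Purely as speculation, assuming it follows ARKit's existing configuration-and-anchor pattern, it could look like the sketch below; ARBodyTrackingConfiguration and ARBodyAnchor are guessed names, not confirmed API:

```swift
import ARKit

// Speculative sketch: the configuration and anchor types are assumptions.
class BodyTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARBodyTrackingConfiguration()) // hypothetical name
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for body in anchors.compactMap({ $0 as? ARBodyAnchor }) { // hypothetical
            print(body.skeleton) // joint data for the detected person
        }
    }
}
```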

A new framework will give developers more control over the Taptic Engine, which currently offers third-party developers only a very small set of feedback styles. We will also have news related to link previews (a rich preview of the URL in question), similar to what appears in iMessage conversations.
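For reference, today's public haptics API amounts to the UIFeedbackGenerator classes and their handful of canned styles, which is exactly the limitation a dedicated framework would lift:

```swift
import UIKit

// Today's options boil down to UIImpactFeedbackGenerator (light, medium,
// heavy), UISelectionFeedbackGenerator and UINotificationFeedbackGenerator.
let generator = UIImpactFeedbackGenerator(style: .medium)
generator.prepare()        // spins up the Taptic Engine to reduce latency
generator.impactOccurred() // plays a fixed, non-customizable impact
```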

NFC

NFC, in turn, will gain the ability to read any ISO 7816, FeliCa, or MiFare tag; currently, only tags formatted as NDEF can be read by third-party applications, as we discussed in this article.
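To illustrate the current limitation, this is roughly what reading a tag with Core NFC looks like today; the session only ever hands back parsed NDEF messages, with no raw access to the underlying tag:

```swift
import CoreNFC

// Today's Core NFC only delivers parsed NDEF messages; there is no
// raw access to ISO 7816, FeliCa, or MiFare tags for third parties.
class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    func beginScanning() {
        let session = NFCNDEFReaderSession(delegate: self,
                                           queue: nil,
                                           invalidateAfterFirstRead: true)
        session.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        for record in messages.flatMap({ $0.records }) {
            print(record.payload) // raw bytes of each NDEF record
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        // Handle cancellation or read errors here.
    }
}
```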

CoreML

A new version of CoreML will allow developers to update their machine learning models on the device itself. Today, as Rambo explained, models need to be pre-trained and remain static after deployment; this change will allow an app's behavior to evolve as its ML models learn from user actions. Apple is also adding a new API for developers to perform machine-learning-based sound analysis. The Vision framework, in turn, will gain a built-in image classifier, removing the need to bundle a machine learning model just to classify images into common categories.
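Today's workflow shows why that matters: to classify images with Vision, you must bundle a pre-trained CoreML model in the app. In the sketch below, MobileNet stands in for whatever model you ship; a built-in classifier would remove that requirement:

```swift
import CoreML
import Vision

// MobileNet is a placeholder for whatever pre-trained model the app
// bundles; Xcode generates this class from the .mlmodel file.
func classify(_ image: CGImage) throws {
    let model = try VNCoreMLModel(for: MobileNet().model)
    let request = VNCoreMLRequest(model: model) { request, _ in
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print("\(top.identifier): \(top.confidence)")
        }
    }
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```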

Scanning and more

The document scanning functionality available in some parts of iOS, such as the Notes app, will be opened up to third-party developers. Apps will also finally be able to capture photos from external devices, such as cameras and SD cards, without having to use the Photos app.
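Speculating on the shape of that scanning API: if Apple exposes the Notes scanner as a system view controller, usage might look like this; the VNDocumentCameraViewController name and its delegate are guesses on our part, not confirmed API:

```swift
import UIKit
import VisionKit

// Guessed API shape: a system view controller wrapping the Notes scanner.
class ScannerPresenter: NSObject, VNDocumentCameraViewControllerDelegate {
    func presentScanner(from presenter: UIViewController) {
        let camera = VNDocumentCameraViewController()
        camera.delegate = self
        presenter.present(camera, animated: true)
    }

    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        for index in 0..<scan.pageCount {
            _ = scan.imageOfPage(at: index) // one UIImage per scanned page
        }
        controller.dismiss(animated: true)
    }
}
```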

On the Mac, apps will be able to offer file provider extensions, improving how apps like Dropbox integrate with the Finder. There will also be a new API that developers can use to write device drivers.

· • ·

WWDC19 will take place June 3-7 in San Jose (California, United States).