In the opening keynote of WWDC 2016, Apple introduced the concept of "differential privacy," a set of techniques its data collection systems follow in order to respect users' privacy. This is absolutely necessary, considering that, starting with iOS 10 and macOS Sierra, Apple's artificial intelligence uses this data much more aggressively to become smarter and more proactive, suggesting the replacement of words with emojis, for example.
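Apple has not published the exact mechanism it uses, but a classic building block of differential privacy is "randomized response": each user's device adds random noise to its own report before anything leaves the device, so no individual answer can be trusted, yet aggregate statistics remain accurate. The sketch below is purely illustrative; the emoji-usage scenario, the probability parameter `p`, and the function names are assumptions for demonstration, not Apple's implementation.

```python
import random

def randomized_response(true_value: bool, p: float = 0.75) -> bool:
    """With probability p, report the true value; otherwise report a fair
    coin flip. Each user gains plausible deniability for their answer."""
    if random.random() < p:
        return true_value
    return random.random() < 0.5

def estimate_true_rate(reports, p: float = 0.75) -> float:
    """Invert the noise in aggregate:
    E[reported rate] = p * q + (1 - p) * 0.5, solved for the true rate q."""
    reported = sum(reports) / len(reports)
    return (reported - (1 - p) * 0.5) / p

# Simulate 100,000 users, 30% of whom actually typed a given emoji.
random.seed(42)
truth = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(t) for t in truth]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

The key point is that the server only ever sees the noisy reports, yet the estimated rate converges to the true one as the number of users grows.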
We have already published a very thorough post clarifying the initial questions about differential privacy, but today new information has emerged that deserves attention.
The staff at Recode talked with Apple to clear up some doubts on the subject, and now we know a few more things. First, Apple confirms that, as always, handing information over to the company still depends on explicit user authorization; that is, whoever prefers can remain entirely off Cupertino's radar.
These new data collections will initially cover exactly four areas: new words that users add to their local dictionaries, emojis typed by the user (so that Apple can suggest replacements), deep links used within apps, and search suggestions within Notes. It is worth noting, too, that all of these new collections will only take effect starting with iOS 10 and macOS Sierra.
Regarding the new Photos app, which recognizes faces and a plethora of objects, Apple says that its artificial intelligence does not scan the user's images; instead, the algorithm was "trained" on other data sets from Apple itself.
(via Cult of Mac)