Apple suspends analysis of user conversations with Siri after complaint

Last Friday (27/7), we covered a complaint published by The Guardian about Apple analyzing consumers' conversations with its virtual assistant, Siri. As we reported, Apple was indeed listening to excerpts of users' conversations, though, according to the company, under strict confidentiality criteria that did not allow the users in those conversations to be identified.

Despite defending itself by stating that the practice is legal and that everything is analyzed anonymously, Apple seems to have felt the pressure of the contradiction between its much-touted privacy stance and what was being done with Siri, so much so that it decided to suspend the audio review program worldwide, as reported by Bloomberg.

We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose whether to participate in grading.

According to the report, Apple made the decision to suspend the "grading" of a portion of Siri users' commands after some consumers expressed concern about another facet of the problem: in some situations, Siri could be triggered without the user actually having invoked it, the so-called false positives.

Apple, however, did not comment on whether, in addition to pausing the program, it would also stop storing these recordings on its servers. The company currently keeps this material for about six months before removing any and all identifying information, after which a copy can be stored for two years or more, according to The Verge.

The Cupertino giant's decision comes in the same week that a German regulator temporarily barred third-party contractors from transcribing Google Assistant voice recordings in the European Union, after complaints claimed that some of those recordings contained confidential information.

In addition, last April, Bloomberg News exposed a similar situation involving Amazon, which, according to the report, employs thousands of people worldwide to analyze conversations captured by its virtual assistant, Alexa, on its Echo line of smart speakers.