Personally, I would say we are still moving into the era of artificial intelligence, though some argue we are already in it. Just compare how far virtual assistants have come since the beginning of this decade to see how much AI has improved; not enough, however, according to Bill Stasior, the former head of Siri.
Stasior led the development of Apple’s virtual assistant from 2012 until earlier this year, when he left his position in Cupertino to join the executive committee of Avellino Labs, a diagnostic and genetic testing company.
His years working with Siri, however, give him the credibility to assess that all of today’s virtual assistants still “do not deliver on the promise to understand us as naturally as other humans”, as reported by Business Insider.
The executive explained that one of the main reasons is that most virtual assistants were designed to handle specific tasks, which exposes an obvious limitation: this technology is not capable of understanding the world the way humans do.
When you want to speak to an assistant, you are opening the door to almost any task or question. There is an incredibly wide variety of languages and ways to express ourselves. And, when it comes to that general capability, we are still far away [from virtual assistants achieving it].
Among the more complex functions, Stasior noted that Amazon’s Alexa and Google Assistant are ahead of Apple’s technology in recognizing different voices. Siri will only be able to identify different users by voice on the HomePod with iOS 13, a capability Amazon’s and Google’s systems have offered since 2017.
In addition, while Amazon is teaching Alexa to recognize the emotional state in a user’s voice, Stasior said that Apple, at least up to the time he left the company, had no such plans. That does not make Apple’s assistant less capable, however; it simply has fewer functions, owing to the restrictions of Apple’s systems.
With iOS 13, Siri will gain some very welcome new features, such as the ability to read messages aloud through AirPods. Apple has also expanded SiriKit (the assistant’s development API) to let the tool connect to third-party music and podcast services.