Virtual assistants are increasingly present in people’s daily lives.
They let users run quick searches, ask about the weather, and find out what time their favorite TV show starts.
However, some precautions are necessary when using these services, since virtual assistants such as Siri (Apple), Cortana (Microsoft), Alexa (Amazon) and Google Assistant can be listening at all times, and may even store sensitive information.
dnetc has prepared a list of tips on what not to do near these devices, along with a selection of questions that are best left unasked.
Below, check out some of the commands and questions that should be avoided when using a virtual assistant.
List gathers topics that should not be covered using virtual assistants – Photo: Marvin Costa / dnetc
Asking for help in case of emergencies
A study conducted by a Canadian university found that virtual assistants are of little use in emergencies.
The survey tested assistants Siri, Cortana, Alexa and Google Assistant through 123 questions on first aid.
The questions covered topics such as heart attacks, nosebleeds and poisoning, according to an article published in the British medical journal The BMJ.
The study found that Alexa and Google Assistant achieved the best results, understanding the questions and providing useful answers in 90% of cases. Scores for Siri and Cortana could not be calculated because their responses were largely unsatisfactory.
Therefore, if you witness a medical emergency, the best option is still to call the emergency services directly. In Brazil, the numbers 190 (Military Police), 193 (Fire Department) and 192 (SAMU ambulance service) can be useful in risky situations.
Being aggressive with virtual assistants can impact the relationship with real people – Photo: Disclosure / Microsoft
Giving the assistant orders rudely
According to the BBC, a study by ChildWise concluded that children who grow up used to «giving orders» to virtual assistants may become aggressive when dealing with people later in life.
With that in mind, Alexa, Amazon's virtual assistant, and Google Assistant now encourage children to say «please» and «thank you», rewarding those who use the «magic words» with more pleasant and kind answers.
Two out of five teenagers and adults use the Internet to do research related to sex and sexual health – Photo: Ana Marques / dnetc
Asking about sex and health
Research conducted by the University of Otago, in New Zealand, found that it is better to run a Google search on topics related to sex and sexual health than to ask the virtual assistants Siri and Google Assistant directly.
According to the survey, two out of five teenagers and adults use Internet search tools to research sex-related issues. For this reason, the authors decided to extend the study by evaluating the answers on sex given by the virtual assistants. The most common sexual health questions were put to Google search and to the assistants Siri and Google Assistant. The study concluded that direct Google searches gave the most reliable answers, in about 70% of cases. Google Assistant came in close behind with 50% reliable responses, followed by Siri with 32%.
Alexa understands and responds to commands – Photo: Disclosure / Amazon
Offending the virtual assistant
Virtual assistants are programmed not to swear and not to take offense. They do not repeat «bad words», and assistants like Siri will even act hurt when they are offended.
According to the Australian website ABC News, research shows that abusive interactions between people and computers are often driven by frustration and other emotions, which is worrying.
The way we talk to these «virtual robots» can affect how we deal and interact with the people around us, so care is needed.
In addition, after updates from the manufacturer, assistants like Alexa have changed how they respond to harassment. Previously, the assistant met sexist remarks with kind answers like «thanks for the feedback» and «it's nice of you to say that», but now she replies with phrases like «I will not answer that» and «I don't know what answer you expected».
Virtual assistants have been involved in privacy scandals, so asking them personal questions is risky – Photo: Isadora Díaz / dnetc
Asking very personal questions
There are suspicions that virtual assistants can listen to their surroundings at all times, not just when they are addressed.
Therefore, asking personal questions or talking about private matters near the devices can be dangerous.
The problem lies in how these assistants store information: recordings are kept in the cloud and can be shared with other companies.
In addition, companies like Apple have already admitted to hiring real people to transcribe audio captured by their virtual assistants, in order to improve the accuracy of their searches.
Alexa, for example, stores all voice commands, even when the user asks for the history to be deleted.
For this reason, sharing personal information with Siri, Alexa, Cortana or Google Assistant is not recommended, chiefly to preserve your privacy.