5 commands and questions you should avoid with virtual assistants

Virtual assistants are increasingly present in people's daily lives. They let you run quick searches, ask about the weather, and find out what time your favorite TV show starts. However, some precautions are necessary when using these services, since virtual assistants such as Apple's Siri, Microsoft's Cortana, Amazon's Alexa, and Google Assistant can listen to you at all times and even store sensitive information.

dnetc has put together a list of tips on what not to do around these devices, along with a selection of questions that are better left unasked. Check out below some of the commands and questions to avoid when using a virtual assistant.

List of typical topics that should not be addressed using virtual assistants. Photo: Marvin Costa / dnetc

1. Asking for help in an emergency

A study conducted by a Canadian university found that virtual assistants are of little use in emergencies. The survey tested the assistants Siri, Cortana, Alexa, and Google Assistant with 123 first-aid questions, covering topics such as heart attacks, nosebleeds, and poisoning, according to an article published in the British medical journal The BMJ.

The study found that Alexa and Google Assistant achieved the best results, understanding the questions and providing useful answers in 90% of cases. Siri's and Cortana's results could not even be scored, because their answers were unsatisfactory. So, if you witness a medical emergency, the best option is still to call emergency services. In Brazil, the numbers for the Military Police (190), the Fire Department (193), and the SAMU ambulance service (192) can be useful in risky situations.

Being aggressive towards virtual assistants can impact relationships with real people. Photo: Divulgação / Microsoft

2. Ordering the assistant around rudely

According to BBC reports, a study by the company ChildWise found that children who grow up used to "giving orders" to virtual assistants may become aggressive when dealing with people later in life. With that in mind, Amazon's Alexa and Google Assistant have begun encouraging children to use words like "please" and "thank you", rewarding those who use the "magic words" with more pleasant and kind answers.

Two out of five teenagers and adults use the Internet to research sex and sexual health. Photo: Ana Marques / dnetc

3. Asking about sex and health

A New Zealand survey conducted by the University of Otago found that it is better to do a Google search on topics related to sex and sexual health than to put those questions directly to the Siri and Google Assistant virtual assistants. According to the survey, two out of five teenagers and adults use Internet search engines to research sex-related issues.

For this reason, the authors decided to take the study further, evaluating the answers about sex delivered by the virtual assistants. The most common sexual health questions were put to Google search and to the assistants Siri and Google Assistant. The study found that direct Google searches delivered reliable answers most often, in about 70% of cases. Google Assistant came next with 50% reliable answers, followed by Siri with 32%.

Alexa understands and responds to commands. Photo: Divulgação / Amazon

4. Offending the virtual assistant

Virtual assistants are programmed not to curse and not to take offense. They do not say "bad words" or "ugly words", and assistants like Siri even pretend to be hurt when offended. According to the Australian website ABC, research shows that people's relationships with computers can turn abusive, and the fact that this bad behavior is tied to frustration and other emotions is worrying.

The way we talk to "virtual robots" can affect the way we deal and interact with the people around us, so it pays to be careful. In addition, after updates from the manufacturer, assistants like Alexa have changed how they respond when harassed. Previously, the assistant met sexist comments with mild answers such as "thanks for the feedback", but now it replies with phrases like "I will not answer this" and "I don't know what answer you expected."

Virtual assistants have been involved in privacy scandals, so asking them personal questions is risky. Photo: Isadora Daz / dnetc

5. Asking very personal questions

There are suspicions that virtual assistants listen to their surroundings at all times, not only when they are invoked. Asking personal questions or talking about private matters near these devices can therefore be risky. The problem lies in how these assistants store information: recordings are kept in the cloud and may be shared with other companies.

In addition, companies like Apple have already admitted to hiring real people to transcribe audio captured by their virtual assistants in order to improve the accuracy of their searches. Alexa, for example, stores all voice commands, even when the user asks for the history to be deleted. For this reason, it is not recommended to share personal information with Siri, Alexa, Cortana, or Google Assistant, mainly to preserve your privacy.
