The TikTok app was fined US$ 5.7 million (about R$ 21 million in direct conversion) last week for exposing children to sensitive content, such as pedophilia and sexual material. The decision came from the US consumer protection agency, the Federal Trade Commission (FTC). Accusations point out that the tool has been used by pedophile networks to contact children, and American schools have already issued warnings about the dangers of the application. The lip-sync video service for Android and iPhone (iOS) has drawn attention in recent months for surpassing one billion installations across the official Google Play and App Store app stores.
The platform's reach has left services like Facebook and Instagram behind in the United States. In a statement, TikTok pledged to offer a more assertive security system, with specific tools for identifying such situations as well as sharing restrictions. Although use is restricted to people over 13, the app does not ask for any proof of age. The situation was aggravated by the launch of live streaming, which allows real-time contact between children and malicious adults.
TikTok has announced a series of measures to protect its underage users from inappropriate content. Photo: Press Release / TikTok
In an official statement, TikTok said its priority is to create a safe experience for all users and that it is therefore taking steps to protect the community. These include tools for guardians to protect their children and new security settings. The app also reported that, in partnership with the FTC, it is working on an environment for its younger users, with extra security and special protections. Among the new features are a ban on sharing personal information and limits on interaction between users.
The app has also produced a series of videos called "You're in Control". Using memes and editing tools already popular within the app, the videos inform users about privacy settings and security policies. Topics related to information protection, such as blocking comments and adjusting screen settings, are approached in a funny, simple way designed to reach younger users.
The accusations of exposing users to inappropriate content came a few days after the app surpassed one billion installations. Photo: Reproduction / Thayanne Porto
TikTok is not the only platform to be caught up in controversy involving underage users. The Yubo app sparked debate by offering a Tinder-like system that children and teens were nonetheless allowed to use. The software allegedly served as a platform for pedophiles because it does not verify users' identities.
In early 2018, Sarahah, the hugely popular messaging app of 2017, was pulled from Apple's App Store after being accused of enabling cyberbullying. Another controversial example is SimSimi, which simulates conversations in a chat application using artificial intelligence. The service raised suspicions of sending inappropriate messages containing sexual content, bullying and even death threats. Because it uses a yellow emoji-like character, the platform was widely used by children and teenagers.
Via Mirror and TechCrunch (1 and 2)