
Former YouTube employee accuses the network of 'addicting' users

How much time do you spend watching the videos YouTube suggests? This question is at the center of an important debate surrounding the platform. That's because former Google employee Guillaume Chaslot, who worked on the algorithm responsible for recommending content to users, made a number of accusations during his talk at this year's DisinfoLab Conference.


Chaslot argues that the video recommendation system fails and often does not suggest relevant content. In an interview with The Next Web, however, he stated that this is not the main problem: he accuses YouTube of using Artificial Intelligence to make users "addicted" to the platform and "waste time".

Former employee accuses YouTube of using Artificial Intelligence to hook platform users. Photo: Rodrigo Fernandes / TechTudo

In his talk at the DisinfoLab Conference, Chaslot stated that users end up receiving sensationalist content through YouTube's own recommendations. While noting that most of these videos are harmless, the expert contends that some can be quite problematic and disturbing. Looking for a way to counter the 'addiction' fostered by YouTube, Chaslot created an alternative platform, the AlgoTransparency project, which promises greater transparency to users.

Contacted by TechTudo, YouTube said that AlgoTransparency is a platform created outside of YouTube and is therefore unable to explain how the network's video recommendation algorithm works. The company also said it disagrees with the methodology behind the project's metrics and, consequently, with the data AlgoTransparency discloses. YouTube declined to comment on Chaslot's accusations.

Google does not yet offer options for users to control the recommendations they receive. It is possible to block some channels, but even so, Chaslot says the algorithm can still continue to display videos similar to those its Artificial Intelligence interprets as being of interest. The expert argues that there is only one short-term solution: disabling the feature altogether.

In the long run, Chaslot calls for more transparency from companies and also for users to have more control over what they consume. Meanwhile, to avoid being impacted by unwanted content, the expert says he uses a Chrome extension called Nudge, which removes addictive online features such as the Facebook news feed and YouTube recommendations.

According to Chaslot, YouTube's algorithms account for over 700 million hours of video viewing time every day, and even the people who build them don't fully understand how they work. The goal of AlgoTransparency is to understand what information the platform is passing on to its users.

On his site, Chaslot says the project uses multi-step programs to review the videos YouTube recommends each day. The starting point is a list, available for consultation online, of more than 1,000 US channels; this number grows as the research progresses. The project gathers all the videos YouTube recommends alongside the most recent upload from each of these channels, then compares the results to determine which channels the platform recommended most often, as sketched below.
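The following is a minimal, hypothetical sketch of that aggregation step, not AlgoTransparency's actual code. The functions passed in for fetching a channel's latest video and its recommendations are placeholders for whatever scraping or API layer a real project would use, and the data shapes are assumptions for illustration only.

```python
# Hypothetical sketch: starting from a list of monitored channels, collect the
# videos recommended alongside each channel's latest upload, then count which
# channels those recommendations point to.
from collections import Counter
from typing import List


def rank_recommended_channels(
    channels: List[str],
    fetch_latest_video,     # placeholder: channel_id -> video_id of most recent upload (or None)
    fetch_recommendations,  # placeholder: video_id -> list of {"video_id": ..., "channel_id": ...}
) -> Counter:
    """Count how often each channel appears among the recommendations
    attached to the latest video of every monitored channel."""
    counts: Counter = Counter()
    for channel_id in channels:
        latest = fetch_latest_video(channel_id)
        if latest is None:
            continue  # no recent upload or fetch failed
        for rec in fetch_recommendations(latest):
            counts[rec["channel_id"]] += 1
    return counts


if __name__ == "__main__":
    # Illustrative stub data only.
    monitored = ["channel_a", "channel_b"]
    latest = {"channel_a": "vid_1", "channel_b": "vid_2"}
    recs = {
        "vid_1": [{"video_id": "vid_9", "channel_id": "channel_c"}],
        "vid_2": [{"video_id": "vid_9", "channel_id": "channel_c"},
                  {"video_id": "vid_3", "channel_id": "channel_a"}],
    }
    ranking = rank_recommended_channels(monitored, latest.get, lambda v: recs.get(v, []))
    print(ranking.most_common())  # channel_c is the most recommended channel here
```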

On the site, users can filter by keyword to see which videos the platform suggested that day, and how many times; it is also possible to filter by date. As the project is still under construction, the research is limited to the United States and France.
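A minimal sketch of that kind of keyword and date filtering is shown below; the record fields ("title", "date") are assumptions for illustration, not AlgoTransparency's actual schema.

```python
# Hypothetical keyword/date filter over a list of recommendation records.
from datetime import date
from typing import Dict, Iterable, List, Optional


def filter_recommendations(
    records: Iterable[Dict],
    keyword: str = "",
    on_date: Optional[date] = None,
) -> List[Dict]:
    """Return records whose title contains the keyword (case-insensitive)
    and, if a date is given, that were recorded on that date."""
    keyword = keyword.lower()
    return [
        r for r in records
        if keyword in r["title"].lower()
        and (on_date is None or r["date"] == on_date)
    ]
```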

Via: TNW and AlgoTransparency
