YouTube Will No Longer Recommend Conspiracy, Medically Inaccurate Videos
Google-owned video platform YouTube has announced that it will no longer suggest or recommend conspiracy or medically inaccurate videos, saying such content borders on violating its community guidelines. According to an NBC report, one of the world's biggest video platforms will stop suggesting videos claiming that the Earth is flat or making blatantly false claims about historical events like 9/11.
YouTube's algorithm typically suggests the next video after a user finishes watching one. Under the change, it will no longer queue up similar videos and will instead “pull in recommendations from a wider set of topics”.
In a blog post, YouTube said the action is meant to “reduce the spread of content that comes close to — but doesn’t quite cross the line of — violating” its community policies. The post also noted that users who have subscribed to a channel, or who search for it on YouTube, will still be able to see that channel's latest videos and the content they searched for.
On Saturday, Guillaume Chaslot, a former engineer at Google, YouTube's parent company, hailed the move as a “historic victory”.
“It’s only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable,” Chaslot wrote.
According to Chaslot, YouTube's aim is to keep users on the site as long as possible in order to show more advertisements. When a user binged on several conspiracy videos, the recommendation AI not only became biased toward the content those hyper-engaged users were watching; it also tracked what they engaged with and tried to reproduce that viewing pattern with other users.
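The feedback loop Chaslot describes can be illustrated with a deliberately simplified sketch. This is not YouTube's actual system; it is a hypothetical engagement-weighted recommender showing how a small group of hyper-engaged binge-watchers can dominate rankings by sheer watch time:

```python
from collections import Counter

def recommend(watch_logs, top_n=2):
    """Toy recommender: rank topics by total watch time across all
    users. Because every minute counts equally, a few hyper-engaged
    users can push their topics to the top for everyone else."""
    scores = Counter()
    for user_log in watch_logs:
        for topic, minutes in user_log:
            scores[topic] += minutes
    return [topic for topic, _ in scores.most_common(top_n)]

# Hypothetical logs: two casual users, one binge-watcher.
logs = [
    [("news", 10), ("music", 5)],
    [("sports", 8), ("news", 4)],
    [("conspiracy", 120), ("conspiracy", 90)],  # one long binge session
]
print(recommend(logs))  # the binged topic outranks everything else
```

Here a single user's two-hour binge outweighs every casual session combined, which is exactly the amplification effect the new policy targets: optimizing purely for time-on-site rewards whatever keeps the most engaged users watching.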