YouTube has revealed that it is reconfiguring its video recommendation system in an effort to curb the promotion of conspiracy videos that contravene its terms of service. The decision to change the recommendation formula follows a BuzzFeed report detailing how the company's recommendation system can lead to gross misinformation of viewers.
In response, the company said it will curtail recommendations of videos containing borderline and divisive information. Videos the company intends to flag include those promoting claims of religious or miraculous healing for serious illnesses, those claiming that the earth is flat, and those making false claims about historical events and natural disasters that have caused humanitarian suffering.
The company has been taking a series of steps aimed at curbing the spread of videos containing divisive information. About a year ago, YouTube's chief executive officer, Susan Wojcicki, said that the video-sharing feature would be framed in a manner that ensures viewers only share information validly offered by trusted sources. Experts believe the current, more stringent measure responds to a series of flaws the company has noticed, especially since YouTube had not been enforcing such a video-sharing policy before.
Under the new approach, YouTube will sideline conspiracy videos based on opinionated information when building lists of recommendations for viewers. However, YouTube said it does not intend to deal a major blow to the freedom of speech it values and protects. Instead of removing conspiracy-based videos outright, the company's new algorithms will still surface them when viewers search specifically for content containing conspiracy information. This will ensure that conspiracy and opinionated videos are ranked together.
The newly launched recommendation algorithm will take effect in the US and is expected to be rolled out to other parts of the world depending on how it performs. Analysts, however, think that the larger problem of far-left and far-right politically divisive content is not effectively addressed by the measures rolled out so far. YouTube's stated intention remains to maintain a strict separation between video recommendation and freedom of speech: it does not intend to restrict videos containing opinionated information as long as they do not contravene the law.
Dil Bole Oberoi