31 January 2018
The world’s biggest video hosting and streaming website, YouTube, is taking measures to ensure that content uploaded to the platform is moderated and passes through a series of filters before being published. The changes come after popular YouTuber Logan Paul faced controversy over a video on his channel that recently featured a dead body, triggering discussions at YouTube headquarters about putting a content moderation policy in place to prevent such incidents on the platform.
Google allows advertisers to place their ads on the top 5 percent of YouTube channels at higher-than-normal prices. However, after a number of controversial videos were uploaded by the top ten percent of publishers, advertisers began pulling back their money, which is a major concern for YouTube.
YouTube is now assigning 10,000 employees to manually review content uploaded by the top 5% of publishers, with artificial intelligence assisting in moderating the platform. The company is also compiling a whitelist of publishers who have proven themselves trustworthy in this regard. YouTube previously introduced a comment moderation policy to fight spam as well.
Do you think this policy from YouTube will have any positive effect in preventing such controversies in the future? Share your views in the comment section below.