Google has decided to step up its monitoring process and will reportedly start vetting YouTube channels more strictly from now on. The move is said to be a part of Google Preferred, a premium advertising programme.
The company is slated to use both human moderators and machine learning to identify videos that should not be part of the 'Preferred' category, reports a popular tech portal. Google had recently announced that it would expand its workforce by 10,000 employees focused on flagging videos deemed inappropriate for ads.
The move comes in the wake of the recent incident involving popular YouTuber Logan Paul, who was removed from the 'Preferred' platform for uploading a video showing a dead body in Japan's infamous suicide forest. Google made an official announcement regarding the matter this week.
According to Google, 'Preferred' is a collection of some of the most popular YouTube channels among viewers aged 18-34 in the US. These channels also carry some of the most engaging content, which is considered brand-safe.
Last year, YouTube found itself in choppy waters after advertisers raised concerns over videos aimed at children.
Several major advertisers pulled their spending from YouTube in early 2017 after ads were found running alongside offensive videos, including those promoting racism and terrorism.