YouTube has recently been making a number of changes to address public concerns such as the exploitation of minors, exposure to sexual predators, and the spread of incorrect or harmful information such as anti-vaccine theories and flat-earth conspiracies. Last Friday, in addition to reviewing and deleting countless inappropriate comments and accounts on videos featuring minors, the video-sharing platform disclosed plans to temporarily disable comments on thousands of such clips and demonetise them. Today the company announced several new policies intended to further protect underage users from predatory abuse on the platform.
One of the steps in the latest update to YouTube’s child safety initiative is permanently disabling the comment section on all videos that feature younger as well as older minors. Only a select few creators will be allowed to keep comments enabled on such content, on the condition that they monitor and moderate the comment section themselves, beyond the tools the platform already provides, and “demonstrate a low risk of predatory behaviour.”
YouTube is also planning to launch a new comments classifier that will essentially do what the company has been doing itself, and urging other members of the community to do: scan comments across the platform to find, report and remove those that are sexual or predatory in nature. According to YouTube, the new classifier is twice as effective as the previous one and will not affect the monetisation of content in any way.
The third and final point made in the blog post was an appeal to both viewers and creators on the video-sharing platform, encouraging them to actively help protect minors by flagging any harmful or dangerous comments, videos or channels that violate YouTube’s child safety regulations.
After beginning to disable comments on videos featuring young minors last week, YouTube said it had suspended comments on tens of millions of videos over the past week that could have attracted predatory behaviour. The expanded policy, which covers videos featuring minors up to the age of 18, will come into effect over the next few months.
“A small number of creators will be able to keep comments enabled on these types of videos,” YouTube said in a blog post. “These channels will be required to actively moderate their comments, beyond just using our moderation tools, and demonstrate a low risk of predatory behaviour.”
YouTube said it would work with creators directly, with the goal of growing the number over time as it improves its ability to catch inappropriate comments.
The Alphabet-owned company is also launching its improved comments classifier to identify and remove predatory comments.
Additionally, YouTube said it intends to take action on creators who cause “egregious harm” to the community.
“No form of content that endangers minors is acceptable on YouTube, which is why we have terminated certain channels that attempt to endanger children in any way,” the blog continued.
“Videos encouraging harmful and dangerous challenges targeting any audience are also clearly against our policies. We will continue to take action when creators violate our policies in ways that blatantly harm the broader user and creator community.”
YouTube has classified the following as harmful content:

- Sexualisation of minors, including sexually explicit content featuring minors and content that sexually exploits minors;
- Harmful or dangerous acts involving minors, such as content showing a minor participating in dangerous activity, or encouraging minors to engage in dangerous activities;
- Infliction of emotional distress on minors, including content that could cause participants or viewers distress;
- Misleading family content, meaning videos that appear to be family content but contain sexual, violent, obscene or other mature themes not suitable for young audiences;
- Cyberbullying and harassment involving minors.
If content violates YouTube’s policy, it will be removed and an email will be sent to the creator.
A first offence will result in a warning with no penalty. Once three strikes have been tallied, the channel will be terminated.
Last week, YouTube had already begun limiting the monetisation of videos that include minors: those flagged as at risk receive limited or no ads, indicated by a yellow icon.