YouTube’s comment section is one of the internet’s most infamous cesspools, much to the company’s chagrin. And it’s facing new scrutiny this week, following reports that Nikolas Cruz, the 19-year-old suspected gunman in a tragic school shooting that occurred in Florida this week, posted vague allusions to his intentions on various YouTube channels.

BuzzFeed News reported that a YouTube user with Cruz’s name commented that he was going to launch an attack on the school. Ben Bennight, a 36-year-old YouTuber, took a screenshot of the comment and sent it to both the FBI and YouTube. The FBI reportedly followed up with Bennight about the comment, and YouTube deleted the offensive message.

Questions are now swirling around whether YouTube is responsible for the comments left on its platform.

A history of trying to clean up the comments

YouTube has tried multiple times throughout the years to rectify its comment section. In September 2013, YouTube announced that new tools were being introduced to “help video creators moderate conversations for welcome and unwelcome voices.” Introducing a comment section powered by Google Plus, YouTube explained that users would see “posts at the top of the list from the video’s creator, popular personalities, engaged discussions about the video, and people in your Google+ Circles.” Think of the way comments appear on a celebrity’s Instagram post: the two or three highlighted comments are from people you follow or other celebrities.

Unfortunately, YouTube and Google’s plan didn’t work. Google Plus became the bane of most users’ existence and, most importantly, didn’t solve the problem of toxicity on the YouTube platform.

The company tried again in 2015, introducing a new ranking system that Kiley McEvoy, a product manager at the time, said would reduce the visibility of “junk comments,” noting that it “seems to be working.”

In 2016, YouTube tried fixing the comments section again by giving content creators more power. The company announced at that year’s VidCon that creators could appoint moderators to deal with nasty comments. By November, the company had introduced new moderation tools, like giving creators the ability to blacklist certain words so that they wouldn’t appear, and letting them opt into a program that essentially screened comments through YouTube’s algorithm.

“Comments identified by our algorithm will be held and you have the final decision whether to approve, hide, or report these comments,” a blog post reads. “We recognize that the algorithms will not always be accurate: the beta feature may hold some comments you deem fine for approval, or may not catch comments you’d like to hold and remove. When you review comments, the system will take that feedback into account and get better at identifying the types of comments to hold for review.”

The company became even more aggressive in 2017, following numerous reports that predatory comments were found on videos that featured children. In November of that year, YouTube announced that comments on specific channels would be turned off if predatory behavior began to occur, and that the company would work closely with the National Center for Missing and Exploited Children (NCMEC) and law enforcement if needed.

“We have historically used a combination of automated systems and human flagging and review to remove inappropriate sexual or predatory comments on videos featuring minors,” a blog post reads. “Comments of this nature are abhorrent and we work with NCMEC to report illegal behavior to law enforcement. Starting this week we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.”

The company has more than 1.5 billion users, and each video attracts even more comments.