The YouTube video platform is considering moving all of its children’s content to the YouTube Kids app in order to give its youngest users greater protection, following accusations that its existing recommendation algorithm makes it easier for pedophiles to find videos of children.
The YouTube Kids app was launched in 2015 and was designed so that parents have complete control over what their children see, through prior review and screening of content.
Tips to protect our children
YouTube also recommends taking some preventive measures, such as de-selecting the “auto-play next video” function in videos meant for children.
The platform’s chief executive sent an e-mail to employees with the company’s new mantra: “it’s not about free speech, but about free reach.” The phrase reflects the company’s intention to stop recommending videos that may be inappropriate for certain users, such as minors, rather than to delete them from the platform outright.
Other changes are also on the way, including some that will limit comments and engagement on these videos. Such videos will also be hidden from app users in the list of related videos.
A backdoor for pedophiles
YouTube’s automated system encourages the hundreds of millions of users who visit it daily to watch more videos through its suggestions, including, in some cases, videos showing partly clothed pre-pubescent children.
Families would sometimes upload private home videos, and the algorithm would then recommend them, without the creators’ consent, to people who had just watched sexually themed videos. The result was a catalog of videos that experts say sexualizes children.
A researcher at Harvard University’s Berkman Klein Center for Internet & Society came across some of these videos while investigating YouTube’s impact in Brazil, and found that YouTube’s algorithm is what connects these channels and serves pedophiles material that may be of interest to them.
Faced with this situation, YouTube made immediate changes and stopped linking some of the videos, attributing this to the regular maintenance of its algorithms rather than a deliberate policy change; the company also claims that protecting children is high on its list of priorities.
Stopping the rabbit-hole effect
YouTube has described its recommendation system as an A.I. that learns from the suggestions that keep users watching, and recommendations account for around 70% of all viewing on the platform – though that is all the company will say without revealing exactly how the system makes its choices. This is what is known as the rabbit-hole effect: videos or themes that drift toward extremes, designed to hook users.
Back in February, the media reported that sexual predators used video comment sections to guide other pedophiles to the parts of videos where children appear, in order to sexualize them. Users do not need to search for videos of children to stumble upon them, as YouTube will lead users to them through a string of recommendations.
Researchers claim that the only change that would stop this would be to completely remove the recommendation system from videos meant for children. Though the platform could do this automatically, removing it would hurt content creators who depend on views, since recommendations are the main driver of video traffic.