San Francisco: YouTube does not want to turn off its recommendation system despite concerns that the algorithm helps videos of young children spread among sexual predators, The New York Times reported.


By suggesting what users should watch next, YouTube's automated recommendation system drives most of the platform's billions of views. But researchers have found that the system can also recommend videos of girls as young as five or six wearing bathing suits, getting dressed, or doing splits.


But YouTube's teams do not want to turn off the automated system. Because recommendations are the platform's biggest traffic driver, the company said, removing them would hurt "creators" who rely on those clicks, according to the report on Monday.


Instead, the company will "limit recommendations on videos that it deems as putting children at risk," the Times wrote.


YouTube's terms of service state that children under the age of 13 aren't allowed to have their own accounts, but many of these innocuous videos are uploaded by older family members, The Verge reported.


In a blog post on Monday, YouTube said: "Responsibility is our number one priority, and chief among our areas of focus is protecting minors and families.


"We also enforce a strong set of policies to protect minors on our platform, including those that prohibit exploiting minors, encouraging dangerous or inappropriate behaviours, and aggregating videos of minors in potentially exploitative ways," it said.


In the first quarter of 2019 alone, YouTube said, it removed more than 800,000 videos for violating its child safety policies, the majority of them before they had 10 views.


To limit the risk of exploitation, YouTube disabled comments on tens of millions of videos featuring minors across the platform.