A US government committee has described YouTube Kids as a “wasteland of vapid, consumerist content”.
In a letter to YouTube chief executive Susan Wojcicki, the US sub-committee on economic and consumer policy said the platform was full of “inappropriate… highly commercial content”.
Google launched YouTube Kids in 2015 as a safe place for children to view appropriate content.
YouTube said it had worked hard to provide “enriching content for kids”.
In a statement, a YouTube spokesperson said: “Over the last few years, we’ve worked hard to provide kids and families with protections and controls that enable them to view age-appropriate content.
“We’ve made significant investments in the YouTube Kids app to make it safer, and to serve more educational and enriching content for kids, based on principles developed with experts and parents.
“Additionally, on YouTube, we do not serve personalised ads alongside ‘made for kids’ content, and apply additional protections to ensure we’re recommending age-appropriate content for kids and families.”
In 2019, Google agreed to pay $170m (£124m) in a settlement with the Federal Trade Commission for collecting and selling children’s data without parental consent. YouTube committed to privacy protections for children and the removal of personalised ads.
According to the letter, some videos appeared to be “smuggling in hidden marketing and advertising with product placements by children’s influencers”.
The letter claimed that one research team, which it did not name, found only about 4% of videos had a high educational value. Much of the rest was low-quality content, such as toy unboxings and videos of people playing video games.
It also said that one mother had reported a video that contained advice on how to commit suicide. After the video was reported, the letter alleges YouTube failed to remove it for eight months.
The letter asked YouTube to supply a host of documents, including:
- information about the top 200 channels and how much time on average children spent watching them
- a detailed explanation of how paid ads are selected for display to children
- an explanation of how the recommendation algorithm determines which videos to promote to children
- the number of videos removed as inappropriate from 2016 to 2020
- the number of channels or creators blocked over the same period