A coalition of advocacy groups filed a complaint with the Federal Trade Commission asking it to investigate Google’s YouTube Kids application for being unsafe for children.
Since its launch in February, the Google-powered application has been marketed as a safe, child-oriented video service meant to take a load off parents trying to limit screen time and exposure to inappropriate content. The complaint filed with the Federal Trade Commission, however, disputes the app’s safety, pointing to disturbing content available on YouTube Kids.
The Campaign for a Commercial-Free Childhood and the Center for Digital Democracy have kept Google’s application under scrutiny for some time now, and their findings are indeed disturbing.
The content on YouTube Kids may be user-generated, but that does not excuse the fact that it features no fewer than 13 Budweiser commercials, cartoons that can at best be described as pornographic, drug references, and other product placements. The content points to deceptive business practices by Google and to children being placed at risk of exposure to material seriously inappropriate for their age.
In a statement released on Monday, Google said that it is still working to make the application’s videos as family-friendly as possible. In the meantime, all feedback is taken into consideration and analyzed, resulting in the removal of videos flagged as inappropriate by users.
Shortly after YouTube Kids launched in February, product manager Shimrit Ben-Yair explained that a two-step process screens the 300 hours of video uploaded to YouTube every minute to find the child-friendly content meant to appear on YouTube Kids.
First, the video material is screened to
“algorithmically narrow it down to family-friendly content” she said.
Second, Google employees perform
“manual sampling for quality control, to see if it’s family-friendly”.
It remains unclear how such disturbing content surfaces on an application aimed at entertaining 5-year-olds and supposedly safe for them. The advocacy groups complain about pornographic content, as well as pedophilia jokes that children can reach with a single voice command. Then there are videos with explicit content depicting child suicide, dangerous experiments, and alcohol consumption, among other shocking material.
This is not the first time the issue of inappropriate and unsafe content has been raised in connection with Google’s YouTube Kids application.
In April, another coalition of child advocacy and consumer groups filed a separate complaint with the Federal Trade Commission, accusing Google of overly commercializing the app and exposing children to ads and promotional material they would never see on television.
YouTube Kids can easily be used by children up to 5 years of age, not through the classic search option but by voice command on a smartphone or tablet. That prospect has taken parents aback.
The only statement Google has made in answer to this concern was:
“For parents who want a more restricted experience, we recommend that they turn off search”.
Others are increasingly calling on Google to act. Two weeks ago, a petition appeared on Change.org asking Google to shut down the application. While many of the videos targeted by the petition have been taken down from YouTube Kids, new ones surface daily, according to KidKam, the petition’s initiator.
Image Source: thedrum.com