
Pediatrician mom warns parents after finding 'scary' content on YouTube Kids

Think you know what your kids are watching on YouTube Kids? This mom was shocked by what she found.
Source: TODAY

This story discusses suicide. If you or someone you know is at risk of suicide, please call the U.S. National Suicide Prevention Lifeline at 800-273-8255, text TALK to 741741 or go to SpeakingOfSuicide.com/resources for additional resources.

Last summer, another physician and mother alerted Ocala, Florida, pediatrician Dr. Free N. Hess, D.O., the founder of the child safety website PediMom, to a chilling clip she accidentally spotted on YouTube Kids.

While sitting with her child and nursing his nosebleed, the mom was shocked when the "Splatoon" cartoon he was watching was interrupted. For eight seconds, the doctored video showed an adult man who walked on screen, addressed "kids," and graphically described ways they could slit their wrists. The man then walked away and the cartoon resumed. (Today.com has reviewed the video in question. It is graphic and disturbing. We are not linking to it because we do not want to promote the content.)

The stunned mother wrote an anonymous blog post detailing her experience for Hess's site, and Hess and her online communities reported the clip and complained until YouTube Kids took it down. However, Hess soon saw it pop up again in a different YouTube Kids video.

Hess said this particular distressing clip may be circulating, spliced into cartoons on YouTube or YouTube Kids, at any given time. This past month, she began campaigning in earnest for YouTube to work harder to protect children.

Using the hashtags #YouTubeWakeUp and #ProtectOurKids, Hess has been urging YouTube to take problematic material down — and quickly. "They're much, much better at taking down flagged material on YouTube Kids than they are on YouTube itself," Hess noted to TODAY Parents.

"We work to ensure the videos in YouTube Kids are family-friendly and take feedback very seriously," a YouTube spokesperson told TODAY Parents in an email. "We appreciate people drawing problematic content to our attention and make it possible for anyone to flag a video.

"Flagged videos are manually reviewed 24/7, and any videos that don't belong in the app are removed," the spokesperson said. "We’ve also been investing in new controls for parents including the ability to hand pick videos and channels in the app. We are making constant improvements to our systems and recognize there’s more work to do." YouTube did not comment on how the concerning material is added to videos.

Hess wrote a post displaying videos she easily found on YouTube Kids just this month that feature different methods of suicide, a character named "PedoBear" that serves as a sort of "pedophile mascot," and videos glorifying school shootings, all tucked into videos that would attract children.

According to YouTube, the YouTube Kids app is designed to be a family-friendly place for kids to explore their interests. The videos available on YouTube Kids are chosen through a mix of automated systems with human oversight, user input, and human review. However, if a parent turns the search function on, children can independently access millions of YouTube Kids videos beyond their home screens. By turning the search function off, a parent can restrict a child's access to a more limited set of videos.

But YouTube.com itself is not intended for kids under the age of 13, and children under that age are not allowed to create or own user accounts. If YouTube identifies an underage account holder, it will review the account and terminate both the YouTube and Google accounts if the owner cannot provide legal verification that they meet the minimum age requirement.

Hess, who has an 8-year-old daughter herself, said YouTube needs to do more to keep children safe. "I don't know that I can honestly say right now that YouTube Kids is a safe platform for kids to use, because if it is this easy for me to find inappropriate and, quite honestly, scary content, I don't know that I can recommend it to parents," she said.

"YouTube continuously says they do their best to remove flagged content, but the truth is that by then, it's too late," she said. "If somebody's flagging it, the likelihood is at least one child saw it — probably more than one."


Hess said her own daughter is not allowed to use social media at all. "She's very restricted," she said. Though the family uses a monitoring app, "she doesn't use the computer at all unless we're sitting right next to her."

How can parents help keep their children safe? First and foremost, said Hess, they must educate themselves. "We're digital immigrants, but our kids are digital natives," she said. "There is a big disconnect and a lot we don't understand. Educating ourselves is hugely important."

She also said parents need to realize that it is impossible to keep children from seeing something inappropriate, whether on their own devices or their friends', so they need to open up the lines of communication with their children early and often.

"This is their life, and they are growing up with it," she said. "We need to accept that, and the safest way possible to do that is to teach our children about the possibility of inappropriate content online so we can talk about it when it comes up."

But she's not letting YouTube off the hook. "There's gotta be a better way to be doing this," said Hess. "We have to find a better meeting ground to where they are still able to run their business, and yet our kids are safe.

"We don't have a choice but to find a better way, because the risks of what our kids are being exposed to are just too high."