The latest round of unsettling internet challenges has taken a dangerous turn, with multiple “kid-friendly” videos teaching children how to commit suicide or encouraging them to commit high-risk acts.
A trend called the “Momo Challenge” has been stirring up fervor in recent weeks, even though the character — a gaunt, terrifying doll that asks the viewer to participate in challenges that range from innocuous to deadly — has been appearing around the internet for at least a year, according to international police agencies and news outlets.
Momo, with its bulging eyes and stringy hair, reportedly appears on sites or apps like WhatsApp, Facebook and YouTube, sometimes in conjunction with kids' videos meant to depict the popular game Fortnite or kids' show character Peppa Pig.
The actual visage of Momo depicts a sculpture called “Mother Bird,” made by a Japanese special effects company called Link Factory, which is not associated with the challenge itself.
The person or people masquerading as the character encourage video viewers to contact them via a WhatsApp number. Then they, as "Momo," ask people to complete challenges, some of which involve self-harm or suicide, such as taking pills, according to international reports. Other reported challenges include turning on the oven at night.
Momo also tells viewers that she will “curse them” if they don’t do what she says, and encourages them not to tell anyone about the challenge.
Some have decried Momo as a hoax, while others insist their children have been in contact with those behind the mysterious character. The shifting, bottomless nature of internet content, plus the tendency of YouTubers and others to quickly capitalize on scary memes, makes it difficult to pin down the origin of the trend or measure its impact.
However, the challenge has been linked to several suicides worldwide, including two children who killed themselves just days apart in September in Barbosa, Colombia, according to reports from England's Daily Mail. The boy, 16, reportedly got the younger girl, 12, involved with the game before they both died. Police found messages associated with the game on the children's phones, reports say.
“Our advice as always, is to supervise the games your kids play and be extremely mindful of the videos they are watching on YouTube,” said the Police Service of Northern Ireland in a post about the challenge Saturday. “Ensure that the devices they have access to are restricted to age suitable content.”
Parents and others have reported the videos to Facebook, YouTube and other sites, and while some have been taken down, others remain live online, and removed ones sometimes reappear.
Unrelated suicide messages have been found in popular children's videos depicting Splatoon, a kids' game in which squid characters squirt ink at each other. A character called Filthy Frank, created by former YouTuber George "Joji" Miller, appears midway through the video and seems to give kids advice about how to slit their wrists.
It’s unclear how the disturbing clip was included in a video meant for kids.
“End it,” he says at the end of the 11-second segment, which then cuts back to the Splatoon video.
Free Hess, a Florida-based pediatrician and mom who runs her own website PediMom.com, said she first learned of the video, which had the suicide instructions edited in, about seven months ago from a concerned parent.
Hess said although the clip was removed from YouTube Kids — a version of YouTube available as an app billed as kid-friendly — it had resurfaced on YouTube. A second video has also been removed from YouTube.
“There has to be a better way to assure this type of content is not being seen by our children,” said Hess in a blog post published last Friday. “We cannot continue to risk this.”
In a statement, YouTube said any videos that don’t belong in the app are removed, and the service has invested in additional parental controls to tailor the user experience more closely.
“We work to ensure the videos in YouTube Kids are family friendly and take feedback very seriously,” said YouTube.
Last year, the YouTube Kids app was slammed by critics for allowing several videos to infiltrate the app that were not appropriate for kids. YouTube’s parent company, Google, responded with an update allowing parents to curate the app with more kid-friendly channels such as Sesame Street.
The suicide rate in the U.S. has increased in recent years, including among minors.
Resources to help:
Suicide Lifeline: If you or someone you know may be struggling with suicidal thoughts you can call the U.S. National Suicide Prevention Lifeline at 800-273-TALK (8255) any time of day or night or chat online.
Crisis Text Line provides free, 24/7, confidential support via text message to people in crisis when they text 741741.
Contributing: Brett Molina.