If you're a parent, I have a question for you: do your children watch YouTube, or maybe scroll through Snapchat or Instagram? Chances are they do. The average child spends four to six hours a day online, watching cartoons and playing games. Most of it is harmless, and it's a great relief for parents because it keeps their children occupied. But what about the not-so-harmless stuff out there?
Tonight, we're concentrating on two prominent online platforms that can also harm youngsters: Snapchat and YouTube. Let's start with YouTube Kids, which was launched in 2015 and is basically YouTube with an age filter: no violence, no drugs, no pornography. At least, that's what YouTube claims. However, a US-based non-profit dug deeper into YouTube Kids and discovered videos promoting drug culture, skin bleaching, and even weight loss.
I'll go through some of the details. One video showed a cooking show in the style of Breaking Bad, with the host wearing a respirator like those used to make crystal meth. Another showed the infamous RV from Breaking Bad, the very one where the crystal meth is cooked. And just to be clear: Breaking Bad on Netflix is rated A, for adults only, but this material is free for kids to watch on YouTube.
Even songs can slip through the filter, like Eric Clapton's "Cocaine". Now, I'm no cyber expert, but a song called "Cocaine" cannot be appropriate for any youngster. You can put Looney Tunes or Peppa Pig visuals over it; it doesn't matter. I'll read out the words for you.
"If you get bad news, you want to kick them blues: cocaine. When your day is done and you want to run: cocaine." That's what the song says. Your child might be listening to this song, or ones like it, which is a nightmare for any parent. YouTube Kids appears to be an infinite pool of poison, one that is causing body image disorders in youngsters. Some videos teach children how to bleach their skin, while others challenge them to burn off their calories with a rhyme called "wiggle your jiggle": YouTube's lighthearted take on body shaming.
How is the firm reacting to this news? With copy-paste responses. Let me quote their spokesperson: "We have a higher bar for which videos may be part of the app and also empower parents to control what content their child can and cannot see." Empower parents?
Well, I've got some bad news for YouTube: your system failed. YouTube moderates material using a combination of machines and people, so my question is: why rely on algorithms? Your robots may be damaging someone's childhood, causing despair or anxiety. Are the revenues really worth it? In fairness, every social media site, not just YouTube, should reassess its technology.
Take Snapchat, for example. A US teenager is suing Snapchat for $5 million, claiming that the app's design led to her sexual abuse four years ago. She received messages from a random man asking her to share nude images of herself; she was only 12 years old at the time. She refused at first and blocked him on Snapchat, but the man kept messaging her on Instagram and through phony Snapchat accounts until she complied. He saved the photographs and then spread them all over the internet.
Look at the extent of the abuse here: a random person can message her, find her on Instagram, set up fake Snapchat accounts, and download her nude photos. It makes you question where the safeguards are. From what I can see, social media has devolved into a lawless jungle.
We are aware of the popularity of these sites: YouTube Kids has 35 million weekly users, and Snapchat has 300 million users, the majority of them young. For starters, we need to hold these companies accountable because, let's face it, they rushed these so-called kids' platforms to market. The question is how we make these platforms safe: the content, the interactions, everything. Remember Instagram Kids? It was shut down within four months, and now you have drug videos on YouTube and sexual exploitation on Snapchat. The reality is that these are imperfect products. Their algorithms aren't flawless and their material isn't safe, yet big tech continues to push them onto the market.
How can we safeguard our children against this toxicity? Ideally, big tech should change, but since that is unlikely to happen anytime soon, here are some options. One: parents will have to be more attentive about what their children see, perhaps curating the content themselves. Two: push governments to update the regulations.
Think about it: cigarettes cannot be sold near schools; that, at the very least, is regulated by governments. Yet crystal meth videos are OK? You carefully research and select school curricula, but songs about cocaine play freely on YouTube. Surely governments can do better. We need strict regulation to moderate content and, most importantly, we need more outreach. Imagine if this happened on television: if the Disney Channel, say, played a song about cocaine, or a cartoon character taught your children about skin bleaching. The outrage would be immense.
We need to apply the same pressure to big tech because, while commoditizing adults is bad enough, damaging children's lives is inexcusable.