“Look!” my 10-year-old cousin exclaims, moments after arriving at my house to spend the night. She shoves her phone in my face, showing me a popular TikTok video from her feed. “Cool,” I respond indifferently. For the next 40 minutes, she shows me some of her own TikTok videos— and recent Instagram posts. The whole time, I consider her safety— she does not know what kind of strangers lurk behind the screen names of her hundreds of followers.
According to a Nielsen report released in 2017, 45% of US children received a phone with a service plan between the ages of ten and twelve. Ninety percent of their parents said they gave their child a smartphone before the child turned 13 (one year above the average age for receiving one’s first smartphone) because they wanted the ability to reach their child easily. But many parents do not expect their kids to do more than call them and play harmless games.
“That is a question that we get asked multiple times a day— ‘When is the right age to give your child a smartphone?’ And the frustrating part is that there is no good answer. Your child needs to be in a place where they can be responsible online and understand the risks… As a parent, you can’t stick your head in the sand. You have to know what you’re doing when you hand your child a smartphone,” Titania Jordan, Chief Marketing Officer of Bark, a parental control service, said.
A survey given to elementary schoolers in grades 1-4 revealed that 28 out of 43 students use YouTube, rendering it the most popular app among them, with Netflix as the runner-up. However, most students who said they own a tablet or phone report using it to play games.
“I like my computer because there is so much to do with it like for example you can play games, watch videos, and google stuff you need to know,” a third grader said.
A first grader shared that he values learning through using technology.
“You can do a lot on it and you can actually learn.”
Four students even said they enjoyed using technology to code.
“You get to code and make your own stuff on technology,” a second-grader said.
The dangers that elementary-school-aged children face on the internet differ from app to app.
On YouTube, children can view content intended for adults and teens. Kids often come across parody videos in which children’s cartoon characters use profanity and perform violent or lewd acts. One such video features Peppa Pig— an anthropomorphic pig from an animated children’s series of the same name— and her brother shooting each other with rifles and smoking. In one scene, Peppa, her eyes fiery and demon-like, wields a sword.
The controversial content doesn’t stop there.
Recently, parents have noticed suicide tips spliced into cartoons and gaming videos on YouTube and its family-friendly sister app, YouTube Kids. The videos appear normal at first, until about five minutes in, when a man appears and starts giving kids instructions on how to kill themselves.
“Remember kids,” he says while gesturing to his wrist, “[cut] sideways for attention, longways for results. End it.”
Younger users across various platforms tend to glorify content involving suicide and self-harm— so much so that parents and professionals began labeling it “suicide porn.” Its prevalence on Instagram drove British teen Molly Russell to take her own life in 2017. After her death, her father discovered she followed several accounts related to self-harm, suicide, and depression on the app.
“It’s got to be incredibly heavy to process the fact that people are hurting themselves or trying to take their own lives. Since it’s becoming more frequent, it normalizes [suicide] and desensitizes [children]… The first time you see it you’re incredibly shocked and disturbed, but the more and more you see it, you become [more numb]. We don’t need children becoming numb to something as staggering as suicide,” Jordan said.
Frequent cyberbullying on Instagram and Snapchat can also lead to “bullycide” among impressionable elementary schoolers.
Twelve-year-old Mallory Grossman died by suicide after relentless cyberbullying from her classmates on Instagram and Snapchat.
“Why don’t you kill yourself?” one bully allegedly wrote.
Ten-year-old Ashawnty Davis hanged herself after someone posted a video of her fighting a classmate at school on Musical.ly (a music video app now known as TikTok).
Parents may find it difficult to pick up on suicidal tendencies or nuances in behavior in children this young. Pediatricians advise parents to monitor their child’s social media presence and talk openly about depression.
“[Children] need to know how permanent [suicide] is, and how it’s not a glorious thing. Not everyone is going to get their own series on Netflix highlighting them and memorializing them. It’s permanent… and it’s just the worst thing ever,” Jordan said.
The app that arguably combines all of this and more, TikTok, boasts over 500 million monthly users, flashy filters, popular songs, and an endless stream of challenges and trends. It mainly attracts elementary schoolers— and, consequently, child predators.
According to TikTok’s guidelines, users must be 13 years of age or older to join. Yazmine B., a fifth grader who uses the app religiously, could barely contain her laughter when she told me how she made her account.
“I said I was born in 1835. They had [birth year options] going all the way to the first century,” Yazmine said.
On February 27, TikTok’s parent company, ByteDance, paid the Federal Trade Commission $5.7 million to settle allegations that TikTok illegally collected the personal information of its younger users.
In July 2018, Indonesia banned the app for containing “pornography, inappropriate content and blasphemy.”
According to the UK’s National Society for the Prevention of Cruelty to Children, one in twenty children surveyed had received a comment on a video they posted to TikTok asking them to take their clothes off.
What does this mean in a technology-driven world? How can the internet develop into a safer place for children?
As technology continues to permeate every aspect of our lives, more schools are starting digital literacy programs that instill safe, smart internet habits in elementary-school-aged children. Because children become acquainted with technology at such a young age, teaching them about internet safety often falls by the wayside. Jordan recommends that as children grow, parents openly discuss internet safety with them in terms they will understand.
“With [younger] children, I would start by telling them that not everything is as it seems online. There are tricky people— as they get older, you can expand on what that means— whether it’s trolls or online predators. But you start with the concept of tricky people, and because not everyone is who they say they are online, you have to guard your PII [Personally Identifiable Information],” Jordan said.
Parents can also install software that informs them of their children’s virtual activities. The Bark app allows parents to monitor what their kids see and post on over 30 apps and browsers, including TikTok, Instagram, Snapchat, and YouTube. For nine dollars per month per family, Bark alerts parents through email and text when their child comes across a potential danger, such as cyberbullying, sexual content, or drug use.
“Children believe they are invincible. That what we warn about won’t happen to them. That sites and apps that target children are safe— and they are not. Strangers are strangers,” Felicia Freeman, a parent, said.
For the latter half of Gen Z, technology has become a significant part of childhood. They have the potential to advance this technology further in the future— but it all depends on what we teach them in the present. Learning the importance of citizenship, both in the real world and online, will help them handle what they see on the internet and project positivity among their peers. Being mature enough to have a smartphone at such a young age means knowing when to turn it off.
“There’s real life and then there’s digital life, and digital life is not real life,” Jordan said.