Parents want to keep their kids safe online. But are parental controls the answer?

A child smiles while using a tablet computer.
Credit: Nick Velazquez / Mozilla

For our new series The Tech Talk, we’re digging into the challenges technology poses for families and exploring all the ways it can empower them. So we looked into digital platforms and found ourselves, as many parents and other caretakers do, in the parental control settings.

These settings, along with services that promise to shield young people from “inappropriate” content, can give families comfort in the face of the infinite feed. They let adults limit screen time and restrict mature content (although the way platforms identify what that means is far from perfect). But it is not as simple as setting up parental controls and walking away. It’s important for families to both understand kids’ behaviors and explain to them why they’re using parental controls.

The capabilities of these tools, as well as their shortcomings, led us to one question: In a world where technology’s hold over everything we do seems uncontrollable, what does parental control even mean?


Jenny Radesky, who studies the intersections of child development, parenting and technology at the University of Michigan, takes issue with the phrase itself.

“Parental mediation is [a better] term, parental engagement is another – and probably better because it implies meaningful discussion or involvement to help kids navigate media, rather than using controlling or restricting approaches,” said Radesky, who has contributed to the American Academy of Pediatrics’ policy agenda on kids’ technology use. 

She pointed to research that suggests letting children manage their own media consumption may be more effective than parental control settings offered by apps. 

In one study called “Coco’s Videos,” researchers designed a video-streaming app for preschoolers. In it, a character named Coco interacts with children as they watch videos. The researchers had the kids use three different versions of the app.

In the neutral version, a child sees a large “home” button after watching a video playlist. That button leads them back to the beginning where they can make a new playlist. 

In the “post-play” version, a child sees the same home button. But this time, a small window embedded in the top right corner automatically plays a recommended video. The child can expand that window to full screen and keep watching, pause it, or go back to the home screen.

In the controlled version, a child is locked out of the app once they’ve finished a video playlist. After three minutes, the app resets and returns to the home screen. 

Researchers found that the post-play version that automatically plays another video, a feature used by platforms such as YouTube and Netflix, “significantly reduced children’s autonomy and likelihood of self-regulation, extended video-viewing time, and led to increases in parent intervention.” Meanwhile, the version that used a lockout mechanism didn’t cut down screen time or the likelihood of parent intervention. 

The study concluded that we don’t need to make additional tools to control excessive media use; we just need to stop creating experiences that encourage it. 

In another study, preschoolers and parents were asked to create a device-based playtime plan together. Researchers observed parent-child interactions and interviewed the parents afterwards. The experts found that children, with parental guidance during the planning phase, moved on to their next activity without their parents having to intervene 93% of the time.

Alexis Hiniker, who co-authored both studies, thinks that communication between parents and children about technology can be empowering. But platforms continue to be designed to capture as much time and attention as possible.

“Slapped on top of them are these lockout mechanisms and timers and say, ‘OK, now self-police yourself into not using the super enthralling things that we just put in front of you,’” Hiniker said. “I’m not a big fan of that approach. I don’t really think it’s working for families.”

How well parental control settings and apps work is one question. It’s also worth asking where the balance lies between protecting children online and encroaching on their independence and privacy. 

Parental controls and children’s privacy

Jason Kelley, a strategist for the nonprofit Electronic Frontier Foundation, which advocates for digital rights, worries that parental controls may be normalizing surveillance for children.

“You have to think of strict parental controls in a young person’s mind as essentially a parent sitting and watching them use the internet over their shoulder,” Kelley argued. “This can send a really bad message that safety is only available and possible through surveillance, and that’s simply not true.”

He acknowledged the good intentions behind efforts that seek to rein in social media and other digital platforms that increasingly take up young people’s time. But most parental control tools don’t recognize that a toddler and a teenager have different privacy needs, Kelley said, and filtering systems aren’t great at recognizing context. Not only could parental control settings block “thinspiration” posts, they could also restrict posts about body positivity.

“How we protect the mental health of young people online is a reasonable question,” Kelley said. “But there’s also the real question of whether Instagram is worse than, say, ‘Tiger Beat’ magazine. Our culture is set up in a way to make people feel bad about themselves. And Instagram is a reflection of that in some cases. But it’s impossible to eliminate bad things from someone’s online experience.”

Kelley said that, as a society, we want to instill in younger generations why privacy matters. He also underscored that not all adults act in the best interest of minors. And that risk is clear for marginalized groups.

The internet has risks, but so do parental controls

It’s important to realize that many kids in the LGBTQI+ community can be made vulnerable by tech monitoring tools, especially in a country where things like conversion therapy are still practiced, said Christopher Wood. He’s the executive director of LGBT Tech, which develops and offers technology resources to support LGBTQI+ communities.

He noted that outside the home, sensitive information about young people can already be exposed to teachers and campus administrators through the school devices they use. Wood said he runs a local LGBTQ+ center in Virginia, where he gets calls from young people getting kicked out of their homes because their families found out their sexual orientation or gender identity — most often through technology.

Recent legal developments, such as Florida’s “don’t say gay” bill and the Texas Supreme Court’s ruling on gender-affirming care for trans teens, heighten privacy concerns. Information exposed by a monitoring tool could land young people, their parents or their teachers in legal trouble, with LGBTQ+ youth at the most risk.

Wood understands the uncertainties families feel about technology and parents’ need to create a safe space for their children on the internet. 

“There’s a push and pull with any child and parent,” he said. “I’m a parent. For me, it’s about creating an opportunity where my child can feel safe to come to me and talk to me if they get into trouble, while also providing the opportunity for them to explore. Technology is only going to become faster, and it will continue to infiltrate more parts of their lives.”

Wood believes that education is key, and not just for kids. 

“Sometimes, I’m like, ‘What are you doing? How are you doing that?’” he said of his interactions with his children. “We need to create opportunities for parents to feel educated and feel like they’re not beating their heads against the wall.”

Researchers like Radesky and Hiniker agree that when it comes to technology and its effects on young people, the onus shouldn’t fall on parents. 

Hiniker said more experts like her are now exploring how platform design can support meaningful relationships within families and give power back to users, including children.

In the “Coco’s Videos” study, Hiniker observed how kids interacted with their parents after watching their videos. In the app, the character Coco reminds the child of their next activity, like sleeping, reading a book or going outside.

“I loved listening to these really sweet moments,” Hiniker recalled. “They’d say, ‘Hey, Mom, it’s time for me to go outside and you need to find my boots for me because it’s raining.’ Kids sort of understood that they’re in control, and it seemed really empowering to them.”

This design is different from our idea of parental controls, Hiniker said, but it could be our best bet in raising children who grow up to have healthy relationships with technology. 

In the end, parents may not be able to find foolproof parental control tools that can shelter their kids from the internet’s imperfections. But families can find comfort in the fact that the best control may be the one young people feel as they learn about the powers of being online – all while knowing that they have whatever support they need offline. 


The internet is a great place for families. It gives us new opportunities to discover the world, connect with others and just generally make our lives easier and more colorful. But it also comes with new challenges and complications for the people raising the next generations. Mozilla wants to help families make the best online decisions, whatever that looks like, with our latest series, The Tech Talk.
