How Much Privacy Do Children Deserve Online?
We have the technology to know everything our kids do online. It's only a question of whether you decide to use it. But should you?
You cannot protect children from everything that might harm them. That's a hard truth to accept as a parent, and one that's even tougher to internalize when technology exists to convince you otherwise. It's especially insidious when that same technology promises to solve those harms at the expense of a child's privacy.
I started thinking about this when a press release crossed my inbox for an app called Bright Canary, which carried this alarmist headline: “Fed-up Parents Turn to BrightCanary's AI Monitoring Tool to Keep Kids Safe On Snapchat, Text Messages and Other iOS Apps.”
This comes after Louisiana sued Roblox, calling it “the perfect place for pedophiles.”
Bright Canary is, on some level, a fancy keylogger, an invasive piece of technology that can track what you type on a keyboard. (A useful way to nab passwords!) But Bright Canary calls their version a “secure keyboard” that “gives parents real-time insights into what their child types and sends.” It monitors everything you give it access to, and it uses “AI” to analyze your child's behavior and report on red flags.
“The app's AI monitors for bullying, explicit content and drug-related language,” says the company, “then delivers potentially life-saving alerts and emotional insights to help parents keep their kids on track.” The company also offers a service that includes access to images and videos sent via text message.
The pitch, though, is alluring. It feels safe, comforting—it feels right. Watch this:
But how different is a service like this from reading your kid’s diary?
And once you have access to all that information, can you trust yourself to turn it off?
I sent a simple question to Bright Canary about their pitch: “What’s the balance between protecting children and invading privacy?” Their answer was aggressive. You should read the full response from Bright Canary CEO and co-founder Karl Stillner:
“Nothing a child does on a device is private. Assuming otherwise can leave children vulnerable to exploitation, sexploitation and other serious risks. They need supervision. Some kids are quite literally dying from a misplaced belief in online privacy.
With grooming, mental health concerns, and drug sales happening on platforms like Snapchat, parents not only have the right — they have a duty to know if their 12-year-old is talking to a stranger online. Giving a child a smartphone is like dropping them off in the middle of a busy, unfamiliar city. And yet, many kids are allowed to message and explore online without supervision, even though there are billions of strangers on the other end of their smartphone.
We aren’t “peering in” as much as we’re adapting parenting for the digital age. Most parents know who their kids hang out with in real life, but that’s nearly impossible online unless parents have tools to stay involved. BrightCanary makes it easier. Our app summarizes a child’s digital activity across all their apps so parents can quickly see what’s happening, spot red flags, and start conversations. When parents need to access full messages, that information is available in BrightCanary — but for daily use, we encourage a privacy-respecting approach with our activity summaries.
We believe this is the best of both worlds: giving kids a measure of autonomy online while ensuring parents have the tools they need to keep them safe.”
Several lines here had me raising my eyebrow:
“A misplaced belief in online privacy”
“Nothing a child does on a device is private”
“We encourage a privacy-respecting approach”
“We aren’t ‘peering in’ as much as we’re adapting parenting for the digital age”
All of this suggests it's an either/or choice. Which makes sense: they want to sell you a service. Can you argue against protecting your children for only $99.99 per year?
People wax nostalgic about the latchkey kid era, when parents were only semi-present and children explored without a surveillance state. My own mom has told me a version of this, where she wandered the neighborhood all day and only ran home when she heard her mother's whistle smash through the air like crackling thunder.
As a kid, some of my fondest memories are, in fact, when I was let loose with a bike and a group of friends. We got into trouble. We pushed boundaries. We did stupid shit. I probably would have done less of that stupid shit if I knew I was being tracked.
Children, even in the internet age, deserve privacy—with caveats, as is always the case! They are owed a chance to be themselves without peering eyes. That doesn’t mean throwing up your hands and ignoring what your child is doing on the internet, but installing the equivalent of a vacuum into your child’s device feels a little much?
I know that I grew up in a different era of the internet, but being able to find myself in those spaces is precisely the reason I have a career, why I'm writing this newsletter. But that didn't mean my parents passed on their obligations to protect me. They asked questions about what I was doing online and who I was talking to. When I wanted to meet those people in real life, my parents went with me and met them, too.
I can imagine situations where a service like this makes sense. Perhaps your child is struggling with mental health. Maybe they're being bullied on social media. Maybe they're going down an extremist rabbit hole. I can conceive of scenarios where you would want to take a more heavy-handed, active approach. But the rhetoric from companies like Bright Canary suggests that parents who aren't doing this are themselves irresponsible.
Last week, I wrote about having my kids wear AirTags during a trip to Europe. I do not think it's helicopter parenting to have a vague idea of where my two young children are in a foreign country, where a stranger might not even speak the same language they do. We shoved the AirTags in a drawer when we came home. We'll use them again if we visit Japan.
We went to a city festival this past weekend and my wife asked if we should bring the AirTags with us. I thought about it, but I think it's important, at times, to feel a little stressed and uncomfortable about your children. It's okay if they go out of sight for a few minutes, if you're quietly wondering whether they can find their way back to where your group is hanging out at a busy event. You're building calluses. They're building skills.
I want to bring that same set of values to the internet, too, as best I can!
“A misplaced belief in online privacy” is why we're getting sweeping laws in places like the UK that use “protect the children” rhetoric to strip adults and children alike of their right to use the internet as they please. Those policies have spread to the United States and, naturally, resulted in restricted speech. You can protect kids and respect privacy in the same way that you can restrict phone usage in schools without pandering to conservatives who want to eliminate LGBTQIA folks from society.
The concern from parents is, obviously, understandable!
What our children do on the internet is, in many ways, a black box. Google tells me it’s possible to access a history of what my nine-year-old watches on YouTube, but what kind of parent is combing through their kid’s watch history after they’re in bed?
I wish YouTube sent me a monthly summary of what my kid is watching, e.g. “hey, your kid is really into so-and-so's channel, here are the keywords they're using.” It's less about my children ducking potential teenage Nazis and more that it would open up a conversation between us about what they're into. We don't allow YouTube on the main TV in the house, but I've wondered if I should relax that rule a bit, because watching in a shared space would give me better insight into what both of my kids are watching.
It is not unreasonable to want the algorithm to do enough on its own, but it often doesn't, and something like Bright Canary steps in to fill the gap. That video I embedded above promises to do exactly what I want YouTube to do, except that it also gives you an opportunity to dig much, much deeper into your kid's internet habits. This is all happening because YouTube and its ilk do not want to take on the responsibility.
I think most parents would tell you they're trying to keep an eye on what their kids are doing in these spaces, but the reality is that if your kid seems well-adjusted, you move on. You tick the parental-control and age-appropriateness boxes and get on with your life. How many parents even know Roblox and Fortnite have their own layers of parental controls, in addition to the controls on the device itself?
Bright Canary isn't the only company offering a service like this. Many are trying to fill the infuriating void left by Roblox, every social network under the sun, and internet companies in general simply not giving a shit about children. Change at those companies often only comes after nightmarish stories about the abusive behavior happening in these spaces.
Why wait for Roblox to act when a magic tool says it'll tell you if your kid is in danger?
Who could say no to that? And when would you decide to stop?
Have a story idea? Want to share a tip? Got a funny parenting story? Drop Patrick an email.
Also:
If you use one of these services, I'm not judging you. But I do want to hear from you about how it's gone and how you balance oversight with privacy.
Maybe I'll change my tune in a few years. My nine-year-old is in 4th grade, and the real tests of this approach will come in middle school.
My nine-year-old had a sleepover with several friends for her birthday party this past weekend, and two of them had Apple Watches. I keep waiting for my daughter to ask about getting one herself, but my wife and I have been pretty insistent that any permanent device won't happen until middle school. I feel like giving her an old junk phone without a data plan has satisfied her interest in such a device, even if it doesn't have the same utility as a “real” phone would.