This Company Is Betting AI Can Help Protect Your Kid While They Play Games Online
ProtectMe silently monitors what your child says—and what people say to them—and reports back. But at what point does software like this become an invasion of privacy?
In our house, we use Apple’s screen time features to monitor and control how much time our children spend on their iPads. But as I mentioned in a piece on Crossplay last week about the newest way my oldest child is finding novel loopholes around screen time, Apple’s features are, at times, really inadequate. They often feel designed to help adults control their own screen time, and my wife and I are often left wishing for a greater level of granularity, like more varied options for granting time.
There’s a wealth of options for families looking beyond what’s built into their devices, whether those come from Apple, Google, or someone else. Most solutions seem (understandably) focused on social networking websites, messaging, and web surfing. Games can be more complicated, especially because that gaming might be happening on a closed system like an Xbox or PlayStation. But even there, some companies, like Bark, offer hardware solutions to monitor what’s happening. Others, like Aura, are backed by celebrities like actor Robert Downey Jr. (??) and promise monitoring that goes beyond what games your kids are playing, including protection from identity theft.
“We tried [one monitoring software]—Norton and another one,” said Crossplay reader Robert F., when I asked about people’s experiences with software like this. “The issues were trying to find a balance between restrictions on their laptops and the school IT needing access, or the kids suddenly requiring something whilst at school. It became very frustrating, taking up our time having to lock, then unlock. I've largely given up because however much effort you put into it, there are always kids in class with unrestricted phones/computers. We are now down to the basic Apple controls, and the kids under 12 just have browsers removed.”
It’s a confusing marketplace of options, and likely to be a topic Crossplay revisits over time. If you and your family are using, or have used, these solutions, let me know.
“In our past life, most of us worked in national security,” said Ron Kerbs, CEO of Kidas, the company that develops the monitoring software ProtectMe, when I spoke to him recently. “I was leading R&D teams for the Israeli intelligence. One of our lead researchers was working in the child online protection bureau, a police department that is targeting online predators and their behavior online.”
Intense.
Kidas, for the moment, is focused on the PC, which means it’s aimed at children who are doing more than playing Roblox on their iPad, and are hanging out in places like Discord, too. (It does not run on Chromebooks yet, which are very popular with kids.)
It’s a piece of software installed on your PC that then, according to the company, “is prompted to turn on and analyze gaming communications for any potential red flags,” which can mean anything from sharing credit card information to financial scams to bullying. Each week, parents receive a report of their child’s activity, a sample of which can be seen here. It’s a general view, and does not include full transcripts.
The sample report shows some of what ProtectMe looks for and flags, such as:
Flaming: Another gamer directed insulting comments toward your child. Here are some recommendations on how to have conversations about this with your child.
Hate Speech: Another gamer expressed hate speech towards your child. We encourage you to speak to your child about it. Let them know that they can report the player as hate speech is not tolerated in video games.
If an event rises to a certain threat level, a parent receives a text message. Incidents are reviewed by Kidas to prevent the system from spamming parents with false positives.
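For what it’s worth, that flag-then-review flow maps onto a fairly simple pipeline. Here’s a rough Python sketch of the idea; the severity scale, the review step, and every name in it are my own invention for illustration, not anything Kidas has published.

```python
# Hypothetical sketch of a flag -> review -> notify pipeline.
# The severity scale, review step, and every name here are invented for illustration.

from dataclasses import dataclass

ALERT_THRESHOLD = 4  # e.g. 1 = mild flaming, 5 = credible threat or scam


@dataclass
class Incident:
    child: str
    category: str  # "flaming", "hate_speech", "scam", ...
    severity: int  # 1-5


def handle_incident(incident: Incident, reviewed_ok: bool) -> str:
    """Decide what happens to a flagged incident.

    reviewed_ok: True if a human reviewer confirmed it isn't a false positive.
    """
    if incident.severity < ALERT_THRESHOLD:
        return "weekly_report"   # shows up in the parent's summary
    if not reviewed_ok:
        return "discarded"       # human review filters out false positives
    return "text_parent_now"     # immediate text for confirmed, serious events


print(handle_incident(Incident("kid", "flaming", 2), reviewed_ok=True))   # -> weekly_report
print(handle_incident(Incident("kid", "scam", 5), reviewed_ok=False))     # -> discarded
print(handle_incident(Incident("kid", "scam", 5), reviewed_ok=True))      # -> text_parent_now
```

The notable design choice here is the human gate: automated flags only reach a parent’s phone once someone has confirmed they aren’t noise.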
This is not the light touch that’s applied by built-in screen time controls, which largely focus on time management and binary app control. Products like ProtectMe are purposely invasive, and spend time monitoring the elements of your child’s gaming activities that you might not be privy to. Would your child let you stand over their shoulder and listen to their voice chat interactions? Probably not. Would they let you install a piece of software that quietly does exactly that in the background? Maybe.
“So much of parenting is learning how much freedom to give your kid and how much we can trust them,” said Ash Brandin, a middle school teacher known as “The Gamer Educator,” who helps parents navigate games and tech. “Technology is trying to make it so we could potentially give them more freedom with less trust, but that backfires if it prevents us from building that trust organically.”
Should you use something like this? That’s a harder call, and it’s easy to imagine a child viewing this as spying, because functionally that’s what the software does, even if it’s understandable why a parent would want to know if their child is being harassed.
“It’s a tool and all tools can be used in many ways,” said Brandin. “If a parent is juggling work and home and multiple kids and they just cannot be present to monitor chats, then maybe this is a way they can build in a bit of accountability and a potential safety net if they needed it. But it could also veer into an invasive practice that erodes trust between a child and their adult.”
Kerbs doesn’t recommend parents install ProtectMe without telling their children, viewing the introduction of something like ProtectMe as part of a larger conversation.
“I think a lot of them are not even aware of the risk in gaming,” said Kerbs. “They also [care less] about being monitored on gaming compared to social media. They don't view gaming as a private space, they use social media or texting apps as [a] private space. They view gaming as a public space.”
If a child doesn’t like it, though, ProtectMe is reportedly very difficult to uninstall, and any attempt to remove it will notify the parents.
ProtectMe can oversee text and voice exchanges in most popular games, but the company has also partnered with Overwolf, a platform for building apps and mods that, according to Kerbs, allows Kidas to monitor in-game activity at a deeper level.
(Overwolf is located in Tel Aviv, Israel. Kidas itself has a history with the area, given Kerbs and most of the Kidas staff worked for Israeli intelligence. The partnership between the two companies is more than collaboration; Overwolf invested in Kidas.)
Kerbs said Kidas’ algorithms, combined with the Overwolf integration, are focused on deep context.
“[It] allows us to integrate to the gaming APIs to know exactly what is happening within the game,” said Kerbs, “so who's shooting at who, who's playing at the same side, what games are they playing, if someone is killing someone in the game—all of the events from within the game. Then, we combine it with the voice conversation, text conversation, and the context of the game. You can do a lot of advanced analysis, like perceiving age detection, perceiving emotion detection. If somebody is sad, crying, if someone is happy. Then, we combine all of those factors into understanding if someone is really in danger, or just shooting at each other and telling each other ‘Yeah, I'm gonna kill you! I'm gonna kill you!’ But they're playing Fortnite.”
In that case, ProtectMe would not perceive “I’m gonna kill you” as a real threat.
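To make that concrete, here’s a rough sketch of how a context-aware filter along the lines Kerbs describes might weigh a violent-sounding phrase against game context. This is not Kidas’ actual code; the game list, emotion score, and thresholds are all invented for illustration.

```python
# Hypothetical sketch: deciding whether "I'm gonna kill you" is a real threat.
# The game list, emotion score, and thresholds are invented for illustration;
# Kidas has not published how its models actually work.

COMPETITIVE_SHOOTERS = {"Fortnite", "Valorant", "Call of Duty"}


def assess_threat(message: str, game: str, speakers_are_opponents: bool,
                  emotion_score: float) -> str:
    """Return 'alert', 'log', or 'ignore' for a single chat line.

    emotion_score: 0.0 (calm) to 1.0 (distressed), e.g. from a voice-emotion model.
    """
    violent_language = any(phrase in message.lower()
                           for phrase in ("kill you", "hurt you", "find you"))
    if not violent_language:
        return "ignore"

    # Trash talk between opponents in a shooter, said calmly, is usually banter...
    if game in COMPETITIVE_SHOOTERS and speakers_are_opponents and emotion_score < 0.5:
        return "log"  # keep it for the weekly report, but don't page a parent

    # ...while the same words outside that context, or paired with real distress,
    # get escalated for review and a possible alert.
    return "alert"


# The same sentence, two different contexts:
print(assess_threat("I'm gonna kill you!", "Fortnite", True, 0.2))  # -> log
print(assess_threat("I'm gonna kill you!", "Roblox", False, 0.8))   # -> alert
```

The point of the sketch is the one Kerbs makes: identical words get different treatment depending on which game is being played and how the speakers actually sound.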
The data collected is not shared outside of Kidas and the parents, according to Kerbs, but it is used to inform their algorithms, and you cannot opt out of this component.

“We don't use it for any marketing purposes, we don't share it with any third party,” he said. “We actually support the two main privacy laws in the US, the Children's Online Privacy Protection Act and CCPA, the California Consumer Privacy Act. It means that if parents want to get access to their child's information, they can request it. If they want us to delete it, we will delete it. After one year, we will delete everything. We keep some information in case we're required to do that, because of a parent's request or something like that. But at the end, we save it for internal purposes to improve our algorithms and at the end then don't use it anymore.”
ProtectMe is available through Kidas, but also through another popular solution, Aura.
“Gaming is a core component of the ways that kids spend their time on devices, but it’s often overlooked when discussing child ID protections and parental controls,” said Aura CTO Ryan Toohil. “There’s a lot of focus on social media, but what many don’t realize is that kids encounter the same challenges when gaming as they do on social media.”
In 2021, Aura acquired Circle, another screen time service. Aura also has an “innovation fund” for companies working on solutions for “vulnerable communities, like children.” This is how Aura became interested in partnering with Kidas.
“We saw a great opportunity to partner with Kidas to accelerate research and development into the ways AI can be harnessed for online safety, while also delivering immediate value to parents using Aura’s solution,” said Toohil.
When using ProtectMe through Aura, Kidas still handles the heavy lifting. Its algorithms do the monitoring, and Kidas still signs off on any alerts sent.
Toohil also defended both companies’ record on data collection.
“We do not sell user data for any reason,” said Toohil. “We have features designed to consistently monitor for customer data on data brokerage sites and have that information removed, as well as alert users if any information is found on the dark web and provide resources to ensure that the customer is not left vulnerable as a result of that information leak.”
Kerbs also doesn’t believe ProtectMe, or any software like it, is permanent. He doesn’t think kids will be going off to college with monitoring software on their computers. His view, instead, is that ProtectMe can work in tandem with parents to provide guardrails.
“It's the same as teaching your child how to drive,” said Kerbs. “At the beginning, you sit right next to them, and you teach him and you view everything that he's doing. And then after that, you feel confident enough to let them drive by themselves. We view the software the same.”
Also:
These all feel a step too far for me, personally. But it’s also true my kids aren’t on voice chat yet, and most of their multiplayer is local or with friends that we know.
I’m going to start publishing guides to the screen time options on different platforms. People keep talking about the Switch, so I might start there.
Online bullying is what spooks me most about today’s media world. Growing up is hard enough, but doing it when social media is involved? Yuck.
Have a story idea? Want to share a tip? Got a funny parenting story? Drop Patrick an email.