Two Important AI Updates From the World of Roblox
One use of AI seems like it'll make chat better. Another use of AI is one the company probably shouldn't be encouraging.
The term “AI” has become misleading, confusing, and aggravating. “AI” can mean what an enemy does in response to the player in a game, the audio and visual output of generative “AI” technologies, corporate slop meant to send a company’s stock price soaring by mere AI association, and increasingly, somewhere vaguely in-between.
What’s happening in Roblox is somewhere vaguely in-between, as I outlined in a recent story. What’s worrying for parents is how often AI “upgrades” arrive without consent, and how companies assume the change is always a positive one.
There are two AI-related developments in Roblox I wanted to highlight.
AI Update #1: (More) AI Comes to Roblox Chat
First, the news: Roblox is deploying AI to “rephrase messages that break safety rules.”
In the past, if you said “HURRY TF UP,” aka “HURRY THE FUCK UP,” in public chat, the platform would try to catch the infringing language and change it to “####.”
If you’ve ever been in a popular Roblox game with a robust public chat, otherwise known as experience chat, it’s very common to see “####” spammed all over. (In most cases, I’m guessing it’s users pushing the limits of the system to troll everyone else.)
“To help keep chat flowing and respectful, we’re enhancing our chat filters and rephrasing profanity to words that are within our guidelines,” said the company in an announcement about the latest changes to Roblox chat moderation. “This is the first step on our long-term path to reducing #### for a more natural chat experience.”
Having automated moderation combing chat boxes in online games for swearing, harassment, bullying, and other issues is not exclusive to Roblox. It’s standard.
Repeatedly getting hit with “####” in chat results in a Roblox user being muted for longer periods, with the possibility of “suspensions or bans,” per Roblox’s policies.
Going forward, “HURRY TF UP” does not become “####”; it becomes “Hurry up!”
Well, “might become,” because Roblox says it’s “still at the beginning with rephrasing and providing more context to users when text is blocked.” It’s focused on “profanity.”
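To see why this is a meaningful shift, it helps to contrast the two behaviors. The toy sketch below is purely illustrative and is not Roblox’s implementation: the word list, function names, and the rephrasing logic are my own assumptions. The real system presumably uses a language model to preserve intent; this stand-in just strips the flagged word.

```python
# Hypothetical contrast between old-style masking and new-style rephrasing.
# BLOCKED, mask(), and rephrase() are illustrative assumptions, not Roblox's code.

BLOCKED = {"tf", "fuck"}  # toy word list for demonstration only

def mask(message: str) -> str:
    """Old behavior: blank out the whole message if any token is on the list."""
    tokens = message.lower().split()
    return "####" if any(t in BLOCKED for t in tokens) else message

def rephrase(message: str) -> str:
    """New behavior (approximated): keep the intent, drop the profanity.
    A real system would use a model to rewrite the sentence; this toy
    version simply removes blocked tokens and tidies the result."""
    cleaned = [t for t in message.split() if t.lower() not in BLOCKED]
    return " ".join(cleaned).capitalize() + "!"

print(mask("HURRY TF UP"))      # old result: "####"
print(rephrase("HURRY TF UP"))  # new result: "Hurry up!"
```

The difference for a reader in chat is obvious: one output carries no information at all, the other keeps the message flowing, which is exactly the “more natural chat experience” Roblox says it is after.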
When a correction occurs, the user who typed the phrase is told it’s being rephrased.
Over time, the company says, the goal is to entirely eliminate “####” from Roblox chat.
For the moment, it’s also limited to public/experience chat.
“In its current state, this technology is specific to in-experience text chat,” said a company spokesperson. “As a reminder, you can only chat after you age check and with those in your own age group and similar age groups, as appropriate—unless they are Trusted Connections.” (Trusted Connections is its own tier of “friend” on Roblox.)
AI Update #2: What Is an “Extended Interaction”?
When I last wrote about Roblox and AI, I noted how easy it was to find Roblox games connecting to chatbots like ChatGPT to allow a conversational back-and-forth. Roblox does not allow parents to explicitly block games that use AI, which means that even if you’re not allowing chatbot use in your house through ChatGPT or Gemini, it’s entirely possible your child is interfacing with a chatbot through a Roblox game.
What the company told me at the time was that they “require creators to disclose if they are using AI interactions in their experience, and creators who have extended AI interactions receive a content maturity rating of Restricted (18+).”
Naturally, I had a follow-up: How does Roblox define an “extended AI interaction”?
Here is what a company spokesperson told me in response:
“‘Extended interactions’ are defined as ongoing, unlimited conversations between users and AI systems. For example, if users spend most of their time talking to an AI NPC, this would be an extended interaction. However, brief conversations with an AI NPC during regular gameplay would not qualify as extended interactions. Any experience designed primarily for extended interactions will receive a ‘Restricted’ content rating.”
When you dig into an example of an AI-driven Roblox experience, like Talk to Character AI, you get immediate disclaimers about why it’s not “extended”:
[DISCLAIMER: The game meets all requirements to be classified as “AI Interactions (Limited)” because it’s not possible to save conversations, daily messages are limited, and therefore it’s not possible to interact with the AI continuously.]
That feels like splitting hairs?
Put aside that Roblox refuses to define what “brief” means; all of this only reinforces my desire for the company to add a parental control that blocks children from engaging with Roblox games that use AI, whether ChatGPT or Roblox’s own tools. I shouldn’t have to play whack-a-mole, blocking access by peering over my kid’s shoulder.
But hey, you’ve got me. I’ll keep peering over Roblox’s shoulder for you!
Have a story idea? Want to share a tip? Got a funny parenting story? Drop Patrick an email.
Also:
The quotes from Roblox’s Teen Council in their chat announcement are a little sus. But it’s probably how a teenager would sound filtered through corporate PR.
It does feel like my nine-year-old’s new Fortnite interest is sticking; we had an incredible moment this week I’ll share more about very soon. Letting it play out.
Naturally, our Fortnite interest comes as a price increase hits V-Bucks. Right now, we’ve punted on the Battle Pass because the current chapter is unappealing.