FTC Rejects ESRB Proposal to Use "Facial Age Estimation" Technology (For Now)
The decision lands in a larger context: the US is woefully behind in proposing ideas to protect children online.
Full Disclosure: I occasionally partner with the ESRB to publish parent and family-focused blogs based on the various tools the ESRB provides game developers and parents.
Update (04/08/24): In response to this piece, the ESRB noted to Crossplay the organization does “not endorse the use of this technology with children, and that ESRB itself would not be using the technology” and “it would simply be one option for parents (among many).” The group disputed the interpretation of the technology as a means of “age gating children or otherwise managing their access to video games” and underscored the FTC’s vote “denied the application without prejudice, making it clear that it took no position on the ‘substantive’ merits of the application.”
***
The US Federal Trade Commission (FTC) has rejected a proposal by the Entertainment Software Ratings Board (ESRB) and its partners to use “facial age estimation” technology, which the ESRB suggested could make it easier to prevent children from circumventing age verification systems to buy inappropriate games.
Update (04/08/24): As noted above, the ESRB disputes this description; it does not endorse this specific application of the technology, nor does it recommend the technology as an age gate.
You can read the 2023 proposal, which was submitted to the FTC last summer, here. The ruling, which was 4-0 in favor of denial, was first reported by GamesIndustry.biz.
Such technology could work as a replacement for, or perhaps in tandem with, physical IDs, which is one of the hoops Roblox has you jump through if you want to see its mature content. Roblox also requires users to take a selfie for age verification.
The ESRB, largely known publicly as the organization that assigns content and age ratings on video game boxes, already plays in this realm, to a degree. Game companies can partner with the ESRB to be “privacy certified,” which grants them an additional badge for adhering to various online and mobile privacy regulations.
In the proposal, you can tell the ESRB was clearly trying to get ahead of understandable concerns about data harvesting when it comes to collecting images of children, and argued the technology was meaningfully different from facial recognition technology:
“When performing a new age estimation, the system extracts the portions of the image containing a face, and only those portions of the image are analyzed for matching patterns. To match patterns, each node in Yoti’s neural network performs a mathematical function on the pixel data and passes the result on to nodes in the next layer, until a number finally emerges on the other side. The only inputs are pixels of the face in the image, and the only outputs are numbers. Based on a review of the number patterns, the system creates an age estimation.”
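The quoted description amounts to a plain feedforward pass: pixel values go in, each node applies a mathematical function and passes the result forward, and a single number comes out. Here is a toy sketch of that idea. To be clear, this is not Yoti's actual system; the layer sizes, random weights, and ReLU activation are all invented here purely to illustrate the mechanics the quote describes.

```python
# Toy feedforward pass: pixels in, one number (an "age estimate") out.
# NOT Yoti's real model -- all weights and sizes are made up for illustration.
import random

random.seed(0)  # deterministic weights for the example

def forward(pixels, layers):
    """Push a list of pixel intensities through fully connected layers."""
    activations = pixels
    for weights, biases in layers:
        activations = [
            # each node: weighted sum of its inputs plus a bias, then ReLU
            max(0.0, sum(w * a for w, a in zip(row, activations)) + b)
            for row, b in zip(weights, biases)
        ]
    # the final layer has a single node, so "a number finally emerges"
    return activations[0]

def random_layer(n_in, n_out):
    """Build one layer of random weights and biases (placeholder values)."""
    weights = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]
    biases = [random.uniform(-1, 1) for _ in range(n_out)]
    return weights, biases

# 16 grayscale "face" pixels -> 8 hidden nodes -> 1 output number
face_pixels = [random.random() for _ in range(16)]
layers = [random_layer(16, 8), random_layer(8, 1)]
estimated_age = forward(face_pixels, layers)
print(round(estimated_age, 2))
```

The point of the sketch is the shape of the claim: the only inputs are pixel numbers, the only outputs are numbers, and nothing about the pipeline stores an identifiable image. Whether that framing fully addresses the privacy concerns is, of course, a separate question.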
Regulation in the United States is woefully inadequate, which means we’re mostly relying on platform holders—companies who also stand to profit off children—to do that job. You’ve probably heard of COPPA (Children’s Online Privacy Protection Act), which tries to address this, but COPPA was passed in [checks notes] 1998, and the last time there was a meaningful update to COPPA was [checks notes] a decade ago.
When the ESRB’s proposal became public, the organization took issue with how some people were characterizing the technology, and released a blustery statement to IGN in response.
“First and foremost, this application is not to authorize the use of this technology with children," said the group. “Full stop. Nor does this software take and store 'selfies' of users or attempt to confirm the identity of users. Furthermore, this application makes no mention of using age estimation to prevent children from purchasing and/or downloading restrictively rated video games, nor do we intend to recommend its use in that way."
Those comments were in June 2023. Nearly a year later, the proposal has been denied.
Update (04/08/24): As noted above, the ESRB points out the FTC “denied the application without prejudice, making it clear that it took no position on the ‘substantive’ merits of the application.”
Here’s how the FTC explained its decision:
“Under the COPPA Rule, online sites and services directed to children under 13, and those that have actual knowledge they are collecting personal information from children under 13, must obtain parental consent before collecting, using, or disclosing personal information from a child. The rule lays out a number of acceptable methods for gaining parental consent but also includes a provision allowing interested parties to submit new verifiable parental consent methods to the Commission for approval.
After receiving more than 350 comments, the Commission voted 4-0 to deny the application without prejudice to the applicants filing in the future, when the Commission anticipates that additional information will be available to assist the Commission and the public in better understanding age verification technologies and the application. In declining the application at this time, the Commission is taking no position on the merits of the application.”
A denial of this proposal is not a denial of the technology, or even the approach. It’s possible all parties involved could return with something that would pass the FTC.
Have a story idea? Want to share a tip? Got a funny parenting story? Drop Patrick an email.
Also:
- Biometric identification is where security is going, so I’m guessing we end up with something closer to this than not, perhaps with some more restrictions?
- Do you put your actual age in age gates? It’s a coin flip for me. Sometimes, I do my actual age out of natural reflex, before realizing it’s probably harvesting data.
- The US is not leading at all when it comes to regulation on this front, and we’re more likely to see trickled impacts from regulation that’s happening elsewhere.
I ALWAYS fake my age in age verification systems. If even by a day sometimes. Also give the wrong phone number when I register places. No company needs my data. I’m really against age verification systems. Most of it is just parents who can’t be bothered to help their children.
As someone who worked in machine learning for about a decade, I would absolutely *never* trust tech like this. It will work "fine" for maybe 95–98% of people (whose backgrounds are represented in the training dataset, and whose skin tone allows for sufficient contrast in the lighting, which are both major limitations). It will then miscategorize the remaining people (or fail to recognize their faces as faces entirely), locking them out of their games or whatever. And because it fails for only a small percentage of folks, companies will have little incentive to improve the technology for people like Patrick who look perpetually sixteen or whatever. (lol I'm right there with you buddy, that's one of the reasons I grew a massive beard)
Add to that the privacy concerns (which remain, in my mind, despite the ESRB's assurances) and the fact that once the tech is deployed, companies will have increased incentive to expand its use-case. ("Sure, we said we wouldn't use it to age-gate games, but now that everyone's used to it...") Just a no-go for me.
But like a lot of former software devs I'm leaning hard into Luddism as I age. My reflex take on stuff like this is always cynical. 🤷