Roblox plans to use AI to estimate user ages, but the Australian Labor government thinks more should be done to protect young people and that the current solution offered by Roblox is insufficient. There is still debate over whether or not Roblox should count as "social media" and be included in the new age-restriction laws.
Roblox rolling out new safety measures to stop kids chatting with adults has done little to win favour with Labor, with the Albanese government saying all digital platforms should be proactively protecting "young Australians".
[...] The new measurers, which start in the first week of December, include age-based chats that restrict players from speaking to people outside their age group.
[...] Despite having social elements, Roblox insists it is not a social media platform.
The eSafety Commissioner agrees but is reviewing whether to include it in the social media ban.
(Score: 5, Funny) by jb on Monday December 01, @07:57AM
It's a sad day when even News Corp can't afford to hire journalists who can spell...
...or, given that "measurers" could perhaps be regarded as a rough synonym for "rulers", are they giving us a hint to expect some sort of coup in Canberra this week?
(Score: 4, Interesting) by pTamok on Monday December 01, @08:09AM (6 children)
I guess that the people operating Roblox want to avoid using a 'government ID' infrastructure, with all the requirements that that entails, so using machine learning for pattern recognition is a less bureaucratic approach that will make mistakes, but also will act as a porous barrier for under-age users.
The government, naturally, wants to be able to point to a system claimed to make no false positives (that is, one that never mistakenly assesses an under-age user as being above the cut-off age).
From a harm reduction perspective, a machine learning system will give a lot of benefit for little outlay, especially if the assessments made by the system are appealable: that is, if it falsely claims you are below the cut-off age, you can get that corrected.
And there is no need to falsely ascribe intelligence to machine-learning pattern-matching software - such software has been around for decades, used as a tool in many fields.
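The trade-off described above (a porous, appealable ML gate rather than hard ID checks) can be sketched in a few lines. This is purely illustrative: `estimate_age`, the cut-off value, and the appeal set are all invented stand-ins, not anything Roblox has announced.

```python
# Hypothetical sketch of a porous ML age gate with an appeal override.
# estimate_age() stands in for whatever pattern-recognition model the
# platform might run over chat and usage signals.

CUTOFF = 13          # assumed age threshold for open chat
APPEALED = set()     # user IDs whose appeal was upheld by a human

def estimate_age(user_signals):
    # Placeholder "model": a real system would score behavioural
    # patterns; here we just read a precomputed estimate.
    return user_signals.get("estimated_age", 0)

def may_use_open_chat(user_id, user_signals):
    if user_id in APPEALED:
        return True                    # human review overrides the model
    return estimate_age(user_signals) >= CUTOFF

# A false claim of being under-age is correctable via appeal:
print(may_use_open_chat("u1", {"estimated_age": 11}))  # False
APPEALED.add("u1")
print(may_use_open_chat("u1", {"estimated_age": 11}))  # True
```

The point of the appeal set is exactly the harm-reduction argument above: the classifier can be wrong cheaply, as long as its mistakes are cheap to reverse.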
(Score: 2, Funny) by Anonymous Coward on Monday December 01, @08:37AM
Similarly a kid who clicks "No" in response to "Are you 18 years of age or older?" is likely not ready for the 18 years or older stuff.
Of course, not all who pass such tests are ready but at least we eliminate those who are more likely to not be ready.
(Score: 4, Insightful) by shrewdsheep on Monday December 01, @09:35AM (2 children)
I'm wondering how an ML approach could be good enough. If it's quiz questions, answer lists will be posted on the internet in short order. Likewise for chatbot sessions. What remains is usage patterns, and I believe that would be very error-prone too. It would also result in pop-ups: Unfortunately, you have been detected to be younger than 18 yrs of age. Logging you out.
(Score: 3, Insightful) by VLM on Monday December 01, @03:42PM (1 child)
AI analysis of posted content would probably work pretty well.
(Score: 4, Touché) by Anonymous Coward on Monday December 01, @06:28PM
> AI analysis of posted content would probably work pretty well.
Joke's on you, I can think of one orange hair guy who posts frequently at a pre-teen level. Sure would be funny if he got booted for being underage.
(Score: 4, Interesting) by corey on Monday December 01, @08:05PM
I said it and will keep repeating it: these companies just need to use a 4-5 question quiz at the start with general knowledge questions that typically only adults would know the answers to. Just like the original Leisure Suit Larry. Minus the Alt-X bypass hack.
It’ll either stop kids getting in, or consume too much focus and they’ll give up and go somewhere else - problem solved.
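The Leisure Suit Larry-style gate described above is simple to mock up. Everything here is invented for illustration: the questions, the pass mark, and the function names are assumptions, not an actual implementation.

```python
# Sketch of a general-knowledge age quiz (all questions made up):
# ask a few things adults are likelier to know, gate on a pass mark.

QUIZ = [
    ("In what decade did the Berlin Wall come down?", "1980s"),
    ("What is a mortgage used to buy?", "house"),
    ("How many weeks of annual leave do full-time Australian workers typically get?", "4"),
]
PASS_MARK = 2  # allow one wrong answer

def passes_quiz(answers):
    # answers: one string per question, matched case-insensitively
    score = sum(1 for (_question, correct), given in zip(QUIZ, answers)
                if given.strip().lower() == correct.lower())
    return score >= PASS_MARK

print(passes_quiz(["1980s", "house", "4"]))   # True  (3/3 correct)
print(passes_quiz(["1990s", "car", "0"]))     # False (0/3 correct)
```

Of course, as noted elsewhere in the thread, answer lists for any fixed quiz would end up on the internet within days, so this only works as a speed bump, not a wall.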
(Score: 3, Informative) by driverless on Tuesday December 02, @02:05AM
Problem is, Roblox as it's currently run is a pedo buffet. If you've got a young kid (niece/nephew, grandkid, whatever), let them show you the sort of stuff they get exposed to on there, and the creeps who contact them. "AI age verification" and whatnot are just red herrings to draw attention from the deeper problems the platform has.
(Score: 0) by Anonymous Coward on Monday December 01, @03:34PM (3 children)
This one is suffering from too much blockage, needs a laxative, at least..
Besides, the kids are the wrong target. People over 18 are the much bigger problem
(Score: 2, Insightful) by Anonymous Coward on Monday December 01, @10:17PM (1 child)
The internet is not the problem. The billionaires (and indirectly their bought politicians) are.
Kill the billionaires (financially or literally, I don't care) and politicians stop being bought by billionaires. No matter how complicated you want it to be, it's not complicated.
(Score: 3, Interesting) by wirelessduck on Tuesday December 02, @11:04AM
(Score: 3, Insightful) by jb on Tuesday December 02, @07:51AM
There's no technical barrier to stop anyone who wants to from going back to building ad hoc networks of UUCP links and running store-and-forward implementations of mail & news (and all those '80s hacks like ftp-over-mail that went with them). Such networks require no centralised infrastructure at all so are extremely difficult to regulate.
But that only gets you freedom from pernicious regulation: one of the biggest benefits of anything that's built by engineers for engineers.
On the other hand, many of the people complaining loudest about the most recent round of pernicious regulations seem to be more interested in access to hundreds of millions of eyeballs.
Those two design goals are not really compatible with each other.
(Score: 5, Insightful) by VLM on Monday December 01, @03:40PM
Take out the chat.
I did an online degree, graduated in the mid 00s, and it was interesting to compare with my kids "covid vacation" online classes around 2020.
"Back in the day" online school software had to include everything. I don't remember a full office suite but we definitely had email and chat and video conferencing and a calendar system and and and.
My kids had something much more modern: basically a wiki with a login. If you needed to email an assignment to the teacher you went off platform and used ... real email. If you needed to video conference for a team project, instead of using the platform's shitty conference system you went off platform and used ... real video conferencing apps.
My guess is in the long run the meme of "all video games must include a shitty partial clone of Discord" will go away and games will be games and you'll use a chat program if you want to chat. Maybe with an integration. But "all games must have their own shitty chat system" is lame.