"This bill seeks to set a new normative value in society that accessing social media is not the defining feature of growing up in Australia. There is wide acknowledgement that something must be done in the immediate term to help prevent young teens and children from being exposed to streams of content unfiltered and infinite.
(Michelle Rowland, Minister for Communications, Australian Parliament, Nov 21)
Australia's House of Representatives has passed a bill that would ban access to social media platforms TikTok, Facebook, Snapchat, Reddit, X and Instagram for youngsters under 16. The bill passed by 102 votes to 13.
Once the bill gets through the Senate -- expected this week -- the platforms would have a year to work out how to implement the age restriction, without using government-issued identity documents (passport, driving licenses), and without digital identification through a government system.
The leaders of all eight Australian states and mainland territories have unanimously backed the plan, although Tasmania, the smallest state, would have preferred the threshold be set at 14.
There are some counter-noises though (no, not you, Elon). More than 140 academics signed an open letter to Prime Minister Anthony Albanese condemning the 16-year age limit as "too blunt an instrument to address risks effectively."
The writers of that open letter fear that the responsibility of giving access to social media will fall on the parents, and "not all parents will be able to manage the responsibility of protection in the digital world".
Further, " Some social media 'type' services appear too integral to childhood to be banned, for example short form video streamers. But these too have safety risks like risks of dangerous algorithms promoting risky content. A ban does not function to improve the products children will be allowed to use."
The open letter pleads instead for systemic regulation, which "has the capacity to drive up safety and privacy standards on platforms for all children and eschews the issues described above. Digital platforms are just like other products, and can have safety standards imposed."
Australia's ban on social media will be a world-first, with fines of up to 50 million Australian dollars for each failure to prevent young people under 16 from having a social media account.
From "ban children under the age of 16 from accessing social media" we also get the following:
Under the laws, which won't come into force for another 12 months, social media companies could be fined up to $50 million for failing to take "reasonable steps" to keep under 16s off their platforms. There are no penalties for young people or parents who flout the rules. Social media companies also won't be able to force users to provide government identification, including the Digital ID, to assess their age.
Social Media, or an "age-restricted social media platform" has been defined in the legislation as including services where:
- the "sole purpose, or a significant purpose" is to enable "online social interaction" between people
- people can "link to, or interact with" others on the service
- people can "post material", or
- it falls under other conditions as set out in the legislation.
(Score: 3, Insightful) by pTamok on Friday November 29, @02:29PM (15 children)
Is SoylentNews 'social media', according to the definitions used by Australia?
(Score: 4, Informative) by janrinok on Friday November 29, @02:38PM (14 children)
Their definition covers almost every web site that accepts comments in any shape or form. That includes sites similar to our own, Discourse, Mastodon, every gaming forum, etc. Websites that discuss recipes, knitting, making cosplay costumes, hardware project sites, in fact any hobby site: you name it, they can say it applies. Of course, they also have the "falls under other conditions as set out in legislation" clause, which we know nothing about.
If we get fined $50M you will all chip in won't you?
I am not interested in knowing who people are or where they live. My interest starts and stops at our servers.
(Score: 4, Funny) by RS3 on Friday November 29, @03:26PM
Is that Australian or Canadian dollars? Asking for a friend.
(Score: 5, Interesting) by bzipitidoo on Friday November 29, @03:43PM
You're joking, of course. $50M is too big an ask, as we all know well. It would be cheaper to hire lawyers to fight it, and hiring lawyers isn't cheap.
Just wait; I suspect it'll be struck down before it goes into effect. If this new law does go into effect, wouldn't it be simplest to just block Australia? Don't even bother asking for age verification, just block every Australian, children and adults alike. Savvy users can turn to proxies or virtual networking to get around the block.
(Score: 3, Insightful) by aafcac on Friday November 29, @03:48PM (1 child)
That was my first thought. It's one thing to ban minors from sites like FB, X, TikTok, YouTube and the like, where there's an algorithm that is set up to try to keep people on the site as long as possible, regardless of what that does to the user, and quite another to ban them from sites like this, where you can come and go as you like and there isn't really that much going on at any given time besides new comments.
(Score: 3, Informative) by Reziac on Saturday November 30, @04:16AM
Well then, maybe here's a metric:
If it uses infinite scrolling, the purpose can be assumed to be "to keep people on the site as long as possible."
And there is no Alkibiades to come back and save us from ourselves.
(Score: 4, Interesting) by zocalo on Friday November 29, @04:06PM (8 children)
Also, as we've seen in previous attempts, e.g. for porn sites, on-line age verification is very complicated, a privacy nightmare, and quite often simply does not work - kids old enough to use social media generally know how to lie and/or enter borrowed details from a parent or older sibling to get what they want. (That works the other way too; I have a number of accounts where I've declared myself a minor, mostly because it means fewer ads and other fluff I don't want. The list of sites with any of my actual details is somewhat shorter than the list of those without, and mostly limited to those where I may need to use a credit card or something.)
While I don't agree with the tactic, I also suspect at least a few companies (and am almost 100% certain on X being one of those, because Elmo) will decide to call Australia's bluff and do a blanket ban on the grounds that it's impractical to implement reliably, then wait and see if there will be a climbdown, or at least some form of compromise. Google and, IIRC, Facebook have also already used that tactic successfully elsewhere, so may be prepared to do so in Australia as well.
UNIX? They're not even circumcised! Savages!
(Score: 2) by bzipitidoo on Friday November 29, @04:17PM (6 children)
Yeah, putting the onus on every site maintainer is demanding way too much. People who want such censorship should see to the censoring themselves. Hire a net nanny service, if they don't want to do the hard work of blocking content themselves.
These people who ask that others self-censor, who ask for that favor, that handout, are the same people who get all upset that someone else might have received a handout.
(Score: 3, Insightful) by zocalo on Friday November 29, @04:52PM (5 children)
An ISP-provided net nanny service wouldn't be flawless, but would definitely be a better approach than this, and they could even charge a little extra for it to soften the blow of making providing it a legal requirement for ISPs over a given threshold (number of users or revenue, maybe). Give parents a simple dashboard that controls what kinds of content/specific sites are allowed and when, perhaps aided by some optional site-provided metadata akin to game or movie ratings to define the types of content being hosted, and you're nearly there. Alternatively, maybe require parents (or legal guardians) to actually take responsibility for being a parent, including by making them at least partly legally responsible for their offspring's misdeeds while they are still a minor (subject to due process, naturally).
The fact is that kids are well aware of Real World Shit at an earlier age than many people, and especially parents, like to think, and that includes a LOT of topics that might not be a parent's first choice to talk about. Talking about them openly is still a much better option than the kids finding out the "facts" from the playground rumour mill or some random Internet site. The responsible thing for a parent to do is to be open about this kind of stuff, talk to their kids about it, make sure they know that while it's OK to be curious there is some nasty stuff out there and it's probably for the best if they don't go too far down the rabbit hole, and - most importantly - that they can talk frankly about it if they want to.
That so many parents feel they can abdicate the implicit responsibility to provide parental guidance is perhaps the most compelling argument for actually having something of a Nanny State, and why governments feel they need to act. That's supposed to be *your* job, not that of your brat's peers, school teachers, neighbours, or the police, and certainly not your government's, but that's the slope they're sliding down. And, no, ignorance of the Internet, of the fact that content filtering exists, and so on isn't really an excuse for parents any more. Most parents with tweens & teens today are going to be in their 30s & 40s, meaning they were kids themselves during the .COM boom, so it's incredibly unlikely that, if they are in a position to provide their kids with Internet access, they have not had exposure to it themselves on either a personal or professional level.
UNIX? They're not even circumcised! Savages!
(Score: 3, Insightful) by pTamok on Friday November 29, @05:15PM (4 children)
How would you implement an ISP provided Net-nanny service?
Assuming you don't install an ISP-provided certificate and exclusively use an ISP-proxy, TLS-protected traffic is opaque to casual inspection for filtering. Installing the ISP certificate gives the ISP access to all your traffic, including that to banks...
If you want to refuse to do DNS lookups for certain domains, then DNS-over-HTTPS or DNS-over-TLS stops that.
So you could play Whac-A-Mole™ with filtering particular IP addresses, which would need to be extended to all known VPN gateways...
Perhaps there is a way that I have missed. I don't claim to be an expert here, and am willing to be educated.
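To make the DNS option concrete, here is a minimal sketch (an illustration only, with made-up blocklist entries) of the resolver-side filtering an ISP could do; it only has any effect while clients actually use the ISP's resolver, which is exactly what DNS-over-HTTPS or DNS-over-TLS routes around.

```python
# Minimal sketch of resolver-side domain blocking, assuming the ISP controls
# the DNS resolver its customers use. Blocklist entries are hypothetical.
BLOCKLIST = {"example-social.test", "shorts.example.test"}

def is_blocked(qname: str) -> bool:
    """True if the queried name or any parent domain is on the blocklist."""
    labels = qname.rstrip(".").lower().split(".")
    return any(".".join(labels[i:]) in BLOCKLIST for i in range(len(labels)))

def resolve(qname: str) -> str:
    if is_blocked(qname):
        return "NXDOMAIN"          # or answer with a block-page IP instead
    return "forward to upstream"   # normal resolution path

# A client using DNS-over-HTTPS to a third-party resolver never reaches this
# code at all, which is the point made above.
print(resolve("video.shorts.example.test"))  # -> NXDOMAIN
print(resolve("soylentnews.org"))            # -> forward to upstream
```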
(Score: 2) by zocalo on Friday November 29, @06:48PM (3 children)
Frankly, I don't think there are any totally effective technical or legislative solutions to this that will work in all situations, but there are potential solutions in both camps that may at least solve part of the overall problem and work for some subset of parents and kids - as long as the parents know the options exist and how to apply them if they feel it's needed. The only solution that will pretty much work in every circumstance is effective parenting, by the actual parents/guardians, built on a mutually understood level of trust.
UNIX? They're not even circumcised! Savages!
(Score: 2) by gnuman on Friday November 29, @09:57PM (2 children)
Re: proxies, these are almost unused these days. TLS has killed the proxy. The best that can be done is at the "App" level. This is already happening with things like YouTube Kids, where parents control what kids can see. The kids will work around that via HTTPS access. I mean, a 6-year-old you can control. 12-year-olds??
This law is mostly a compliance checkbox. "Are you under 16? Yes - google, No - welcome!" and then there should be some way of detecting who is breaking the rule via content inspection.
The thing is, this is not meant to have 100% compliance. Since most people follow the rules, it should make a dent in the psychological damage done to young kids by weird echo chambers. Especially if they start teaching kids about echo chambers in school before they are exposed to them.
PS. SoylentNews is not affected by this. We are not much of a "social media" site. We are not even an echo chamber like 8chan or Reddit. But we could probably add to the Terms of Service and the signup page that this site is not meant for minors, with a checkbox to confirm.
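As a rough sketch of that sign-up checkbox idea (purely hypothetical; the route and field names are invented, and this is not anything the site actually runs), a self-declaration gate could look something like this:

```python
# Hypothetical self-declaration age gate at signup; not SoylentNews code.
from flask import Flask, request

app = Flask(__name__)

@app.route("/signup", methods=["POST"])
def signup():
    # A checked HTML checkbox submits the value "on" by default.
    # This is a compliance checkbox, not verification: it relies entirely
    # on self-declaration, as discussed above.
    if request.form.get("confirm_16_or_over") != "on":
        return "Sorry, this site is not intended for users under 16.", 403
    # ... proceed with normal account creation here ...
    return "Account created.", 201

if __name__ == "__main__":
    app.run()
```

It enforces nothing beyond what the user claims, which is all the "compliance checkbox" reading above amounts to.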
(Score: 2) by janrinok on Friday November 29, @11:12PM (1 child)
I'm not sure that I agree that the site is not for minors. We never ask for a person's age.
If a teen joins in a technical conversation, perhaps searching for help setting up a server or something, I don't see why we have to immediately ban him. I know 12-14 year olds who have set up their own simple web pages and I do not see any harm in that. I also know that if their parents have told them not to do something then they probably wouldn't, but, you know, kids will be kids. Much older as I am now, I can still remember building HF radios at that age. I got help from adults then too, it just wasn't on a computer.
We did have a mid-teen probably 6 or 7 years ago as a member - I think he was 15 at the time but it is not a clear recollection. I will have to scroll the usernames to see if I can remember him.
I am not interested in knowing who people are or where they live. My interest starts and stops at our servers.
(Score: 2) by gnuman on Saturday November 30, @12:28AM
I agree with your disagreement. But if you want to be in compliance with Australian law, then this can be added. Or maybe conditionally, on detecting connections from Australian IPs. Of course, then you need to have a GDPR notice, because in the EU they ruled that an IP address is "identifying, private information", even hashed.
https://law.stackexchange.com/questions/61076/storing-ips-and-gdpr-compliance [stackexchange.com]
¯\_(ツ)_/¯
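To illustrate why a hashed IP address can still count as identifying (my own example, not from the ruling or the linked thread): IPv4 has only 2^32 possible addresses, so an unsalted hash can be reversed by simply hashing every candidate until one matches.

```python
# Illustrative only: hashing an IPv4 address does not anonymise it, because
# the whole IPv4 space (2^32 addresses) can be enumerated and re-hashed.
import hashlib
import ipaddress

def hash_ip(ip: str) -> str:
    return hashlib.sha256(ip.encode()).hexdigest()

stored = hash_ip("203.0.113.42")  # what a "hashed IP" log entry might hold

# An attacker who can guess the rough network range (here a documentation /24)
# just hashes every candidate address until one matches.
for candidate in ipaddress.ip_network("203.0.113.0/24"):
    if hash_ip(str(candidate)) == stored:
        print("Recovered:", candidate)  # -> Recovered: 203.0.113.42
        break
```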
The US had this policy about gays in the military -- don't ask, don't tell. This is probably a good policy for "underage Australians" on public forums!
I agree with your sentiment though. I've had some interesting discussions on FidoNet and NNTP servers back in the day -- math and astronomy groups were interesting, though they had some strange people there too. I think I was in that pre-16 age group at the time.
(Score: 2) by VLM on Friday November 29, @10:25PM
Also don't forget shopping reviews.
(Score: 3, Insightful) by corey on Saturday November 30, @10:56PM
I've read through all the comments on this story and the sentiment is overwhelmingly against the proposed laws. I can see why: it looks like the scope might be too broad and may include sites they didn't intend to include. Like this one.
But with 2 kids under 10 who have no idea what Facebook or Instagram is yet, I can't wait for these laws. I get that we need to iron out the details and probably constantly look at it and tune it, but don't we remember how much social media websites are screwing up the kids? When I look at YouTube, the shallow shit videos that come up in the default feed are horrendous. Mostly people upload that crap to get clicks for pay. Then there's a barrage of videos amongst it all that have women with huge breasts or skimpily clad as the thumbnail, again to get clicks. These websites are a net negative for growing brains and I'm glad someone is doing something. Yeah, parents should pull their finger out, but that ain't happening.
Maybe the definition should include where there is an algorithm providing the content or something.
(Score: 2) by acid andy on Friday November 29, @05:16PM
Aren't these shorts likely to dumb kids down and shorten their attention spans? Not that that is reason enough to ban a medium.
Welcome to Edgeways. Words should apply in advance as spaces are highly limite—
(Score: 5, Touché) by Anonymous Coward on Friday November 29, @06:35PM
They believe all the bullshit they see and hear, and then they vote!
Nobody over 16 should be using social media.
(Score: 2) by DrkShadow on Friday November 29, @08:33PM (3 children)
All of the Australian-based social media companies will probably have a hard time complying with this law.
Can someone tell us what a few of the Australia-based social media companies are, again?
It's probably similar to the list of social-media companies that have found it viable to operate without US Section 230 protections. Europe will have some of those.
...... could someone tell us a few of the social-media co's that are based in a place without section 230 protections?...
(Score: 4, Insightful) by gnuman on Friday November 29, @10:18PM (2 children)
That is a US law. US laws apply in the US, and in the US only.
In Germany, it's illegal to post pro-Nazi content, Holocaust denial and the like. If companies don't deal with such posts quickly, they are fined.
In Russia, you can't talk about the war in Ukraine, but since those companies don't operate in Russia, they can ignore Russia.
It basically depends where you operate. If you don't want to comply with local laws, then you hope that you are not the CEO who gets arrested in that country when you visit. See Telegram and France, for example (the CEO is also a French national, which makes the arrest less dramatic). Or Brazil and Twitter: they came to an agreement and are back in compliance.
So, it's not about Australia-based media companies, but simply about whether you operate in Australia or have any large-ish exposure to Australia. If less than 0.1% of the population uses your service, well, then you may as well not exist (unless you are doing something criminal, of course). If you have more, then you should be compliant with the laws there. Add a checkbox during sign-up, done.
Let me put it this way. GDPR in the EU was supposed to be this scary law with big teeth for anyone abusing personal data processing. Yet there are not many big fines under GDPR. Most websites and companies want to be compliant, but I can tell you, most companies are NOT compliant to the letter of the law. Yet regulators are not there to kill companies -- they are just there to prevent major abuses from happening.
https://en.wikipedia.org/wiki/GDPR_fines_and_notices [wikipedia.org]
(Score: 3, Insightful) by DrkShadow on Saturday November 30, @06:11AM (1 child)
Let's put it this way.
Can you give any example of a non-american company with a large social-media aspect?
Can you show any forum with more than a hundred-thousand users that is not owned and hosted by a company subject to section 230 protections?
(Score: 2) by Lester on Saturday November 30, @02:35PM
What he is saying is that in spite of being American, a company must abide by the laws of the countries in which it operates, otherwise it will face fines. And most of them decide to abide by the local laws. For instance, Google in China. Business is business.
Yes, I can mention several companies. For example, China's WeChat surpasses Facebook and WhatsApp, and there's TikTok. Outside the Western bubble, there are many things that we usually don't hear about. Russia, China, Japan, Korea and India have their own social networks, and Facebook, Twitter etc. are marginal there.
Another reason why you don't hear of them is that the USA sees those foreign social networks as a menace. The USA thinks foreign governments could use those networks as the USA does: as a tool to gather information on citizens around the world (see TikTok's judicial journey).
(Score: 2) by jb on Saturday November 30, @03:53AM
Wouldn't that rule out all of the specific social media sites listed in TFA? Since in every case their only real purpose is to exfiltrate user data and exploit those data for profit.
(Score: 0) by Anonymous Coward on Sunday December 01, @05:58AM
The Australian government is becoming increasingly authoritarian.
You risk jail time if you run a forum on which someone posts something the government doesn't like; ISPs must log user activity for supply to government agencies on request, and they must block content on government order (i.e. the Great Firewall of Australia).
https://en.wikipedia.org/wiki/Internet_censorship_in_Australia [wikipedia.org]
Doesn't help that people keep going to Singapore on holiday and come back thinking the place is wonderful and how things should be :/ the Australian media also looooove fearmongering as well as promoting any authoritarian brain-fart some activist has.
And to all those people that think that the social media companies will just block Australia - they tried that when laws came into effect that required them to compensate old media for articles that appear on their sites. Didn't work, the Australian government didn't back down and they eventually came back with their tails between their legs.
As an aside, any advice on a good country to emigrate to that isn't freezing cold, insane or Spain (oh my misbegotten youth) would be greatly appreciated.