posted by janrinok on Friday November 29, @02:12PM   Printer-friendly
from the good-luck-with-that dept.

"This bill seeks to set a new normative value in society that accessing social media is not the defining feature of growing up in Australia. There is wide acknowledgement that something must be done in the immediate term to help prevent young teens and children from being exposed to streams of content unfiltered and infinite."

(Michelle Rowland, Minister for Communications, Australian Parliament, Nov 21)

Australia's House of Representatives has passed a bill that would ban access to social media platforms TikTok, Facebook, Snapchat, Reddit, X and Instagram for youngsters under 16. The bill passed 102 votes to 13.

Once the bill gets through the Senate -- expected this week -- the platforms would have a year to work out how to implement the age restriction, without using government-issued identity documents (passports, driving licences), and without digital identification through a government system.

The leaders of all eight Australian states and mainland territories have unanimously backed the plan, although Tasmania, the smallest state, would have preferred the threshold was set at 14.

There are some counter-noises though (no, not you, Elon). More than 140 academics signed an open letter to Prime Minister Anthony Albanese condemning the 16-year age limit as "too blunt an instrument to address risks effectively."

The writers of that open letter fear that the responsibility of giving access to social media will fall on the parents, and "not all parents will be able to manage the responsibility of protection in the digital world".

Further, "Some social media 'type' services appear too integral to childhood to be banned, for example short form video streamers. But these too have safety risks like risks of dangerous algorithms promoting risky content. A ban does not function to improve the products children will be allowed to use."

The open letter pleads instead for systemic regulation, which "has the capacity to drive up safety and privacy standards on platforms for all children and eschews the issues described above. Digital platforms are just like other products, and can have safety standards imposed."

Australia's ban on social media will be a world-first, with fines of up to 50 million Australian dollars for each failure to prevent youngsters from having a social media account.


From "ban children under the age of 16 from accessing social media" we also get the following:

Under the laws, which won't come into force for another 12 months, social media companies could be fined up to $50 million for failing to take "reasonable steps" to keep under 16s off their platforms. There are no penalties for young people or parents who flout the rules. Social media companies also won't be able to force users to provide government identification, including the Digital ID, to assess their age.

Social Media, or an "age-restricted social media platform" has been defined in the legislation as including services where:

  1. the "sole purpose, or a significant purpose" is to enable "online social interaction" between people
  2. people can "link to, or interact with" others on the service
  3. people can "post material", or
  4. it falls under other conditions as set out in the legislation.

Original Submission

 
This discussion was created by janrinok (52) for logged-in users only.
  • (Score: 3, Insightful) by pTamok on Friday November 29, @02:29PM (15 children)

    by pTamok (3042) on Friday November 29, @02:29PM (#1383756)

    Is SoylentNews 'social media', according to the definitions used by Australia?

  • (Score: 4, Informative) by janrinok on Friday November 29, @02:38PM (14 children)

    by janrinok (52) Subscriber Badge on Friday November 29, @02:38PM (#1383757) Journal

    Their definition covers almost every web site that accepts comments in any shape or form. That includes sites similar to our own, Discourse, Mastodon, every gaming forum, etc. Websites that discuss recipes, knitting, making cosplay costumes, hardware projects; in fact any hobby site, you name it, they can say it applies. Of course, they also have the "falls under other conditions as set out in legislation" clause, which we know nothing about.

    If we get fined $50M you will all chip in won't you?

    --
    I am not interested in knowing who people are or where they live. My interest starts and stops at our servers.
    • (Score: 4, Funny) by RS3 on Friday November 29, @03:26PM

      by RS3 (6367) on Friday November 29, @03:26PM (#1383762)

      If we get fined $50M you will all chip in won't you?

      Is that Australian or Canadian dollars? Asking for a friend.

    • (Score: 5, Interesting) by bzipitidoo on Friday November 29, @03:43PM

      by bzipitidoo (4388) on Friday November 29, @03:43PM (#1383767) Journal

      You're joking of course. $50m is too big an ask, as we all know well. Cheaper to hire lawyers to fight that, and hiring lawyers isn't cheap.

      Just wait, I suspect it'll be struck down before it goes into effect. If this new law does go into effect, the simplest option is to just block Australia. Don't even bother asking for age verification, just block every Australian, children and adults alike. Savvy users can turn to proxies or virtual networking to get around the block.

    • (Score: 3, Insightful) by aafcac on Friday November 29, @03:48PM (1 child)

      by aafcac (17646) on Friday November 29, @03:48PM (#1383769)

      That was my first thought. It's one thing to ban minors from sites like FB, X, TikTok, YouTube and the like, where there's an algorithm set up to try to keep people on the site as long as possible, regardless of what that does to the user, and quite another to ban them from sites like this, where you can come and go as you like and there isn't really that much going on at any given time besides new comments.

      • (Score: 3, Informative) by Reziac on Saturday November 30, @04:16AM

        by Reziac (2489) on Saturday November 30, @04:16AM (#1383821) Homepage

        Well then, maybe here's a metric:

        If it uses infinite scrolling, the purpose can be assumed as "to keep people on the site as long as possible."

        --
        And there is no Alkibiades to come back and save us from ourselves.
    • (Score: 4, Interesting) by zocalo on Friday November 29, @04:06PM (8 children)

      by zocalo (302) on Friday November 29, @04:06PM (#1383770)
      TFA lists six specific platforms (TikTok, Facebook, Snapchat, Reddit, X and Instagram), yet gives a generic definition that would, indeed, appear to cover every single site on the Internet that allows users to post content viewable by others, so which is it? I'd point out that the latter potentially includes many news and retail sites, as well as many sites specifically designed to provide educational content to school kids, so the collateral damage from the broadly worded definition here is huge - are Australian politicians *really* that dumb when it comes to writing legislation that they didn't think of that?

      Also, as we've seen in previous attempts, e.g. for porn sites, on-line age verification is very complicated, a privacy nightmare, and quite often simply does not work - kids old enough to use social media generally know how to lie and/or enter borrowed details from a parent or older sibling to get what they want. (That works the other way too; I have a number of accounts where I've declared myself a minor, mostly because it means fewer ads and other fluff I don't want. The list of sites with any of my actual details is somewhat shorter than the list of those without, and mostly limited to those where I may need to use a credit card or something).

      While I don't agree with the tactic, I also suspect at least a few companies (and am almost 100% certain on X being one of those, because Elmo) will decide to call Australia's bluff and do a blanket ban on the grounds that it's impractical to implement reliably, then wait and see if there will be a climbdown, or at least some form of compromise. Google and, IIRC, Facebook have also already used that tactic successfully elsewhere, so may be prepared to do so in Australia as well.
      --
      UNIX? They're not even circumcised! Savages!
      • (Score: 2) by bzipitidoo on Friday November 29, @04:17PM (6 children)

        by bzipitidoo (4388) on Friday November 29, @04:17PM (#1383775) Journal

        Yeah, putting the onus on every site maintainer is demanding way too much. People who want such censorship should see to the censoring themselves. Hire a net nanny service, if they don't want to do the hard work of blocking content themselves.

        These people who ask that others self censor, ask for that favor, that handout, are the same people who get all upset that someone else might have received a handout.

        • (Score: 3, Insightful) by zocalo on Friday November 29, @04:52PM (5 children)

          by zocalo (302) on Friday November 29, @04:52PM (#1383779)
          I do think there's a genuine problem here - social media was never great, but the toxicity has skyrocketed over the last few years - but I also think Australia is trying for the wrong legislative solution to the problem, and doing so with far too broad a sword at that. The only way I can see this approach going for Australia is "wrong".

          An ISP-provided net nanny service wouldn't be flawless, but would definitely be a better approach than this, and they could even charge a little extra for it to soften the blow of making providing it a legal requirement for ISPs over a given threshold (number of users or revenue, maybe). Give parents a simple dashboard that controls what kinds of content/specific sites are allowed and when, perhaps aided by some optional site-provided metadata akin to game or movie ratings to define the types of content being hosted, and you're nearly there. Alternatively, maybe require parents (or legal guardians) to actually take responsibility for being a parent, including by making them at least partly legally responsible for their offspring's misdeeds while they are still a minor (subject to due process, naturally).

          The fact is that kids are well aware of Real World Shit at an earlier age than many people, and especially parents, like to think, and that includes a LOT of topics that might not be a parent's first choice to talk about, but it's still a much better option than them finding out the "facts" from the playground rumour mill or some random Internet site. The responsible thing for a parent to do is to be open about this kind of stuff, talk to their kids about it, make sure they know that while it's OK to be curious there is some nasty stuff out there and it's probably for the best if they don't go too far down the rabbit hole, and - most importantly - that they can talk frankly about it if they want to.

          That so many parents feel they can abrogate the implicit responsibility to provide parental guidance is perhaps the most compelling argument for actually having something of a Nanny State, and why governments feel they need to act. That's supposed to be *your* job, not your brat's peers, school teachers, neighbours, the police, and certainly not your government's, but that's the slope they're sliding down. And, no, ignorance of the Internet, that content filtering exists, and so on isn't really an argument for parents any more. Most parents with tweens & teens today are going to be in their 30s & 40s, meaning they were kids themselves during the .COM boom, so it's incredibly unlikely that if they are in a position to provide their kids with Internet access they have not had exposure to it themselves on either a personal or professional level.
          --
          UNIX? They're not even circumcised! Savages!
          • (Score: 3, Insightful) by pTamok on Friday November 29, @05:15PM (4 children)

            by pTamok (3042) on Friday November 29, @05:15PM (#1383782)

            How would you implement an ISP provided Net-nanny service?

            Assuming you don't install an ISP-provided certificate and exclusively use an ISP-proxy, TLS-protected traffic is opaque to casual inspection for filtering. Installing the ISP certificate gives the ISP access to all your traffic, including that to banks...
            If you want to refuse to do DNS lookups for certain domains, then DNS-over-HTTPS or DNS-over-TLS stops that.
            So you could play Whac-A-Mole™ with filtering particular IP addresses, which would need to be extended to all known VPN gateways...

            Perhaps there is a way that I have missed. I don't claim to be an expert here, and am willing to be educated.
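            For illustration, the DNS-level filtering mentioned above amounts to the resolver checking each queried name (and its parent domains) against a blocklist before answering. This is a minimal hypothetical sketch in plain Python - the domain names are made up, and, as noted, DNS-over-HTTPS/TLS bypasses this entirely because the query never reaches the ISP resolver in inspectable form:

```python
# Hypothetical sketch of ISP-side DNS filtering: a parental blocklist
# is consulted before the resolver answers. Domains here are invented
# for illustration only.

BLOCKLIST = {"example-social.test", "example-video.test"}

def is_blocked(qname: str) -> bool:
    """True if the queried name, or any parent domain, is blocklisted,
    so subdomains inherit the block."""
    labels = qname.rstrip(".").lower().split(".")
    # Check "a.b.c", then "b.c", then "c".
    return any(".".join(labels[i:]) in BLOCKLIST for i in range(len(labels)))

def resolve(qname: str):
    if is_blocked(qname):
        return None  # answer NXDOMAIN instead of the real address
    return "...real lookup here..."

print(is_blocked("video.example-social.test"))  # True (subdomain of a blocked name)
print(is_blocked("news.example.test"))          # False
```

            The weakness is exactly the one raised above: the check only runs if the client actually sends its queries to this resolver, which DoH/DoT and VPNs avoid.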

            • (Score: 2) by zocalo on Friday November 29, @06:48PM (3 children)

              by zocalo (302) on Friday November 29, @06:48PM (#1383788)
              Some ISPs already do this as a value-add, which is why I suggested it - e.g. Plusnet SafeGuard and Sky's Parental Controls. One method is a combination of DNS filtering and a content-filtering web proxy, the latter either configured in the browser or (more securely) enforced by the ISP on a per-customer connection basis at the back-end. Depending on the system used, there are usually workarounds, including using VPNs and DoH/DoT, and yes, it does have some privacy implications. Another is to use some client-side software, usually secured by a parental password, that does the filtering locally and can be enabled on a per system/user basis, but again there are workarounds. Booting a Linux live distro and using that to access the web will get you round pretty much anything, of course.

              Frankly, I don't think there are any totally effective technical or legislative solutions to this that will work in all situations, but there are potential solutions in both camps that may at least solve part of the overall problem and work for some subset of parents and kids - as long as the parents know the options exist and how to apply them if they feel it's needed. The only solution that will pretty much work in every circumstance is effective parenting, by the actual parents/guardians, built on a mutually understood level of trust.
              --
              UNIX? They're not even circumcised! Savages!
              • (Score: 2) by gnuman on Friday November 29, @09:57PM (2 children)

                by gnuman (5013) on Friday November 29, @09:57PM (#1383795)

                Some ISPs already do this as a value-add, which is why I suggested it - e.g. Plusnet SafeGuard and Sky's Parental Controls. One method is done via a combination of DNS filtering and a content-filtering web proxy

                Re: proxies, these are almost unused these days. TLS has killed the proxy. The best that can be done is at the "App" level. This is already happening with things like YouTube Kids, where parents control what kids can see. The kids will work around that via HTTPS access. I mean, a 6 year old you can control. 12 year olds??

                This law is mostly a compliance checkbox. "Are you under 16? Yes - google, No - welcome!" and then there should be some way of detecting who is breaking the rule via content inspection.

                The thing is, this is not meant to have 100% compliance. Since most people follow the rules, it should make an impact in the psychological damage done to young kids by weird echo chambers. Especially if they start teaching kids about echo chambers in school before they are exposed to them.

                PS. SoylentNews is not affected by this. We are not much of a "social media" site. We are not even an echo chamber like 8chan or Reddit. But we could probably add to Terms of Service and signup page that this site is not meant for minors, and the checkbox is checked.

                • (Score: 2) by janrinok on Friday November 29, @11:12PM (1 child)

                  by janrinok (52) Subscriber Badge on Friday November 29, @11:12PM (#1383805) Journal

                  I'm not sure that I agree that the site is not for minors. We never ask for a person's age.

                  If a teen joins in a technical conversation, perhaps searching for help setting up a server or something, I don't see why we have to immediately ban him. I know 12-14 year olds who have set up their own simple web pages and I do not see any harm in that. I also know that if their parents have told them not to do something then they probably wouldn't but, you know, kids will be kids. Much older as I am now I can still remember building HF radios at the same age. I got help from adults then too, it just wasn't on a computer.

                  We did have a mid-teen probably 6 or 7 years ago as a member - I think he was 15 at the time but it is not a clear recollection. I will have to scroll the usernames to see if I can remember him.

                  --
                  I am not interested in knowing who people are or where they live. My interest starts and stops at our servers.
                  • (Score: 2) by gnuman on Saturday November 30, @12:28AM

                    by gnuman (5013) on Saturday November 30, @12:28AM (#1383813)

                    I'm not sure that I agree that the site is not for minors. We never ask for a person's age.

                    I agree with your disagreement. But if you want to be in compliance with Australian law, then this can be added. Or maybe conditionally, on detecting connections from Australian IPs. Of course, then you need to have a GDPR notice, because in the EU they ruled that an IP address is "identifying, private information", even hashed.

                    https://law.stackexchange.com/questions/61076/storing-ips-and-gdpr-compliance [stackexchange.com]

                    ¯\_(ツ)_/¯

                    If a teen joins in a technical conversation, perhaps searching for help setting up a server or something, I don't see why we have to immediately ban him. I know 12-14 year olds who have set up their own simple web pages and I do not see any harm in that.

                    The US had this policy about gays in the military -- don't ask, don't tell. This is probably a good policy for "underage Australians" on public forums!

                    I agree with your sentiment though. I've had some interesting discussions on FidoNet and NNTP servers back in the day -- math and astronomy groups were interesting, though they've had some strange people there too. I think I was in that pre-16 age group at the time.

      • (Score: 2) by VLM on Friday November 29, @10:25PM

        by VLM (445) on Friday November 29, @10:25PM (#1383797)

        Also don't forget shopping reviews.

    • (Score: 3, Insightful) by corey on Saturday November 30, @10:56PM

      by corey (2202) on Saturday November 30, @10:56PM (#1383852)

      I’ve read through all the comments to this story and it’s overwhelmingly against the proposed laws. I can see why: it looks like the scope might be too broad and may include sites that they didn’t intend to include. Like this one.

      But with 2 kids under 10, who have no idea what Facebook or Instagram is yet, I can’t wait for these laws. I get that we need to iron out the details and probably constantly look at it and tune it, but don’t we remember how much social media websites are screwing up the kids? When I look at YouTube, the shallow shit videos that come up in the default feed are horrendous. Mostly people upload that crap to get clicks for pay. Then there’s a barrage of videos amongst it all that have women with huge breasts or skimpily clad as the thumbnail, again to get clicks. These websites are a net negative for growing brains and I’m glad someone is doing something. Yeah, parents should pull their finger out, but that ain’t happening.

      Maybe the definition should include where there is an algorithm providing the content or something.