posted by chromas on Thursday February 10 2022, @10:24AM   Printer-friendly
from the accountabilibuddyable dept.

A New Proposed Law Could Actually Hold Big Tech Accountable for Its Algorithms:

We’ve seen again and again the harmful, unintended consequences of irresponsibly deployed algorithms: risk assessment tools in the criminal justice system amplifying racial discrimination, false arrests powered by facial recognition, massive environmental costs of server farms, unacknowledged psychological harm from social media interactions, and new, sometimes-insurmountable hurdles in accessing public services. These actual harms are egregious, but what makes the current regime hopeless is that companies are incentivized to remain ignorant (or at least claim to be) about the harms they expose us to, lest they be found liable.

Many of the current ideas for regulating large tech companies won’t address this ignorance or the harms it causes. While proposed antitrust laws would reckon with harms emerging from diminished competition in the digital markets, relatively small companies can also have disturbing, far-reaching power to affect our lives. Even if these proposed regulatory tools were to push tech companies away from some harmful practices, researchers, advocates and—critically —communities affected by these practices would still not have sufficient say in all the ways these companies’ algorithms shape our lives. This is especially troubling given how little information and influence we have over algorithms that control critical parts of our lives. The newly updated Algorithmic Accountability Act from Sen. Ron Wyden, Sen. Cory Booker, and Rep. Yvette Clarke could change this dynamic—and give us all an opportunity to reclaim some power over the algorithms that control critical parts of our lives.

Now, in a significant step forward, lawmakers are increasingly building impact assessments into draft legislation. The updated Algorithmic Accountability Act of 2022, which we learned about in a briefing from Wyden’s office in mid-January, would require impact assessments when companies are using automated systems to make critical decisions, providing consumers and regulators with much needed clarity and structure around when and how these kinds of systems are being used.

[...] Given the challenging legislative landscape in Congress, it is hard to say how far this bill will proceed. However, it has more co-sponsors than the previous version, and it lands at a time when many members of Congress are more eager to discuss significant changes to Big Tech’s largely unchecked power to determine how algorithmic systems shape important features of our lives.

[...] Companies consistently shirked their responsibility to the public interest, but landmark regulations brought accountability for these harmful impacts. Today, we should exercise the same right to insist tech companies uphold democratic values in the algorithms they build and send out into the world to make decisions about our lives. The Algorithmic Accountability Act could bring us even closer to holding these companies accountable.

What are the chances of such an act becoming the law?


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 5, Interesting) by driverless on Thursday February 10 2022, @10:56AM (7 children)

    by driverless (4770) on Thursday February 10 2022, @10:56AM (#1220160)

    What are the chances of such an act becoming the law?

    In the US, zero, it's just feel-good posturing. In Europe, where governments will actually stand up to social engineering companies, a lot better than zero, although it always takes a few years.

    • (Score: 4, Funny) by DannyB on Thursday February 10 2022, @03:26PM (1 child)

      by DannyB (5839) Subscriber Badge on Thursday February 10 2022, @03:26PM (#1220209) Journal

      Even though this may never become law, the fact it is being considered could still have an effect.

      Some executives who may once have been human beings at some point in the past, may consider the ill effects of what they are doing to humanity and attempt to act like a human being, if they can remember how. Some deep ancient dormant memory of being a human being may awaken slightly.

      --
      Young people won't believe you if you say you're older than Google. (born before 1998-09-03)
      • (Score: 1) by khallow on Friday February 11 2022, @06:26PM

        by khallow (3766) Subscriber Badge on Friday February 11 2022, @06:26PM (#1220597) Journal
        And some pigs who might have been sparrows in a past lifetime might start to fly.
    • (Score: 2) by mcgrew on Thursday February 10 2022, @03:47PM (4 children)

      by mcgrew (701) <publish@mcgrewbooks.com> on Thursday February 10 2022, @03:47PM (#1220217) Homepage Journal

      Is it even needed? In the US, you can sue anybody for anything. If you can prove you've been harmed, you can collect.

      I don't get it, I must need more coffee.

      --
      Carbon, The only element in the known universe to ever gain sentience
      • (Score: 1, Insightful) by Anonymous Coward on Thursday February 10 2022, @05:19PM (1 child)

        by Anonymous Coward on Thursday February 10 2022, @05:19PM (#1220248)

        "If someone rapes you, you can always ask them not to and sue them for it. If you can prove they raped you, you can collect, why would we even need the law involved in this"

        Thank you for playing...
        If you don't see the power imbalance between the harmed party and the harmer, then you have nothing valuable to contribute.

        • (Score: 1) by khallow on Friday February 11 2022, @06:29PM

          by khallow (3766) Subscriber Badge on Friday February 11 2022, @06:29PM (#1220598) Journal
          I'm pretty sure that if my algorithm rapes someone there will be a lot of hell to pay. I doubt we'll need poorly defined, shitty law to protect people.
      • (Score: 2) by driverless on Friday February 11 2022, @01:51AM (1 child)

        by driverless (4770) on Friday February 11 2022, @01:51AM (#1220400)

        And how's that working out for you? How much did you get from your settlement with Facebook and Google?

        • (Score: 2) by mcgrew on Saturday February 12 2022, @10:17PM

          by mcgrew (701) <publish@mcgrewbooks.com> on Saturday February 12 2022, @10:17PM (#1220870) Homepage Journal

          Neither Facebook nor Google has harmed me. Why would I sue? If they've harmed you, YOU sue.

          --
          Carbon, The only element in the known universe to ever gain sentience
  • (Score: 5, Interesting) by bradley13 on Thursday February 10 2022, @10:57AM (3 children)

    by bradley13 (3053) Subscriber Badge on Thursday February 10 2022, @10:57AM (#1220161) Homepage Journal

    Before the politicians pass laws holding other people responsible for unintended consequences, can we first have a trial run? Can we hold politicians responsible for unintended consequences of legislation? Things like the "war on drugs", or just about any law that was pushed through because "think of the children"?

    false arrests powered by facial recognition

    I expect there are more false arrests powered by the war on drugs: drug dogs trained to trigger when their handler wants them to, planted evidence, etc.

    massive environmental costs of server farms

    Server farms are part and parcel of the modern economy. They do less environmental harm than government activities like, let's pick an obvious one, maintaining the world's largest, most expensive military.

    unacknowledged psychological harm from social media interactions

    This is quickly getting into WTF territory.

    --
    Everyone is somebody else's weirdo.
    • (Score: 2, Insightful) by Spamalope on Thursday February 10 2022, @11:05AM (1 child)

      by Spamalope (5233) on Thursday February 10 2022, @11:05AM (#1220163) Homepage

      What an amazingly warm and fuzzy fantasy land. How do I subscribe to your newsletter? :D

      Drug dogs: Alert whenever there is cash or property to seize!
      Note there is never a double blind test to check for false positives by an adversarial 3rd party. When it's important to be correct, accuracy is checked! (as with bomb dogs, for example) But... drug dogs detect profit...

      • (Score: 2) by DannyB on Thursday February 10 2022, @03:28PM

        by DannyB (5839) Subscriber Badge on Thursday February 10 2022, @03:28PM (#1220210) Journal

        The dogs need to be trained to recognize only cash that was illegally obtained and distinguish it from cash that was legally obtained.

        --
        Young people won't believe you if you say you're older than Google. (born before 1998-09-03)
    • (Score: 4, Insightful) by khallow on Thursday February 10 2022, @01:38PM

      by khallow (3766) Subscriber Badge on Thursday February 10 2022, @01:38PM (#1220181) Journal
      Some of this stuff is covered by qualified immunity too. Not much point to passing laws on law enforcement theft of property, if they're immune to prosecution for the crimes.

      And the far reaching, vague, and unpredictable nature of this law should be a warning sign. If it's going to trigger for intangibles like social media interactions, then it's a disaster waiting to happen perhaps in conjunction with future bad law. Imagine for example, that aristarchus and I both sue SoylentNews for the psychological harm that we mutually caused to each other by saying mean things.
  • (Score: 2) by Mojibake Tengu on Thursday February 10 2022, @10:58AM (10 children)

    by Mojibake Tengu (8598) on Thursday February 10 2022, @10:58AM (#1220162) Journal

    GLWT.

    Seriously, if the lawmaking process is not Turing-complete, it's no match for algorithms since it cannot compete with them over regulation domain.
    Insufficient definitions in the Law will become those first to fail. Enforceability will fail next.

    --
    The edge of 太玄 cannot be defined, for it is beyond every aspect of design
    • (Score: 2, Interesting) by khallow on Thursday February 10 2022, @01:13PM (9 children)

      by khallow (3766) Subscriber Badge on Thursday February 10 2022, @01:13PM (#1220174) Journal

      Seriously, if the lawmaking process is not Turing-complete, it's no match for algorithms since it cannot compete with them over regulation domain. Insufficient definitions in the Law will become those first to fail. Enforceability will fail next.

      Nonsense. Algorithms don't have magic cooties that let them ignore laws. Good law doesn't require complex algorithms in order for one to act lawfully or create huge profits in edge cases (the places that said algorithms would be looking for).

      Insufficient definitions in the Law will become those first to fail.
      Enforceability will fail next.

      Because judges will just take the opinion of algorithms on faith? And enforcers will just do what the computer tells them to do? What's supposed to be good again about a Turing complete legal process that no human can follow?

      If you're pursuing Turing complete legal processes, you're doing it wrong [xkcd.com].

      • (Score: 2) by mcgrew on Thursday February 10 2022, @03:53PM (8 children)

        by mcgrew (701) <publish@mcgrewbooks.com> on Thursday February 10 2022, @03:53PM (#1220219) Homepage Journal

        Seriously, if the lawmaking process is not Turing-complete

        Do you really think that what Turing came up with is the very best that can be? Yes, it's the only computing platform there is today, but 200 years from now it will look like Babbage's computer. Something completely different will come around that makes our best look comically primitive.

        --
        Carbon, The only element in the known universe to ever gain sentience
        • (Score: 2, Interesting) by khallow on Thursday February 10 2022, @05:33PM (4 children)

          by khallow (3766) Subscriber Badge on Thursday February 10 2022, @05:33PM (#1220250) Journal
          Actually, odds are good that we'll never have a true full Turing computer. You need infinite memory and computation time which likely is impossible in our universe.
          • (Score: 2) by maxwell demon on Friday February 11 2022, @10:52AM (3 children)

            by maxwell demon (1608) Subscriber Badge on Friday February 11 2022, @10:52AM (#1220467) Journal

            A Turing machine does not have infinite time (with infinite time, you could solve the halting problem: Just wait infinitely long, and then look whether the algorithm halted).

            --
            The Tao of math: The numbers you can count are not the real numbers.
            • (Score: 1) by khallow on Friday February 11 2022, @01:00PM (2 children)

              by khallow (3766) Subscriber Badge on Friday February 11 2022, @01:00PM (#1220477) Journal
              Infinite time != wait infinitely long.
              • (Score: 2) by maxwell demon on Friday February 11 2022, @01:14PM (1 child)

                by maxwell demon (1608) Subscriber Badge on Friday February 11 2022, @01:14PM (#1220482) Journal

                No, that's exactly what infinite time means. Maybe you meant unlimited time?

                --
                The Tao of math: The numbers you can count are not the real numbers.
                • (Score: 1) by khallow on Friday February 11 2022, @06:04PM

                  by khallow (3766) Subscriber Badge on Friday February 11 2022, @06:04PM (#1220589) Journal
                  Obviously, I don't share that opinion, but let's go with unlimited time then.
        • (Score: 2) by maxwell demon on Friday February 11 2022, @10:50AM (2 children)

          by maxwell demon (1608) Subscriber Badge on Friday February 11 2022, @10:50AM (#1220466) Journal

          Assuming you mean the Analytical Engine, that also was Turing complete (up to limitations of memory, but that's true for any machine that fits into the observable universe). Yes, our machines are far more powerful than the AE would have been if it had been actually built. But the range of problems they can solve is still the same.

          Note that the same also holds true for quantum computers. Quantum computers can solve certain problems in massively less time than classical computers, but in the end, the set of problems they can solve are the same that a Turing machine can solve.

          With the known laws of physics, we will never build a machine more powerful than a Turing machine (note that a Turing machine, being a theoretical construct, can run as fast as you want, and by definition has an infinite amount of memory).

          --
          The Tao of math: The numbers you can count are not the real numbers.
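The post above leans on the idea that a Turing machine is an idealized construct: unbounded tape, no speed limit, yet no machine we can build exceeds it. For readers who haven't seen one, here is a minimal illustrative simulator; the `run_tm` helper and the binary-increment machine are invented for this example and are not from the thread:

```python
# A minimal Turing machine simulator. The tape is a dict, so it grows
# without bound in either direction (the "infinite memory" idealization);
# the step cap stands in for the unbounded running time of the real model.
def run_tm(transitions, tape_input, start, accept, max_steps=10_000):
    """transitions: (state, symbol) -> (new_state, write_symbol, move)
    move is -1 (left), +1 (right), or 0 (stay). Returns the tape contents."""
    tape = {i: s for i, s in enumerate(tape_input)}
    state, head = start, 0
    for _ in range(max_steps):
        if state == accept:
            break
        symbol = tape.get(head, "_")  # "_" is the blank symbol
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += move
    cells = range(min(tape), max(tape) + 1)
    return "".join(tape.get(i, "_") for i in cells).strip("_")

# Example machine: increment a binary number (head starts at leftmost bit).
# q0 scans right to the end of the input; q1 adds one, carrying leftward.
inc = {
    ("q0", "0"): ("q0", "0", +1),
    ("q0", "1"): ("q0", "1", +1),
    ("q0", "_"): ("q1", "_", -1),
    ("q1", "1"): ("q1", "0", -1),   # carry propagates left over 1s
    ("q1", "0"): ("done", "1", 0),  # absorb the carry
    ("q1", "_"): ("done", "1", 0),  # carry past the leftmost bit
}
print(run_tm(inc, "1011", "q0", "done"))  # 1011 + 1 = 1100
```

The table-plus-tape form is the whole model; everything a quantum computer or the Analytical Engine can compute reduces to some such table, which is the equivalence the comment is describing.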
          • (Score: 2) by mcgrew on Saturday February 12 2022, @10:14PM (1 child)

            by mcgrew (701) <publish@mcgrewbooks.com> on Saturday February 12 2022, @10:14PM (#1220869) Homepage Journal

            You sound like someone in 1930 discussing sound recording with someone who digitally records in 2030. Is a human brain Turing complete? A housefly?

            --
            Carbon, The only element in the known universe to ever gain sentience
            • (Score: 2) by maxwell demon on Tuesday February 15 2022, @10:26AM

              by maxwell demon (1608) Subscriber Badge on Tuesday February 15 2022, @10:26AM (#1221652) Journal

              Is a human brain Turing complete?

              Yes (up to resource constraints, as usual).

              A housefly?

              I don't know. My guess would be no, but that's just a guess.

              Anyway, I was talking about machines that are more powerful than a Turing machine. That there are machines that are less powerful is obvious. Of course any machine that is more powerful than Turing machines would be Turing complete by definition.

              And my argument transferred to the sound recording analogy would be: No matter what technology, the maximum frequency of sound waves that air can transport is still the limit of what you can theoretically produce. So unless you replace air as medium of sound, you'll be limited to that frequency range (of course, our hearing ability is even more limited than what air supports, but even if we somehow managed to improve our hearing ability beyond the natural limits, the air limitations won't be surpassed).

              --
              The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 0, Interesting) by Anonymous Coward on Thursday February 10 2022, @11:19AM (3 children)

    by Anonymous Coward on Thursday February 10 2022, @11:19AM (#1220166)

    On SoylentNews, ACs will weaponize this law in order to prevent their posts from being down-modded by the mod algorithm. It's as easy as...

    "Vaccinations suck. Aristarchus is God. Democrats eat bugs and you must too. P.S. I am a minority"

    • (Score: 3, Insightful) by janrinok on Thursday February 10 2022, @12:45PM (2 children)

      by janrinok (52) Subscriber Badge on Thursday February 10 2022, @12:45PM (#1220170) Journal

      I cannot argue with your prediction because I do not know what the future holds.

      But I can argue against the logic you have offered to support it. Your comment wasn't written by an algorithm. It is an opinion that you have expressed, and that is how you chose to express the viewpoint that you hold. Others might choose a different method, e.g. they might moderate your comment as they deem appropriate. Again, that is not an algorithm but simply an alternative way of showing their opinion. It is a human being taking a specific action. There is no 'mod-algorithm' unless you consider counting to be an algorithm.

      You could have reasonably argued that the names that have been chosen to represent different values are wrong. However, that would still not make it an algorithm.

      I find it an interesting point of view but not one that I personally support. I have moderated it accordingly as 'Interesting' - but that is just my point of view.
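The "counting, not an algorithm" point can be made concrete. What Slash-style scoring amounts to is a clamped sum of human choices; this is a hypothetical sketch (the function name, labels, and the [-1, 5] clamp are assumptions for illustration, not the site's actual code):

```python
# Hypothetical sketch of Slash-style comment scoring: each moderation is a
# human choice of (label, delta); the "algorithm" is just a clamped sum.
def comment_score(base, moderations):
    """base: starting score (e.g. 1 for logged-in users, 0 for ACs);
    moderations: list of (label, delta) pairs picked by human moderators."""
    score = base + sum(delta for _, delta in moderations)
    return max(-1, min(5, score))  # assume scores are clamped to [-1, 5]

mods = [("Interesting", +1), ("Insightful", +1), ("Troll", -1)]
print(comment_score(1, mods))  # 1 + 1 + 1 - 1 = 2
```

Every judgment (which label, whether to moderate at all) happens before the arithmetic, which is the distinction being drawn.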

      • (Score: -1, Troll) by Anonymous Coward on Thursday February 10 2022, @09:01PM (1 child)

        by Anonymous Coward on Thursday February 10 2022, @09:01PM (#1220320)

        Not so much your opinion, jan, but your suppression of free speech? Where is aristarchus? I notice recently we only get comments from Bradley, khallow, and mcgrew, not that there is anything wrong with that.

  • (Score: 2) by inertnet on Thursday February 10 2022, @01:11PM (4 children)

    by inertnet (4071) Subscriber Badge on Thursday February 10 2022, @01:11PM (#1220173) Journal

    I can see how this might be a good idea for big tech. But it could be costly if this law would also be applied to smaller companies. Many would have to rethink their software. It might become another wave, like the Y2K scare, the "accept my cookies" craze, or the redesign for privacy that we've had in Europe.

    • (Score: 2) by mcgrew on Thursday February 10 2022, @03:59PM (3 children)

      by mcgrew (701) <publish@mcgrewbooks.com> on Thursday February 10 2022, @03:59PM (#1220225) Homepage Journal

      But it could be costly if this law would also be applied to smaller companies.

      Test your code like you test airplanes, and if you can't, you're in the wrong business. If a poorly engineered airplane crashes and kills my daughter, it's going to cost them or their insurance company big time. If your poorly written software for an MRI machine kills her, it will be more expensive.

      What small coding house will be writing anything that can cause real unintended harm? Give me an example.

      --
      Carbon, The only element in the known universe to ever gain sentience
      • (Score: 1) by khallow on Thursday February 10 2022, @05:46PM

        by khallow (3766) Subscriber Badge on Thursday February 10 2022, @05:46PM (#1220256) Journal
        You just gave two examples: airplanes and MRI machines. Nothing magical about those problems that requires a large team.
      • (Score: 1) by fustakrakich on Thursday February 10 2022, @08:13PM

        by fustakrakich (6150) on Thursday February 10 2022, @08:13PM (#1220312) Journal

        Test your code like you test airplanes

        Make it just strong enough to keep the rain out?

        If we want to properly regulate business, we have to dangle their corporate charter in front of them and remind them what a shame it would be if something were to happen to it, but the real desire is to regulate the users' input into mass media while keeping the Wall Street bubble fully inflated.

        These algorithms are just being fucked up by bots

        --
        La politica e i criminali sono la stessa cosa..
      • (Score: 0) by Anonymous Coward on Friday February 11 2022, @12:30AM

        by Anonymous Coward on Friday February 11 2022, @12:30AM (#1220382)

        I knew one guy who turned a database of drug interactions into a reference tool.

        One guy.

        If you don't see how a DB interpretation error could have catastrophic consequences for someone, you lack imagination.

        OK, now tell him: "Fuck you asshole, the moment someone has so much as an unscheduled bowel movement, you're on the hook for it!" and see how fast he takes up some other hobby, such as golf, or fancy woodwork, or damn near anything that has nothing to do with algorithms because you've just turned algorithms involving anything that anyone might ever care about into a third rail that only companies with colossal QA teams and vast budgets could ever even contemplate risking.

        There you are, there's your actual real-world example. (He wrote it in python, as I recall, a couple of decades ago.)

  • (Score: 0) by Anonymous Coward on Thursday February 10 2022, @02:40PM

    by Anonymous Coward on Thursday February 10 2022, @02:40PM (#1220195)

    No thanks.

  • (Score: 0) by Anonymous Coward on Thursday February 10 2022, @03:12PM (2 children)

    by Anonymous Coward on Thursday February 10 2022, @03:12PM (#1220202)

    If this bill passes, BigTech will just have to pay the politicians even more than now to stay on their good side. I see this as the politicians simply upping their price.

    • (Score: 2) by Fnord666 on Thursday February 10 2022, @03:30PM (1 child)

      by Fnord666 (652) on Thursday February 10 2022, @03:30PM (#1220212) Homepage

      If this bill passes, BigTech will just have to pay the politicians even more than now to stay on their good side. I see this as the politicians simply upping their price.

      This is how members of Congress get their bonuses. Propose legislation that will impact BigTech™, then wait for the "campaign contributions" to roll in. Lather, rinse and repeat. Whose turn is it to propose that legislation this time?

      • (Score: 3, Interesting) by mcgrew on Thursday February 10 2022, @04:10PM

        by mcgrew (701) <publish@mcgrewbooks.com> on Thursday February 10 2022, @04:10PM (#1220231) Homepage Journal

        You dumbasses act like campaign contributions belong to the politicians. They don't; bribery is a felony. I don't expect to see that level of ignorance here! Crooked politicians know they can't take bribes to get rich (as if $140k a year isn't rich; I make less than a tenth of that), so they get rich from insider trading.

        They're looking to stop that. Anyone who votes against that bill should be voted out of office.

        --
        Carbon, The only element in the known universe to ever gain sentience
  • (Score: 3, Interesting) by crafoo on Thursday February 10 2022, @04:17PM

    by crafoo (6639) on Thursday February 10 2022, @04:17PM (#1220233)

    Dictating from the top using laws and threats of violence by the state really should be a last resort if you want your society to last.

    Instead, balance the incentives for the outcomes you would like to have more of. If you reward bad behavior you will get more of it. These are the steps needed to build a cooperative culture. If everyone benefits from "good" behavior, you will get more of it.

    The problem is us. We get trash programs from trash companies because the average person believes the marketing and doesn't care that much about buggy, frustrating products as long as it's CHEAP and EASY. The average person will happily sacrifice their future for temporary but immediate comfort.

    If you want better you will have to PAY well above what the average person is willing to pay for a product. Hire people to deal with government agencies for you. Pay top dollar for the best tools. It's all available, it just costs money.

  • (Score: 0) by Anonymous Coward on Thursday February 10 2022, @10:13PM

    by Anonymous Coward on Thursday February 10 2022, @10:13PM (#1220347)

    one algorithm to rule them all.

  • (Score: 1) by khallow on Saturday February 12 2022, @08:14PM

    by khallow (3766) Subscriber Badge on Saturday February 12 2022, @08:14PM (#1220828) Journal
    This reminds me of old science fiction where one tweaks this one thing - like say get rid of all men (or all women), make lying a capital crime, kill everyone over the age of 30, mind control everyone for the greater good, drug everyone up, and so on. The rest of the story then revolves around the horrible consequences.

    My take is that it'll work just as well in the real world as it does in fiction. There is no factor, be it bad algorithms, recreational drugs, or chlorine [wordpress.com], that you can just take away and magically make everything better.

    A better solution would be to pass small laws that target actual problems not covered by existing law.