
SoylentNews is people

posted by janrinok on Monday July 21 2014, @06:02PM   Printer-friendly
from the future-to-avoid? dept.

Tech pioneers in the US are advocating a new data-based approach to governance - 'algorithmic regulation'. But if technology provides the answers to society's problems, what happens to governments?

What is Algorithmic Regulation? Well, here and here are two attempts to explain it. For example: the "smartification" of everyday life follows a familiar pattern: there's primary data - a list of what's in your smart fridge and your bin - and metadata - a log of how often you open either of these things or when they communicate with one another. Both produce interesting insights: cue smart mattresses - one recent model promises to track respiration and heart rates and how much you move during the night - and smart utensils that provide nutritional advice.

In addition to making our lives more efficient, this smart world also presents us with an exciting political choice. If so much of our everyday behaviour is already captured, analysed and nudged, why stick with unempirical approaches to regulation? Why rely on laws when one has sensors and feedback mechanisms? If policy interventions are to be - to use the buzzwords of the day - "evidence-based" and "results-oriented," technology is here to help.

This new type of governance has a name: algorithmic regulation. In as much as Silicon Valley has a political programme, this is it. Tim O'Reilly, an influential technology publisher, venture capitalist and ideas man (he is to blame for popularising the term "web 2.0") has been its most enthusiastic promoter. In a recent essay that lays out his reasoning, O'Reilly makes an intriguing case for the virtues of algorithmic regulation - a case that deserves close scrutiny both for what it promises policy-makers and the simplistic assumptions it makes about politics, democracy and power.

To see algorithmic regulation at work, look no further than the spam filter in your email. Instead of confining itself to a narrow definition of spam, the email filter has its users teach it. Even Google can't write rules to cover all the ingenious innovations of professional spammers. What it can do, though, is teach the system what makes a good rule and spot when it's time to find another rule for finding a good rule - and so on. An algorithm can do this, but it's the constant real-time feedback from its users that allows the system to counter threats never envisioned by its designers. And it's not just spam: your bank uses similar methods to spot credit-card fraud.
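The feedback loop described above can be sketched as a toy word-frequency filter. This is an illustrative assumption of how such systems work in general, not Google's actual implementation; the class and numbers are invented for the example:

```python
from collections import defaultdict

class FeedbackSpamFilter:
    """Toy filter that improves purely from user feedback, with no hand-written rules."""

    def __init__(self):
        self.spam_words = defaultdict(int)  # word counts seen in flagged spam
        self.ham_words = defaultdict(int)   # word counts seen in legitimate mail
        self.spam_msgs = 0
        self.ham_msgs = 0

    def report(self, words, is_spam):
        """Each 'mark as spam' / 'not spam' click is a training signal."""
        counts = self.spam_words if is_spam else self.ham_words
        for w in words:
            counts[w] += 1
        if is_spam:
            self.spam_msgs += 1
        else:
            self.ham_msgs += 1

    def spam_score(self, words):
        """Laplace-smoothed spam-vs-ham evidence ratio (>1 leans spammy)."""
        score = 1.0
        for w in words:
            p_spam = (self.spam_words[w] + 1) / (self.spam_msgs + 2)
            p_ham = (self.ham_words[w] + 1) / (self.ham_msgs + 2)
            score *= p_spam / p_ham
        return score
```

No rule for any particular word is ever written by a designer; a word becomes incriminating only because users keep flagging messages that contain it. That constant retraining is exactly the real-time feedback the article credits with countering threats the designers never envisioned.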

Algorithmic regulation, whatever its immediate benefits, will give us a political regime where technology corporations and government bureaucrats call all the shots. The Polish science fiction writer Stanislaw Lem, in a pointed critique of cybernetics published, as it happens, roughly at the same time as The Automated State, put it best: "Society cannot give up the burden of having to decide about its own fate by sacrificing this freedom for the sake of the cybernetic regulator."

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by BasilBrush on Monday July 21 2014, @06:16PM

    by BasilBrush (3994) on Monday July 21 2014, @06:16PM (#71915)

    They already had this idea a century ago. Technocracy. Looks like the technology to implement it properly is here now, or coming soon.

    http://en.wikipedia.org/wiki/Technocracy [wikipedia.org]

    --
    Hurrah! Quoting works now!
    • (Score: 1) by Murdoc on Tuesday July 22 2014, @12:12AM

      by Murdoc (2518) on Tuesday July 22 2014, @12:12AM (#72082) Homepage
      That's not really a good article to explain it. It makes the mistake of conflating political and economic technocracy, which are two very different things, and leads to disparaging comments like some of those below. Political technocracy is just putting scientists or engineers in power; whether they actually use science at all in governing is not really a requirement (and it's debatable whether you can). This is what people tend to talk about today in a negative manner. Economic technocracy, what they came up with in the 1920s [technocracy.ca], is an entirely different animal, something most people don't know anything about, and the confusion between the two terms makes it even harder to teach them about it. It also leads to articles like this, where people look for high-tech solutions to government problems that, even if they work, wouldn't do nearly as much good as economic technocracy would, which would in effect largely make government unnecessary. Basically, we don't need data feeds from your fridge to make things better. We can do far more with far less. Heck, we could have had this working in the 1930s, although it definitely would work better now.
  • (Score: 2, Insightful) by Anonymous Coward on Monday July 21 2014, @06:20PM

    by Anonymous Coward on Monday July 21 2014, @06:20PM (#71919)

    The last thing we need is a technocrat-dominated government. Technology-oriented people as a group are terrible at understanding the human condition. The one and only way to get good governance is for people to have a voice in the process. No matter how "stupid" or undeserving we think some people are, deciding that 'we' know better is at best a recipe for abuse through neglect and more often outright oppression. Just look at how the NSA is operating today: they "know better" than the rest of us, so they've built a system for turnkey tyranny as a result. Or consider how the creation of laws that prevent felons from voting has coincided with an 8x increase in the prison population over the last 40 years.

    • (Score: 3, Funny) by buswolley on Monday July 21 2014, @08:21PM

      by buswolley (848) on Monday July 21 2014, @08:21PM (#71980)

      Pretty soon it will be, "algorithmic voting". Let your algorithm vote for you to maximize your benefits. And then we will just be fat slugs with a Netflix remote that cues the algorithm that chooses the program that will most likely entertain us.

      --
      subicular junctures
    • (Score: 2, Insightful) by Anonymous Coward on Monday July 21 2014, @08:52PM

      by Anonymous Coward on Monday July 21 2014, @08:52PM (#72003)

      I'll go ahead and take the bait.

      The thing we MOST need is a technocrat-dominated government. Because once you trend above a few million people, the answer to "What is best for society?" should come down to hard numbers, public health, and epidemiology - NOT knee-jerk reactions to tearful anecdotes. I assume this is what you meant when you made the unsubstantiated claim about "not understanding the human condition."

      Put differently, if the data says saving 0.00003% of the populace would be possible given 10% of the entire society's GDP, that shit is a non-starter. Sorry. We have to progress as a species. The one and only way to get a good government is to get informed people to vote. That is precluded today thanks to media control and the objectively, verifiably proven, broken two party system.

      Governing from data means ensuring that the incentives are done right, and everyone on welfare brings home additional net money for every penny they earn. Instead, thanks to mathematically inept laws piling on each other, single unemployed mothers in the USA have a net gain for each additional child they have out of wedlock and receive LESS net income if they start working at any salary under $70k. That is only one example. Another is having break points in the tax code; it should be a graduated curve.
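The welfare-cliff complaint above can be made concrete with a toy calculation. All figures and function names below are invented for illustration; they do not reflect any actual US benefit program or tax schedule:

```python
def net_income_cliff(earned, benefit=12000.0, cutoff=20000.0):
    """Benefit disappears entirely the moment earnings cross the cutoff."""
    return earned + (benefit if earned < cutoff else 0.0)

def net_income_tapered(earned, benefit=12000.0, taper=0.5):
    """Benefit phases out gradually: lose 50 cents per extra dollar earned."""
    return earned + max(benefit - taper * earned, 0.0)

# Under the cliff, earning two dollars more costs thousands in net income:
# net_income_cliff(19999.0) -> 31999.0, but net_income_cliff(20001.0) -> 20001.0.
# Under the taper, every extra dollar earned always raises net income.
```

The "graduated curve" the poster asks for is the second shape: the benefit shrinks smoothly as earnings rise, so the incentive to work never inverts.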

      If you are against objectively good government as I just described, then yes I'm going to call you stupid or at least uninformed. Note that I am solidly, wholly against the NSA and invasion of privacy. This entire mini-rant is basically to get you to disassociate governing using objective principles and data from the creeping tyranny of the surveillance state. They are not (necessarily) the same. We desperately need the former, and desperately need to destroy the latter. Don't toss the baby out with the bath water.

      • (Score: 0) by Anonymous Coward on Monday July 21 2014, @11:17PM

        by Anonymous Coward on Monday July 21 2014, @11:17PM (#72057)

        Yes, you took the bait hook, line, and sinker.

        "What is best for society?" should come down to hard numbers, public health, and epidemiology - NOT knee jerk reactions to tearful anecdotes.

        You sound like the kind of person who believes that "the numbers don't lie."

        It has nothing to do with "tearful anecdotes." What it does have to do with is who decides what numbers matter, and how to measure them. And that is something that geeks are universally terrible at. We see it with "teaching to the test" and even institutional cheating [newyorker.com] in schools; we see police regularly downgrading or outright ignoring crimes [chicagomag.com]; we even see benchmark "optimization" on computers. [anandtech.com]

        As long as people are involved in the process, the numbers will lie because what to measure and how to measure it have always been and will always be the fundamental problem.

        • (Score: 1, Interesting) by Anonymous Coward on Tuesday July 22 2014, @09:14AM

          by Anonymous Coward on Tuesday July 22 2014, @09:14AM (#72211)

          Even if the numbers are perfectly accurate and relevant, it may turn out that you optimize for the wrong values. That already happens in human-driven politics (like confusing "good for the economy" with "good for big business"; not to mention that "good for the economy" is not always the same as "good for the people" either: slavery was certainly good for the economy of the time).

          And even if the numbers are absolutely accurate and relevant, and the goals are all good, the optimization algorithm may still reach unacceptable results because of some rule that was forgotten. As an extreme example, the algorithm may figure out that it is best for public health if all ill people, no matter how harmless their illness is, are killed and their bodies immediately burned. After all, if you kill ill people, you reduce the fraction of ill people, and by burning them immediately, you'll prevent their diseases from spreading further.
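The forgotten-rule failure mode is just an optimizer missing a constraint, which a few lines can show. The policies, populations, and constraint here are all invented for the example:

```python
def ill_fraction(pop):
    """The objective the regulator is told to minimize."""
    return sum(1 for p in pop if p == "ill") / len(pop)

def treat(pop):
    """Sane policy: cure half of the ill."""
    to_cure = sum(1 for p in pop if p == "ill") // 2
    out = []
    for p in pop:
        if p == "ill" and to_cure > 0:
            out.append("healthy")
            to_cure -= 1
        else:
            out.append(p)
    return out

def eliminate(pop):
    """Degenerate policy that the objective alone cannot rule out."""
    return [p for p in pop if p != "ill"]

def best_policy(pop, policies, constraints=()):
    """Pick the feasible policy with the lowest resulting ill fraction."""
    feasible = [f for f in policies
                if all(c(pop, f(pop)) for c in constraints)]
    return min(feasible, key=lambda f: ill_fraction(f(pop)))

# The rule that was "forgotten": nobody is allowed to simply vanish.
def no_one_vanishes(before, after):
    return len(after) == len(before)
```

With no constraints, best_policy happily picks eliminate, since removing the ill drives the ill fraction to zero; add no_one_vanishes and it picks treat instead. The hard part, as the comment notes, is remembering to write down every such rule in advance.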

          That's why we need humans in the loop. Not because humans are inherently better at optimizing for a goal (they probably aren't), but because if something goes horribly wrong, humans will be able to recognize it and change their rules, while the computer will determine that, according to the algorithm it was given, everything is going well.

    • (Score: 1) by Delwin on Monday July 21 2014, @09:16PM

      by Delwin (4554) on Monday July 21 2014, @09:16PM (#72021)

      You have one point dead on - a technocracy is by definition terrible at understanding the human condition. What you missed is that where we need it most is when we are not dealing with humans.

      For-profit corporations do not have a human condition. They have a very simple set of driving goals (maximize return to shareholders). Thus they can, and do, react predictably to various changes in the regulatory environment.

      For situations where the regulations are dealing primarily, or entirely, with corporations I argue that a technocracy is not only advisable but is in fact the optimal solution.

  • (Score: 2) by Sir Garlon on Monday July 21 2014, @06:20PM

    by Sir Garlon (1264) on Monday July 21 2014, @06:20PM (#71920)

    "Society cannot give up the burden of having to decide about its own fate by sacrificing this freedom for the sake of the cybernetic regulator."

    I am sympathetic to that view, but for the sake of argument: we already have a system for protecting the rights of the individual. The judicial system. If the algorithm says you're not paying your taxes properly or whatever, you can explain to a judge that the algorithm's output is unreasonable or that there are extenuating circumstances that made you unable to comply.

    On the other hand, it is a lot harder for a judge to determine whether the algorithm's output is consistent with regulatory authority and other law if the algorithm itself is completely opaque. So while I don't see a theoretical difference between a continually-tuned algorithm and the arcane technobabble of current tax law (for example), I do see a practical difference.

    --
    [Sir Garlon] is the marvellest knight that is now living, for he destroyeth many good knights, for he goeth invisible.
    • (Score: 2) by redneckmother on Monday July 21 2014, @06:40PM

      by redneckmother (3597) on Monday July 21 2014, @06:40PM (#71938)

      Excellent points.

      My personal arguments against a technocracy stem from an old AT&T slogan: "The system is the solution".

      An employee of AT&T told me, "The system is NOT the solution, it's the f*cking PROBLEM!"

      I think that employee's assessment is true for both technology AND governments (at least, in present forms).

      I am thankful that I read Charles Perrow's Normal Accidents: Living with High-Risk Technologies.

      --
      Mas cerveza por favor.
  • (Score: 4, Insightful) by eapache on Monday July 21 2014, @06:25PM

    by eapache (3822) on Monday July 21 2014, @06:25PM (#71925)

    This ignores the fact that the underlying disagreements in politics are all philosophical, not practical. Nobody is against picking the best means; the problem is in choosing which ends to aim for (and which means are not acceptable because of other philosophical reasons).

    • (Score: 3, Insightful) by Rune of Doom on Monday July 21 2014, @06:57PM

      by Rune of Doom (1392) on Monday July 21 2014, @06:57PM (#71943)

      I agree that "algorithmic regulation" isn't actually a solution, but it's not only philosophical disagreements that we currently have problems with. It's underlying issues like wealth inequality, corruption, and total government capture by the 'too rich for anyone's good' class of companies and individuals. My gut reaction to "algorithmic regulation" is that, if implemented, it will be government of the too-rich, by the rich, and for the rich - but much less problematical for them than this messy pseudo-democracy we have now.

      • (Score: 1) by Delwin on Monday July 21 2014, @09:35PM

        by Delwin (4554) on Monday July 21 2014, @09:35PM (#72029)

        Those in power will fight algorithmic regulation tooth and nail because it removes the power that they currently have.

        • (Score: 2) by dry on Tuesday July 22 2014, @03:07AM

          by dry (223) on Tuesday July 22 2014, @03:07AM (#72119) Journal

          Or perhaps subvert it. "See, the algorithm says that trickle-down economics is the best system to enrich society and that only the lower classes should be taxed." Which is the problem: who decides which algorithm to use?

        • (Score: 2) by etherscythe on Tuesday July 22 2014, @11:11PM

          by etherscythe (937) on Tuesday July 22 2014, @11:11PM (#72533) Journal

          Look at the stock market and tell me that again. The rich OWN the algorithms of the system. The only change that would occur is that they would, somehow, become even more powerful, for probably less-obvious and less-articulable reasons.

          --
          "Fake News: anything reported outside of my own personally chosen echo chamber"
    • (Score: 1) by MBasial on Tuesday July 22 2014, @12:19AM

      by MBasial (1910) on Tuesday July 22 2014, @12:19AM (#72084)

      My experience is that folks want the same ends -- can you find anyone that says "more homelessness is better", "the real problem with this country is that there's not enough teen pregnancy", "things were better when crime rates were higher"? The disagreement is how to get there. Do we make being poor very uncomfortable, so people will go get jobs? Or do we provide a safety net so that people can look for their next paying gig without worrying that they'll be choosing between food and shelter at the end of the month? Sex ed or abstinence-only? Punish prostitutes or johns? Arrest dealers, arrest users, disrupt foreign manufacturing, or legalize?

      That said, I'm sure there are folks today that say they want one thing ("less teen pregnancy") but actually mean another ("more obedience to my god"). They'll have to ask for those currently-unstated desires to be included in the algorithm, if they want them considered by the machine. I bet we can all manage to agree on and achieve "less teen pregnancy" long before any particular religion/political party gets its recruiting needs incorporated into the algorithm.

      • (Score: 0) by Anonymous Coward on Tuesday July 22 2014, @12:49AM

        by Anonymous Coward on Tuesday July 22 2014, @12:49AM (#72092)

        > can you find anyone that says "more homelessness is better", "the real problem with this country is that
        > there's not enough teen pregnancy", "things were better when crime rates were higher"?

        Not exactly, but close. There are lots of people who believe that homeless people want to be homeless, that teen pregnancy is a just punishment for loose morals, and that people who live in high-crime neighborhoods are of poor character themselves and so deserve to be there.

        • (Score: 1) by MBasial on Tuesday July 22 2014, @07:17PM

          by MBasial (1910) on Tuesday July 22 2014, @07:17PM (#72418)

          Agreed, but I think the "not exactly" is really important. I think (hope?) that the strength of algorithmic government will be that those "just punishment" folks will be forced to make those desires explicit, or not have them considered. I can see almost anyone standing up and saying "I support policies that lead to less teen pregnancy", but it takes a lot more to stand up and say "I support policies that result in more teen pregnancy, because those policies support my goal of punishing sexual transgression."

          At the moment, they have the cover of abusing statistics when choosing policies (e.g. abstinence is 100% effective if you don't count the failures to abstain; well, the Pill also gains several percentage points of effectiveness if we assume humans don't do human things). I am assuming that the machine will have instructions to choose/create policies that are effective. If abstinence-only education leads to fewer teen pregnancies, then that's what the machine will choose. I feel like I've seen data that says the machine will be choosing another approach.

          I'd like to think that in the US, separation of church and state rules will exclude a lot of the "just punishment" issues from the algorithm. Some folks will try, I'm sure, but it's one thing to have agreement on a policy with a side-effect of punishing teen moms, and another to have agreement on a policy of explicit punishment. Without the explicit punishment goals, the algorithm is going to try for a comfortable life for everyone, including teen moms. In my imagining of the situation, the machine is allowed to notice whether income support for single moms or desperate poverty results in better outcomes for kids. I don't think there are a lot of people willing to ask the machine to include "The single mom's sinfulness requires that she be punished so harshly that the child has a 10% greater chance of being poor at age 30 (or whatever outcome pattern the machine sees)."

          In other words, I expect algorithmic government to get a lot of foolishness out of our rule-making process. We currently establish a lot of rules based on how we think the world should work (e.g. poverty creates incentive to work). I expect the machine will have a better handle on how the world works (e.g. poverty is more often stressful, demoralizing, and demotivating), and that it will be gloriously ignorant about how it ought to work.

  • (Score: 2) by WillAdams on Monday July 21 2014, @06:28PM

    by WillAdams (1424) on Monday July 21 2014, @06:28PM (#71927)

    I'd like to see a return to the tax structure we had under Eisenhower, including the 90% tax bracket at the highest level.

    • (Score: 2) by redneckmother on Monday July 21 2014, @06:43PM

      by redneckmother (3597) on Monday July 21 2014, @06:43PM (#71939)

      Don't worry, there'd be plenty of exemptions and loopholes.

      --
      Mas cerveza por favor.
    • (Score: 3, Interesting) by tathra on Monday July 21 2014, @06:57PM

      by tathra (3367) on Monday July 21 2014, @06:57PM (#71942)

      90% may be a bit high, but i agree, the wealthiest people are damaging the economy by hoarding money. money has to circulate for the economy to function (yes, i know "money" isn't strictly necessary, but it works as a good proxy for capital or labor; don't get caught up on semantics), and by keeping a significant chunk out of circulation they're hindering the economy. i know that banks lend out money based on how much they have in deposits (yes, i know it doesn't work that way anymore), but going between 2 points - the lender to the lendee and back to the lender - is not circulation. this would be especially critical if we were to ever go back to the gold standard like some conservatives, at least, have been wanting for a while; being on the gold standard would mean there is only a fixed amount of monetary wealth available, making it even more critical that money not be hoarded.

      i'm thinking an algorithm would naturally raise taxes on the rich because of this, to promote economic growth. the only other way to put more money into circulation, aside from warfare, would be deflation, if i understand correctly, but deflation promotes hoarding rather than spending, so it's not a very good option.

      the most important thing is to get money out of politics, at least the ability to bribe via campaign donations, lobbying, and however else it's accomplished. removing humans from governance would accomplish this, but that's a bit too drastic a step to take at this moment, and humans should never be completely removed from governance because it's important to have a human perspective when dealing with humans.

      • (Score: 0) by Anonymous Coward on Tuesday July 22 2014, @02:07PM

        by Anonymous Coward on Tuesday July 22 2014, @02:07PM (#72283)

        90% will never work.

        However, let's say for a second you managed to get that 90% tax bracket. Let's say you were in this 90% bracket. What would you do with your money? Would you hand it over to the gov? Or would you find ways to shield your profits, doing things like hiding it offshore in tax havens?

        the most important thing is to get money out of politics
        That is why mostly. People are using our gov to lock people out. Just having cash does not do that. Influence does.

        Here is the real deal: "we are created equal," however we will never be treated equal.

    • (Score: 0) by Anonymous Coward on Wednesday July 23 2014, @12:57AM

      by Anonymous Coward on Wednesday July 23 2014, @12:57AM (#72561)

      Better also bring back Bretton Woods, or you're gonna miss the '50s and land in the '70s instead.
      Sadly I think that genie is well and truly out of the bottle at this point.

  • (Score: 2) by emg on Monday July 21 2014, @07:06PM

    by emg (3464) on Monday July 21 2014, @07:06PM (#71948)

    We used to call this idea of enforcing regulations through embedded technology 'Blue Goo'. I shouldn't have to explain why it's likely to be only slightly less devastating than 'Grey Goo'.

    • (Score: 0) by Anonymous Coward on Monday July 21 2014, @08:47PM

      by Anonymous Coward on Monday July 21 2014, @08:47PM (#71996)

      > 'Blue Goo'. I shouldn't have to explain why it's likely to be only slightly less devastating than 'Grey Goo'.

      But you do have to explain the "blue" part. Is that a dated reference to IBM? Or the blue screen of death?

  • (Score: 0, Troll) by Anonymous Coward on Monday July 21 2014, @07:08PM

    by Anonymous Coward on Monday July 21 2014, @07:08PM (#71949)

    Data-driven regulations are impossible because political coalitions, particularly Black and Hispanic voters, make them impossible.

    For example, the NYPD reported that about 98% of murders and 95% of shootings were from Black and Hispanic suspects. Mayor Bloomberg cited this stat, which has been replicated at every major city, as justification for stop and frisk EVEN THOUGH it will disproportionately affect Black and Latino men.

    Because Black and Latino Men commit the overwhelming majority of violent crimes. Not just in NYC, globally.

    Asians for example in the NYPD crime stats don't even appear above 1% in shootings and murders and assaults. Whites are not that much higher.

    An empirical, data driven approach to regulation would have stop and frisk to remove deadly weapons, guns and knives primarily, from Black and Latino street thugs, as most street thugs are Black and Latino. This will be an annoyance to law abiding Black and Latino men but will also greatly improve their safety; as they are also principal victims of Black and Latino thugs.

    However Black and Latino VOTERS will side with their criminal relatives: cousins, sons, nephews, etc. (the vast proportion of street thugs once again are men) and kill stop and frisk by voting for politicians like Bill de Blasio who promise to abolish it.

    Diversity and PC are a religion, and self-interest by White elites (Bill de Blasio being a good example) and Black/Hispanic voters in solidarity with criminal relatives that denies the reality in favor of feel-good or naked self interest nonsense. New York City was rescued from the constant, and seemingly inherent (absent DNA modification or Clockwork Orange behavior modification) criminality of Black and Hispanic lower class people. Which is the overwhelming majority of that racial/ethnic group. This is reality, it is not consistent with the universalist, idiot religion of PC and Diversity. But it is borne out again and again by data. Asians are the least criminal, then Whites, then Hispanics, with Blacks the most criminal. This is true of overall crime rates and violent crime rates.

    Recognizing that empirical reality is about as comfortable for those embracing a universalist, PC/Diversity religion as a sun-centric universe was for the Catholic Church in Galileo's time, which wasn't very comfortable. We have our own Inquisitions and PC purges just like Renaissance Italy. Religion is all, combined with naked self-interest by relatives of Black and Hispanic criminals ("Don't put my nephew in jail!")

    Various real estate data sites list: criminal activity, race, ethnicity, school ratings, etc. which are illegal under the Fair Housing Act for real estate agents to mention, and the DOJ under Eric Holder is considering prosecuting such sites as Zillow, etc.

    So no, we will NEVER use data to conceive of regulations because it runs afoul of PC/Diversity and interests of the Black and Hispanic underclass.

    • (Score: 3, Insightful) by BasilBrush on Monday July 21 2014, @08:09PM

      by BasilBrush (3994) on Monday July 21 2014, @08:09PM (#71976)

      For example, the NYPD reported that about 98% of murders and 95% of shootings were from Black and Hispanic suspects.

      Actually "Over 90%", not 98%.

      But there's a bigger indicator than that. 93% were male. So unless you are suggesting random searches of males due to gender profiling, you shouldn't be suggesting racial profiling either.

      Also, a true technocratic solution would see that the difference in crime stats for different races is a symptom, not a cause, the cause being income inequality. And it would use progressive taxation and other remedies to reduce that inequality.

      Indeed you know deep down that income inequality is the problem, because you used the word "underclass". Class structures being based on wealth and power.

      --
      Hurrah! Quoting works now!
      • (Score: 2) by buswolley on Monday July 21 2014, @08:39PM

        by buswolley (848) on Monday July 21 2014, @08:39PM (#71990)

        Thank you for setting the troll straight.

        --
        subicular junctures
      • (Score: 0) by Anonymous Coward on Monday July 21 2014, @10:19PM

        by Anonymous Coward on Monday July 21 2014, @10:19PM (#72040)

        "you know deep down that income inequality is the problem"
        Correlation != Causation

        Is "knowing deep down" a technocratic thing?

      • (Score: 0) by Anonymous Coward on Monday July 21 2014, @10:35PM

        by Anonymous Coward on Monday July 21 2014, @10:35PM (#72047)

        "income inequality is the problem" completely ignores the stats regarding Asians. Seems like cultural attitudes toward the value of education is a big factor in income inequality.

    • (Score: 3, Informative) by Sir Garlon on Monday July 21 2014, @09:05PM

      by Sir Garlon (1264) on Monday July 21 2014, @09:05PM (#72016)

      There was a rather upbeat story about data-driven policing in Chicago [csmonitor.com] just this morning. If you read it, you will note a profound lack of race warfare in Chicago's approach. I considered submitting that story to Soylent but didn't because I figured the only comments it would get would be from racists like parent. (I hope he doesn't get modded down too far because sometimes offensive points of view contribute to the discussion and this is one of those times. You can't talk about crime without confronting prejudice.)

      --
      [Sir Garlon] is the marvellest knight that is now living, for he destroyeth many good knights, for he goeth invisible.
      • (Score: 0) by Anonymous Coward on Monday July 21 2014, @11:30PM

        by Anonymous Coward on Monday July 21 2014, @11:30PM (#72063)

        And for a completely different perspective on the same program:

        Chicago PD's Big Data: using pseudoscience to justify racial profiling [boingboing.net]

        It's the myth that "algorithms" can't be racist when in fact the people who create a system imbue it with their own assumptions.

        • (Score: 2) by Sir Garlon on Tuesday July 22 2014, @12:06PM

          by Sir Garlon (1264) on Tuesday July 22 2014, @12:06PM (#72244)

          Thank you for posting that.

          --
          [Sir Garlon] is the marvellest knight that is now living, for he destroyeth many good knights, for he goeth invisible.
  • (Score: 3, Insightful) by VLM on Monday July 21 2014, @07:16PM

    by VLM (445) on Monday July 21 2014, @07:16PM (#71951)

    It's likely to be as successful and humane as management by metric.

    And we all know that is never corrupted to give a predetermined result (LOL)

    • (Score: 2) by Alfred on Monday July 21 2014, @08:59PM

      by Alfred (4006) on Monday July 21 2014, @08:59PM (#72011) Journal

      Well if not that, then it will be as efficient as any government-specified enterprise-caliber investment.

  • (Score: 0) by Anonymous Coward on Monday July 21 2014, @09:02PM

    by Anonymous Coward on Monday July 21 2014, @09:02PM (#72014)

    Or Morozov on O'Reilly:

    http://www.thebaffler.com/past/the_meme_hustler [thebaffler.com]

  • (Score: 2) by meisterister on Monday July 21 2014, @11:57PM

    by meisterister (949) on Monday July 21 2014, @11:57PM (#72075) Journal

    Alright, here are the obstacles and problems that I can immediately think of:

    1. Computers are fast. Although the amount of data involved would be vast, an unregulated computer could generate millions of regulations or laws every second. They would effectively be unenforceable unless the computer is limited to changing things at a certain rate.

    2. Who gets to write the AI/algorithm? This one is pretty obvious. You would need someone who is either insane or absurdly altruistic to write a government without including any sort of kickbacks for themselves.

    3. What to optimize for? There are some problems where you cannot optimize for all outcomes. There would have to be some mechanism by which the AI is set to optimize for certain outcomes (e.g., the AI could be directed to work on economic growth one year and social problems the next).

    4. There should be a way of opting out of government data collection. Some people don't want to vote now for various reasons, and having the government be directed by personal information is a bit much.

    --
    (May or may not have been) Posted from my K6-2, Athlon XP, or Pentium I/II/III.
  • (Score: 2) by Phoenix666 on Tuesday July 22 2014, @01:59PM

    by Phoenix666 (552) on Tuesday July 22 2014, @01:59PM (#72278) Journal

    The general concept of basing policy on data is sound. The points raised about the pitfalls of technocracy ring true. But there is one way to apply technology to governance that is an unalloyed good: replace the ridiculously Kafkaesque bureaucracy with algorithms. Why do there need to be armies of clerks at the DMV you have to wait in line for hours to see, only to, joy of joys, have to deal with their mini-Napoleon complexes when you get to the counter? Renew your license online; fire all those a-holes. All those giant buildings in Washington DC full of bureaucrats pushing paper around at each other? Write some software, fire all those people, and turn those facilities into something useful like water treatment plants.

    --
    Washington DC delenda est.