
SoylentNews is people

Community Reviews
posted by on Monday May 08 2017, @06:15AM   Printer-friendly
from the beep-beep-i-am-a-gadget dept.

I read a couple of good books recently, and wanted to share them and do some writing to collect my thoughts on a subject that is currently of news-worthy relevance and of particular interest to "Soylentils". Enjoy, and I look forward to the discussion!


Original Submission

 
  • (Score: 0) by Anonymous Coward on Monday May 08 2017, @06:28AM (1 child)

    by Anonymous Coward on Monday May 08 2017, @06:28AM (#506213)

    Technopoly: https://www.amazon.com/Technopoly-Surrender-Technology-Neil-Postman/dp/0679745408/ [amazon.com]
    You Are Not A Gadget: https://www.amazon.com/You-Are-Not-Gadget-Manifesto/dp/0307389979/ [amazon.com]
    Not the submitter, just thought it would be nice to have links. Mods feel free to add these to the summary and delete this post.

  • (Score: 0) by Anonymous Coward on Monday May 08 2017, @06:29AM (5 children)

    by Anonymous Coward on Monday May 08 2017, @06:29AM (#506214)

    But you are a machine, and when the capitalists are done with you, it's time to be decommissioned.

    • (Score: 0) by Anonymous Coward on Monday May 08 2017, @06:44AM (4 children)

      by Anonymous Coward on Monday May 08 2017, @06:44AM (#506219)

>Implying the other system is any better, where you wish you had a machine.

      • (Score: 0) by Anonymous Coward on Monday May 08 2017, @01:20PM (3 children)

        by Anonymous Coward on Monday May 08 2017, @01:20PM (#506331)

Implying there are only two systems and only one way to run either of them?

        The depth of your knowledge and imagination is truly disheartening.

        • (Score: 1) by khallow on Monday May 08 2017, @02:23PM (2 children)

          by khallow (3766) Subscriber Badge on Monday May 08 2017, @02:23PM (#506360) Journal

Implying there are only two systems and only one way to run either of them?

          Indeed. The real problem here is that capitalism really is the only game in town at the national level, if you want a prosperous nation. There's not much point to noting that capitalism isn't perfect, when you don't have a better idea in mind.

In particular, when one complains about the problem of people being treated as machines, it's worth noting that's no different from any other era in which people worked for a living, as opposed to the usual hunter/gatherer thing. Capitalists didn't create this outlook; it's been around since the dawn of agriculture. That indicates to me that when rival approaches are developed, they'll have similar problems with this outlook.

          This is just a routine conflict of interest that happens when someone else values a person's labor. Everyone will have that, unless the society somehow succeeds in making human labor valueless.

          Finally, what exactly is supposed to be so bad about private ownership and trading of capital, currency and banking, or the other features of modern capitalist societies? A lot of the complaints seem to want to do away with the best parts of capitalism rather than the worst.

          • (Score: 2) by pnkwarhall on Monday May 08 2017, @03:34PM (1 child)

            by pnkwarhall (4558) on Monday May 08 2017, @03:34PM (#506393)

            when one is complaining about the problem of people being treated as machines, it's worth noting that's no different than any other era where people worked for a living

            Capitalists didn't create this outlook. Instead, it's been around since the dawn of agriculture.

            One of Postman's main examples of technological impact on the viewing of workers as "just another machine" (vs as something special that couldn't be replaced by a machine) was the widespread adoption of Taylorism [wikipedia.org]. I would argue that the perspective of "workers as machines" was not feasible in many types of production (i.e. products made by craftsmen) prior to the industrial revolution--the individual level of skill and experience used to make a finished product are readily visible in many crafts. In fact, I want to go as far as saying "most types of production", because I imagine that, prior to the present era where the majority of product categories are commodities due to lack of major differences in quality, the skill of the craftsman made a huge difference in the sales value of an item.

            I would agree that the outlook has "been around since the dawn of agriculture"--after all, agriculture has many low-skill tasks that can be done by easily-replaceable workers. But Postman's argument is that this perspective has become pervasive and societally-dominant due to common methods of production and business organization, and negatively affected peoples' own view of themselves and their self-worth based on their contribution to society.

            I think your reply to AC's criticism of capitalism contains his whole idea: "unless society somehow succeeds in making human labor valueless", with the substitution of "Capitalists" for "society". That, more or less, is the implicit goal of many capitalist enterprises--labor is a cost, and should thus be minimized. But are the human elements of a productive system a cost or an investment?

            --
            Lift Yr Skinny Fists Like Antennas to Heaven
            • (Score: 1) by khallow on Monday May 08 2017, @11:34PM

              by khallow (3766) Subscriber Badge on Monday May 08 2017, @11:34PM (#506663) Journal

              I would argue that the perspective of "workers as machines" was not feasible in many types of production (i.e. products made by craftsmen) prior to the industrial revolution--the individual level of skill and experience used to make a finished product are readily visible in many crafts. In fact, I want to go as far as saying "most types of production", because I imagine that, prior to the present era where the majority of product categories are commodities due to lack of major differences in quality, the skill of the craftsman made a huge difference in the sales value of an item.

Agriculture was pretty widespread prior to the industrial revolution. I'd say that, by itself, that covers most people in all but the most highly urbanized societies. Second, there are several other sectors in the same situation: warfare and low-skilled labor. So no, I think the outlook was widely prevalent, contrary to your basic assumption.

              I would agree that the outlook has "been around since the dawn of agriculture"--after all, agriculture has many low-skill tasks that can be done by easily-replaceable workers. But Postman's argument is that this perspective has become pervasive and societally-dominant due to common methods of production and business organization, and negatively affected peoples' own view of themselves and their self-worth based on their contribution to society.

Compared to when? There were a lot of societies that had slaves or indentured servants/peasants. If anything, today is far more respectful of workers than the past was.

              I think your reply to AC's criticism of capitalism contains his whole idea: "unless society somehow succeeds in making human labor valueless", with the substitution of "Capitalists" for "society". That, more or less, is the implicit goal of many capitalist enterprises--labor is a cost, and should thus be minimized. But are the human elements of a productive system a cost or an investment?

              No, reducing cost is not the goal. Increasing profit would be a typical goal, but you can't get there by merely cutting costs. You have to have revenue generation as well.

  • (Score: 4, Interesting) by takyon on Monday May 08 2017, @07:19AM (2 children)

    by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Monday May 08 2017, @07:19AM (#506227) Journal

    Here's a little interview/debate from 3 years ago [pbs.org] in which he (on a whim) advocates incentivizing against uploading to Facebook and whines about the devaluation of human mental labor in the face of big data/machine learning while offering no solution:

    I mean, you have to — people have to be valued for what they actually do. The economy has to be honest. And so what I am concerned about is that by getting everybody to input all their productivity for free to these Silicon Valley companies, including the one that funds my lab, by the way, so I’m a beneficiary of what I’m criticizing — but in order to pretend that all this stuff, you know, it comes in for free, and what we give people in exchange is access to services, we’re taking them out of the economic cycle.

    We’re putting them into an informal economy, which is an unbalanced way to grow a society. And that’s also a road to ruin. I’m not asking for artificial make-work projects. I’m asking for honesty, where we acknowledge when people generate value, and make them first-class economic citizens.

    And then I think that all of these amazing schemes of automation, the self-driving cars, the 3-D printers, these will lead to a world of happy, meaningful lives, as well as great economic growth. You know, that’s the ticket, is honesty.

    Since 2014 when this segment aired, we've seen even greater reliance on tech sweatshop "mechanical turks" feeding human insight into machine learning systems. Like the army that is currently combing YouTube videos to determine which ones are potentially "inappropriate" for advertising brands and labeling the specific reasoning (profanity, violence, bullying, racism, ISISm, etc.). These classifications will eventually train an AI that can do the obviously gargantuan task much cheaper and more thoroughly than tens of thousands of humans could.
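The pipeline takyon describes--human labels in, automatic classifier out--can be sketched in miniature. This is a toy illustration with made-up data, not any real YouTube or Mechanical Turk system; it just shows how a pile of human judgments becomes word counts, and the counts become a crude classifier that no longer needs the humans:

```python
import math
from collections import Counter

# Toy sketch: human raters label a handful of video descriptions, and those
# labels train a naive-Bayes-style classifier that can then score new
# descriptions without further human input. All examples are invented.
labeled = [
    ("graphic violence in street fight", "flagged"),
    ("profanity filled rant with slurs", "flagged"),
    ("bullying and harassment compilation", "flagged"),
    ("cute cat plays with yarn", "safe"),
    ("home cooking pasta recipe", "safe"),
    ("travel vlog mountain hike", "safe"),
]

# Count how often each word appears under each label.
word_counts = {"flagged": Counter(), "safe": Counter()}
label_counts = Counter()
for text, label in labeled:
    label_counts[label] += 1
    word_counts[label].update(text.split())

def classify(text):
    """Pick the label whose training words best match the input text."""
    best_label, best_score = None, float("-inf")
    for label, counts in word_counts.items():
        total = sum(counts.values())
        score = math.log(label_counts[label] / len(labeled))
        for word in text.split():
            # +1 smoothing so unseen words don't send the score to -infinity
            score += math.log((counts[word] + 1) / (total + 1))
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

Real systems swap the Counter for a deep network and the six examples for millions of paid micro-judgments, but the economic shape is the same: once the counts exist, the raters are no longer needed.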

    Jaron talks about Google Translate devaluing translators. Google Translate had its problems back in 2014. Now it runs on machine learning and TPUs [nytimes.com]:

    The new incarnation, to the pleasant surprise of Google’s own engineers, had been completed in only nine months. The A.I. system had demonstrated overnight improvements roughly equal to the total gains the old one had accrued over its entire lifetime.

    It won't be long before it's too late to value menial data tasks, as most of them will have been completed. It will be too late to create the Mechanical Turks Union.

Lanier is in a special position to add to the outcry against this fantasy--there is more to the essence of a person than mere computational processes that can be modeled on a computer, no matter how fast the processor or how large the storage capacity.

    If the essence of a person is ideological dreck, maybe the computers are getting a good deal.

    But seriously though, machine learning is a shortcut around complete emulation or creation of a sentient entity. It accomplishes human-like performance on useful tasks without the need for a human or artificial human. Yet strong AI is still on the menu and will probably be created in secret. It is within the realm of possibility for the tech giants to create neuromorphic chips with billions of neuron-like components. They will not need to match human-scale intelligence or the human's 20 W power envelope to create a potentially useful system. And it may already exist [nextbigfuture.com] with the usual suspects like the military and NSA reaping the benefits.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 2) by AthanasiusKircher on Monday May 08 2017, @03:17PM

      by AthanasiusKircher (5291) on Monday May 08 2017, @03:17PM (#506383) Journal

      I agree with a lot of this; and thanks for the thoughts and links. After reading the article on recent advances in Google Translate, I'm interested in seeing how it does again. Last year I was asked by a friend to give some comments on a book draft in a language I'm not fluent in (though I know enough to read pretty well), and Google Translate still seemed very rough. It helped me get the gist in some passages quicker than a traditional approach with a dictionary, but I'd never mistake it for the work of a human translator. And it still made a lot of basic errors and seemed "confused" quite often by minor grammatical issues like word order, simple idioms, or obvious things like proper names. (Obviously it should go without saying that Google Translate even a few years ago was leagues ahead of what was available before Google Translate.)

      I haven't tried putting long bits of text into it since then, but perhaps I should try again...

    • (Score: 2) by pnkwarhall on Monday May 08 2017, @04:09PM

      by pnkwarhall (4558) on Monday May 08 2017, @04:09PM (#506409)

The lack of solutions to any of the specific problems Lanier brought up was the main problem I had with the book. However, I wonder if this was intentional--the preface to "You Are Not A Gadget" was IMHO cleverly constructed to challenge the reader to read with a deliberately conscious and reflective approach. Lanier ends the book by saying that the only solutions he has are ones specific to his experience and tools-at-hand (i.e. VR-based communication tools); but I think the implicit message is that the essence that makes humans special is that they can imagine, develop, and implement the missing solutions (then reflect on the outcome and start all over again!). As a "gadget" is something that unthinkingly carries out orders/tasks--like a person who reads and implements a solution without much thought about the whys, hows, and consequences--it's ours to understand the problems and find the solutions.

Your example of mechanical turks supplying the effort that makes said effort achievable by algorithm (without their input) contains the ironic assumption that, at a certain point, the algorithm will be able to duplicate the "work" of the mechanical turk. It's ironic because at its most basic this is a Turing Test, and Turing Tests say more about us than they do about the algorithm. One of Lanier's main points is that we are, to put it simply, allowing the data needs/assumptions of the algorithm to mold our own perceptions of who we are and how we function.

      If the essence of a person is ideological dreck[...]

      One man's trash is another man's treasure.

      --
      Lift Yr Skinny Fists Like Antennas to Heaven
  • (Score: 2) by kaszz on Monday May 08 2017, @08:35AM (3 children)

    by kaszz (4211) on Monday May 08 2017, @08:35AM (#506246) Journal

The ability to control fire came into use some 1.9 million years ago. And that of course hasn't changed anything.. *duck*
Wheels came into use circa 7,100 years ago. Tech, and it did affect society, no?
The first commercially successful piston steam engine delivered 3.7 kW in 1712. Needs no comment..

So tools and technology have shaped human existence since before humans were even a thing. Better deal with it. What has changed, though, is their control and ownership by corporations. The rate of technological paradigm change is also accelerating. The real game changer is perhaps computers, since the 1950s. Even if IBM machines were used to find Jews in Germany before that. Another shaping..

Providing data without vetting the recipient is exemplified by CDDB [wikipedia.org]. In short, users provided data for free. Then they got charged for its usage. So pay attention to what you contribute. And don't be a f-cking facebook drone. Abuse them if you have to, but don't ever let them get the complete picture. Which is also why BSD and GPL licensing exists.

    Another aspect to keep in mind is that information giving and taking doesn't have to be in monetary value. Information can be exchanged without being beholden to the monetary system rules. I think that is one area where the original source gets this internet thing wrong.

An area where this shaping may already be observed is the bay of ships with black flags and "evil" files, or raided machines that turn into gibberish for anyone not intended to read their contents. Society tries to regulate this, but the technical possibilities nullify it quite efficiently. Someone will figure out how to do IRS vs AI; it may be hilarious.

A mechanical Turks union will not come into existence because the turks can easily be replaced and thus have no bargaining power. And they can also be replaced in the future with AI. The catch is that the bar for what constitutes a mechanical Turk keeps rising. I.e., "humans need not apply".

Turning on a real, full AI has a significant likelihood of producing the dominant species. The creators of such a thing may think they control the AI, but the AI will eventually control everything around it.

    • (Score: 2) by VLM on Monday May 08 2017, @11:43AM (2 children)

      by VLM (445) Subscriber Badge on Monday May 08 2017, @11:43AM (#506294)

      The catch is that the bar for what constitute a mechanical Turk is constantly becoming higher.

My coworkers who were involved in the "drinking while turking" game claim it's harder now. I'm not sure if that's the traditional nostalgia for the good old days, or marketplace supply/demand dynamics, or the marketplace for dumb stuff drying up.

      • (Score: 2) by maxwell demon on Monday May 08 2017, @12:10PM (1 child)

        by maxwell demon (1608) on Monday May 08 2017, @12:10PM (#506305) Journal

        My coworkers who were involved in the "drinking while turking" game claim its harder now.

        That reminds me of another set of technologies that changed the world: Brewing, wine fermentation and distilling.

Actually, there's a theory that humans didn't start agriculture for food. Rather, they started agriculture for beer.

        --
        The Tao of math: The numbers you can count are not the real numbers.
        • (Score: 1) by pnkwarhall on Monday May 08 2017, @04:11PM

          by pnkwarhall (4558) on Monday May 08 2017, @04:11PM (#506411)

          >started agriculture for beer

          I can believe it!

          --
          Lift Yr Skinny Fists Like Antennas to Heaven
  • (Score: 2) by Phoenix666 on Monday May 08 2017, @03:20PM

    by Phoenix666 (552) on Monday May 08 2017, @03:20PM (#506386) Journal

    We're all familiar with Open Source, so it's surprising that it and its philosophy aren't looming larger in this discussion. Open Source contributors love the thing itself. They are motivated by something deeper and much better than money. That is a revolutionary idea and an existential threat to capitalism as it exists now: that people should do something meaningful and productive for free, and then enable millions of others to benefit from it.

Wags will point out that Apple, Microsoft, and many other software companies still make billions of dollars. That is because they have spent billions on propaganda to scare laymen about the dangers of open source software. Part of it, too, is social lag. Regular people going about their lives don't have the time or context to understand the issues, so they have to wait until the ideas seep in from the sides, and then steep in them a long while, before they come to understand them on a visceral level. But the day is not far off when FLOSS will win; over the 20 years I've been using Linux I've seen a dramatic change--non-technical people around me, people I didn't convert to FLOSS, run Linux now.

Beyond software, too, I see a similar progression. The Maker movement exemplifies it. People are returning to the practice of creating technology instead of outsourcing it all to some company. But, again, it's early days and there is a lag to overcome. The day a person manufactures the appliance he needs with his home additive-manufacturing setup is far off, but it will come, because the advantages are too clear to ignore.

There is a lot of trepidation now about technological progress because thus far companies have had the first-mover advantage. They have used technology as a means to concentrate wealth and control others. As barriers to entry in technological production have crashed into the basement, though, it has become possible for regular people everywhere to build what they want, when they want.

    So I'd agree with the premise that technology is not an end, but a tool. It has been used to oppress, but we can also use it to liberate. There are geeks and makers out there who are working to disintermediate corporations and increase human freedom. We should all join them.

    --
    Washington DC delenda est.
  • (Score: 2) by AthanasiusKircher on Monday May 08 2017, @04:04PM (6 children)

    by AthanasiusKircher (5291) on Monday May 08 2017, @04:04PM (#506405) Journal

I'll thank the contributor for the review, which was an interesting and thoughtful read.

    Though to me there seems to be an unnecessary and counterproductive conflation of science (or "scientific inquiry") with technological shifts. The critiques in the books, as I understand them (I looked at Technopoly a bit years ago, but I haven't read the other, though I've heard of it) are mostly about technology. But talking about "progress" in technology strikes me as very different from talking about "progress" in science. Certainly technological advances can follow scientific ones and vice versa, but science is generally about trying to find an increasing "better fit" to explanations of the workings of the natural world. Technological "progress" has no such metric.

    One can say that Einstein's theory of general relativity is a better model of physics than Newton's theory of mechanics in a somewhat objective fashion. (I'm setting aside deeper philosophical issues about empiricism for the moment.) One cannot generally make similar statements that a self-propelled power mower is objectively "better" than an old-fashioned power mower, or that both are objectively "better" than a non-powered reel mower. The former are seen to be technological "advances," but that judgment is based on some sort of subjective utility measure based on various cultural assumptions. If I really dislike noise or I'm concerned about pollution and use of fossil fuels or I just like the way it cuts my grass, I might think the reel mower is a better technological option. Whereas I can't really say that in Einstein vs. Newton -- clearly it's easier to get an estimate at normal speeds, etc. with Newton's theory (which is why we still teach it as a tool in intro courses), but there's never any question about which is actually a more accurate or complete theory.
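The "easier at normal speeds, more accurate overall" contrast can be put in numbers. A quick sketch using the standard special-relativity correction factor (nothing specific to the thread; the speeds are just illustrative):

```python
import math

C = 299_792_458.0  # speed of light in m/s

def lorentz_gamma(v):
    """Relativistic correction factor; gamma == 1 exactly when Newton is exact."""
    return 1.0 / math.sqrt(1.0 - (v / C) ** 2)

# At highway speed the correction is roughly one part in 10**14 -- far below
# anything a car or stopwatch could notice, so Newton "wins" on ease of use.
highway = lorentz_gamma(30.0)        # ~108 km/h
# At 90% of light speed the Newtonian answer is off by a factor of ~2.3.
relativistic = lorentz_gamma(0.9 * C)
```

That asymmetry is why the "better theory" question has an objective answer while the "better mower" question doesn't: both theories can be scored against the same measurements, and one simply fits better.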

This distinction is crucial because the review starts with some meditations on whether "science 'serves the common good', and that the process of scientific inquiry is the best path to finding solutions to cultural and societal problems" only to spend most of the review on technological "progress."

    And then there's the notion of "progress" in general, which is insufficiently interrogated. The problem with "cultural and societal problems" is that their "solutions" and even their very definitions are also cultural and societal, which introduces even more subjectivity into the equation. How exactly do we judge "progress"? Is the average person "happier" than they were 50 or 100 or 500 years ago? I'm not sure we have any great evidence of that. Murder rates and violent crime in general have mostly gone down over the centuries; is that "progress"? But as we make progress by some metrics, we falter by others. For example, child abductions have been declining for decades (especially by strangers), but we seem more terrified of stranger abductions than ever. Have we "solved" the social problem or merely traded the additional security of lowering already rare stranger abduction rates while increasing our general paranoid and fear level substantially (to the point that it disrupts child rearing and development)? Is that trade-off "progress"?

    Historians have been talking about all of this for a lot longer than these tech books. Back in the old days, it was a debate over so-called Whiggish history [wikipedia.org]. Generally, that entailed an overuse of a teleological approach to historiography, or in simpler terms -- historians tended to write as though the present is the "goal" and the past is a set of gradual "progress" that "leads up" to that goal. The problem of course is that historical figures didn't know the future and they frequently had very different goals and priorities than we do today. It's questionable whether they'd even consider some of our narratives to be "progress" for the things they cared about. "Progress" is very relative to one's perspective, and these tech books are a similar critique to those historians have been making about narratives of "progress" for a much longer time.

    One statement that stuck out to me in the review's conclusion:

    I believe the implicit connection is laid bare in the implications of statements on the 'March For Science' website. The core principles state, again and again, that science "does things" for society. I think it's more accurate to say that people and technology (tools) do things, both positive and negative, for society--the fruits of scientific research merely provide means to make the tools.

    Yes, that makes a lot of sense to me. Technology is about tools, and in some ways so is science (i.e., tools for understanding). But I still resist the author's need to try to superimpose arguments from these tech critics onto science. The reason the "March for Science" folks highlight technology is because "We have a better model for gravity" isn't going to mean much to most folks in their everyday lives. Technology allows a practical lens to see the possibilities created by progress in scientific knowledge. But whether that technology is "good" or "bad" or even constitutes "progress" in a social or cultural sense -- that seems to be confusing a bunch of categories.

    • (Score: 2) by pnkwarhall on Monday May 08 2017, @05:22PM (5 children)

      by pnkwarhall (4558) on Monday May 08 2017, @05:22PM (#506440)

      First, thank you for your compliments. "Technopoly" is definitely worth another look--although it does focus on technological shifts, it contains an interesting perspective about how the mindset of "scientific inquiry as road to truth" led to the devaluation of that which could not be scientifically explored. I would also agree that the conflation of "science" and "technology" is wrong-headed--but my POV (as well as Postman's) is that this substitution/misunderstanding is not only the current general perspective, but a main source of societal problems.

Cultures and societies are the implemented solutions to fundamental human problems--Postman's main point is that a "technopoly" is a specific type of society (i.e. solution) to human problems that has serious negative consequences because it doesn't address, indeed ignores, vital parts of the human experience. Your statement that technology "allows a practical lens to see ["experience" is a more useful word, IMO] possibilities" for progress is responded to directly by both authors with the assertion that, instead of "allowing" (i.e. offering freedom) experience of possibilities, technological choices constrain the possibilities for human progress along particular paths. Lanier in particular offers concrete examples of more-or-less arbitrary software design choices that, due to widespread adoption of the specific software, became "baked into" not only further software development but actual cultural evolution and societal structure, through the spread of fundamental memes with which we understand ourselves and our role in society.

      I think it's interesting you mention that these ideas and criticisms are not new. (Postman's book starts with a reference to an ancient Egyptian argument about the societal value of the adoption of technologies.) One of my personal observations (particularly on Reddit) is that many people seem to believe that ancient cultures did not have similar debates, that in the modern era we are somehow vitally different (and advanced) in our perspectives because of our scientific and technological progress!

      --
      Lift Yr Skinny Fists Like Antennas to Heaven
      • (Score: 2) by AthanasiusKircher on Monday May 08 2017, @06:48PM (4 children)

        by AthanasiusKircher (5291) on Monday May 08 2017, @06:48PM (#506492) Journal

        Interesting. By the way, I absolutely agree with what you said about technology constraining as well as allowing possibilities of scientific application. My point in my previous post was restricted to the specific context about the "March for Science" that you mentioned in your review. I think the participants there were trying to emphasize practical application instead of pure scientific theory and knowledge, and that's what I meant by "allowing," i.e., the expansion from the theoretical to the practical. But you're right: the specific practical applications that end up becoming popular technologies then often constrain perspectives and possibilities.

        I also briefly looked at some other reviews I could find on Technopoly, and I have to say that my memory of it was muddled and perhaps got a bit more confused by seeing your joint review along with the other book. As you note, Postman's critique is much broader than technology, and he is critiquing what is generally termed scientism [wikipedia.org], which both advocates for the expansion of "scientific" methods to all areas of human life and often argues against alternative (more "human" or "humanistic") approaches.

        I sympathize with that view too. About 10-15 years ago I was involved in discussions about a couple projects that would now fall under what people call "digital humanities." At the time, I was kind of intrigued by the possibility of collecting "data" and manipulating it. I really thought the path forward to some traditional problems in that humanities field was basically through statistical analysis of large bodies of data.

Although my direct involvement with the project was brief, within a few years such approaches started to become commonplace. And the result was, frankly, TERRIBLE. The very criticisms I had leveled at earlier research (arbitrariness, subjective judgments, focusing on the research author's interests or aims while excluding alternative possibilities) were simply translated into statistical methods. But now they appeared to have greater authority, since they were based in "data" and had math to back them up.

I now realize that the earlier problems in the field weren't due to lack of "scientific" (or scientistic) rigor, but rather arose because research is always done by humans, and humans are flawed. Asking bad questions or using bad methodology with large datasets or statistics doesn't produce better results than asking the same bad questions without as much data or stats. Perhaps even more than the technological concerns, to me the resonance with Postman (at least from your review and the few others I just skimmed) is with the idea that information must be "the answer" to everything. But more information (omnipresent as a buzzword these days) just means it's even harder to create good interpretations of that information, which is often the key to understanding it.

        So, yes, I agree with Postman's critiques of things like using IQ scores as a proxy for "intelligence" or assuming opinion polls really measure "what the public believes" rather than how the question was asked. Obviously a more quantitative and numerical approach to "human" problems can have benefits, but only if we recognize stats and data as one possible set of "tools" that can be used for good and bad too. It's interesting to think of Postman's criticism in relation to the recent controversies in many areas of science where results of major studies (that have been frequently cited) are then shown to actually not be reproducible when the experiments are run again. There are all sorts of flaws generally happening in these cases -- from confirmation bias in experimental design to data collection abnormalities to statistical errors to publication bias -- but at its heart what these things are pointing out again and again to me is that science is a human and social endeavor. And the human and social aspects are clearly not working well if we have lots of research being promoted that turns out to be wrong (because we don't encourage publication of negative results or reproducibility experiments or whatever).
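The publication-bias mechanism behind the reproducibility controversies is easy to demonstrate by simulation. This is a toy model, not a claim about any particular field: run many "studies" where the true effect is zero, and let only the statistically significant ones into the "literature":

```python
import math
import random

def p_value_equal_means(a, b):
    """Two-sided p-value for equal means (normal z-approximation).

    Assumes equal group sizes; a t-test would be more exact for small n.
    """
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    var_a = sum((x - mean_a) ** 2 for x in a) / (n - 1)
    var_b = sum((x - mean_b) ** 2 for x in b) / (n - 1)
    z = abs(mean_a - mean_b) / math.sqrt(var_a / n + var_b / n)
    # standard normal CDF via erf, folded into a two-sided tail probability
    return 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))

rng = random.Random(42)
published = 0
for _ in range(1000):
    # Both groups come from the SAME distribution: every "effect" is noise.
    a = [rng.gauss(0.0, 1.0) for _ in range(20)]
    b = [rng.gauss(0.0, 1.0) for _ in range(20)]
    if p_value_equal_means(a, b) < 0.05:  # only "significant" results get printed
        published += 1
```

Roughly 5% of these null studies clear the significance bar by chance, and if only those are published, 100% of the resulting literature is a false positive--which is the structural point about not publishing negative results.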

        Not only is science sometimes failing at being a good tool for understanding humanity, but the ejection of "human" concerns from science (or at least the failure to acknowledge their strong influence) has caused more holes to appear in actual scientific "progress" too.

        • (Score: 2) by pnkwarhall on Monday May 08 2017, @08:03PM (3 children)

          by pnkwarhall (4558) on Monday May 08 2017, @08:03PM (#506537)

          I'm aware of "scientism", and thanks for pointing it out... I really should have included that term/concept in the piece.

          Your point about "digital humanities" and statistics-based "science" is actually very close to one of the criticisms that led me down my current thought path. I was unaware until recently that there was this whole branch of scientific inquiry based around statistical inference, and my original criticisms of social science research were based on ideological principles. I guess there's a soylent commenter who consistently complains about 'null hypothesis statistical significance testing' (whose comments led me to look it up), and when I found out what it entailed, my principled skepticism seemed justified by the approach's reliance on way too many assumptions and simplifications.
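          To make the NHST complaint concrete, here's a toy sketch (all numbers invented, using a simple one-sample z-test rather than any particular study's method): with a large enough sample, even a practically negligible effect clears the conventional p < 0.05 bar, which is one of the simplifications critics object to.

          ```python
          # Toy illustration of a standard NHST criticism (invented numbers):
          # the same tiny effect is "not significant" with a modest sample
          # but "highly significant" with a huge one.
          import math

          def z_test_p(sample_mean, pop_mean, sigma, n):
              """Two-sided p-value for a one-sample z-test."""
              z = (sample_mean - pop_mean) / (sigma / math.sqrt(n))
              return math.erfc(abs(z) / math.sqrt(2))  # = 2 * P(Z > |z|)

          # A 0.1-point shift on a scale with SD 15: negligible in practice.
          p_small_n = z_test_p(100.1, 100.0, 15.0, n=1_000)
          p_large_n = z_test_p(100.1, 100.0, 15.0, n=1_000_000)

          print(p_small_n)  # well above 0.05
          print(p_large_n)  # well below 0.05 -- same tiny effect
          ```

          The p-value here measures detectability, not importance, which is why significance alone says so little about whether an effect matters.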

          --
          Lift Yr Skinny Fists Like Antennas to Heaven
          • (Score: 2) by aristarchus on Tuesday May 09 2017, @08:37AM (2 children)

            by aristarchus (2645) on Tuesday May 09 2017, @08:37AM (#506790) Journal

            I was unaware until recently that there was this whole branch of scientific inquiry based around statistical inference, and my original criticisms of social science research were based on ideological principles.

            Stats do not have ideology, unless they are fake stats, but then, those are not that hard to come by.

            I guess there's a soylent commenter who consistently complains about 'null hypothesis statistical significance testing' (whose comments led me to look it up), and when I found out what it entailed, my principled skepticism seemed justified by the approach's reliance on way too many assumptions and simplifications.

            You should be equally skeptical of the "null hypothesis" skeptic; this seems to be a meme floating around these days, and I hear it from people who listen to right-wing radio and otherwise have no interest in science. Not saying the criticism is unfounded, but it is unfounded until it is understood.

            But nice reviews, Pink Narwal!!

            • (Score: 1) by khallow on Tuesday May 09 2017, @02:07PM

              by khallow (3766) Subscriber Badge on Tuesday May 09 2017, @02:07PM (#506909) Journal

              Stats do not have ideology

              Counterexample: finding the statistics that cast the issue in the best light for your ideology. For example, if you want to present the US economy in a good light, speak of the average US household income of $52k per year. If you want to present it in a bad light, speak of the 38% of wealth owned by the 1%. Statistics, even when valid, are merely a viewpoint. And it is easy to choose advantageous viewpoints.
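              A quick sketch of how this works on a single dataset (all incomes invented, chosen to be skewed): the mean, the median, and the top earner's share of the total each summarize the same numbers, yet support opposite headlines.

              ```python
              # Same invented, skewed income data; three "true" statistics,
              # three different stories.
              from statistics import mean, median

              # Nine modest incomes plus one very large one (hypothetical).
              incomes = [20_000, 25_000, 30_000, 30_000, 35_000,
                         40_000, 45_000, 50_000, 60_000, 665_000]

              avg = mean(incomes)                      # pulled upward by the outlier
              med = median(incomes)                    # barely notices the outlier
              top_share = max(incomes) / sum(incomes)  # concentration at the top

              print(avg)        # 100000.0 -- "average income is six figures!"
              print(med)        # 37500.0  -- "the typical household earns far less"
              print(top_share)  # 0.665    -- "one earner holds 66.5% of all income"
              ```

              None of the three numbers is fake; the ideology enters in which one gets quoted.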

            • (Score: 1) by pnkwarhall on Wednesday May 10 2017, @11:04PM

              by pnkwarhall (4558) on Wednesday May 10 2017, @11:04PM (#507792)

              >Pink Narwal

              Thanks, I'll have to remember that one [i.redd.it].

              --
              Lift Yr Skinny Fists Like Antennas to Heaven
(1)