
posted by CoolHand on Tuesday May 26 2015, @06:03PM
from the return-to-mysticism dept.

Richard Horton writes that a recent symposium on the reproducibility and reliability of biomedical research discussed one of the most sensitive issues in science today: the idea that something has gone fundamentally wrong with science (PDF), one of our greatest human creations. The case against science is straightforward: much of the scientific literature, perhaps half, may simply be untrue. Afflicted by studies with small sample sizes, tiny effects, invalid exploratory analyses, and flagrant conflicts of interest, together with an obsession with pursuing fashionable trends of dubious importance, science has taken a turn towards darkness. According to Horton, editor-in-chief of The Lancet, a United Kingdom-based medical journal, the apparent endemicity of bad research behaviour is alarming. In their quest to tell a compelling story, scientists too often sculpt data to fit their preferred theory of the world or retrofit hypotheses to fit their data.

Can bad scientific practices be fixed? Part of the problem is that no one is incentivized to be right. Instead, scientists are incentivized to be productive and innovative. Tony Weidberg says that, following several high-profile errors, the particle physics community now invests great effort in intensive checking and rechecking of data prior to publication. By filtering results through independent working groups, physicists are encouraged to criticize. Good criticism is rewarded. The goal is a reliable result, and the incentives for scientists are aligned around this goal. "The good news is that science is beginning to take some of its worst failings very seriously," says Horton. "The bad news is that nobody is ready to take the first step to clean up the system."


[Editor's Comment: Original Submission]

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Insightful) by Thexalon (636) on Tuesday May 26 2015, @08:14PM (#188242)

    More to the point, "publish or perish" is all about attempting to quantify something that really isn't quantifiable, so that it can be turned into performance metrics. Those metrics can then be used as an excuse to cut people and keep the salaries of scientists from getting too expensive from the point of view of college administrators (whose own salaries, of course, need to be increased dramatically).

    As soon as you create a performance metric, smart people will find a way to fake it, guaranteed.

    --
    The only thing that stops a bad guy with a compiler is a good guy with a compiler.
  • (Score: 0) by Anonymous Coward on Wednesday May 27 2015, @12:35PM (#188563)

    As soon as you create a performance metric, smart people will find a way to fake it, guaranteed.

    Indeed, and sometimes it really bites you: https://en.wikipedia.org/wiki/Cobra_effect [wikipedia.org]

  • (Score: 0) by Anonymous Coward on Wednesday May 27 2015, @02:08PM (#188609)

    More to the point, "publish or perish" is all about attempting to quantify something that really isn't quantifiable, so that it can be turned into performance metrics that can then be used as an excuse to cut people and keep the salaries of scientists from getting too expensive from the point of view of college administrators (the administrators' salaries, of course, need to be increased dramatically).

    As soon as you create a performance metric, smart people will find a way to fake it, guaranteed.

    Without a metric, how would you determine which of the thousands of scientists our limited resources should be funneled to? If you agree to use a metric, which metric would be better than the admittedly terrible publish-or-perish model?

    Put it this way. A random stranger comes up to you and says, "Give me money to study exothermaldynamics. No, I won't promise you any results or anything else. You can trust that I'm using this money well; I'm a scientist."

    Maybe if you knew this person personally you would trust them. Maybe if you were proficient enough in exothermaldynamics to confidently distinguish a legitimate question from a pure money grab, you would give them some money. But neither of those scales at all. So what is the better model we should be using?

    • (Score: 0) by Anonymous Coward on Wednesday May 27 2015, @03:25PM (#188649)

      Well, in the 1600-1700s science was done by people with wealthy and powerful patrons. The scientists would come over for dinner and entertain the guests. If the science was not interesting enough, the scientists would also get into feuds with each other, which would still entertain and contribute to a kind of "science race" amongst the patrons.

      I don't necessarily think that worked better, but it is one option.

    • (Score: 3, Insightful) by Thexalon (636) on Wednesday May 27 2015, @03:31PM (#188653)

      Put it this way. A random stranger comes up to you and says, "Give me money to study exothermaldynamics. No, I won't promise you any results or anything else. You can trust me I'm using this money well, I'm a scientist."

      Nobody is suggesting that. There's a fairly good way of vetting somebody who's trying to get grant money:
      1. First, you check the random stranger's educational qualifications. For example, an exothermaldynamicist would be expected either to have a doctorate or to be working on their dissertation. You would want them to have done well in their coursework, which you can verify from their college transcripts.
      2. Second, you check their previous work, if any. If they're somebody new working on their dissertation or something like that, you'll understand their not having much of a record, but you'll probably be a bit stingier with the grants.
      3. Third, you ask the acknowledged experts in exothermaldynamics what they think of the proposal and the person who's proposing it. Answers like "total crackpot!" or "hmm, there might be something to that, it would be worth a try" should give you some good guidance.
      4. Fourth, you get opinions on the random stranger from everybody who knows or has worked with them, particularly academic advisors and professors and such.

      The fact that there are far more qualified scientists out there than there is funding for them is truly shameful, because it means that we're intentionally holding back the rate of scientific discovery due to a fear of losing small green pieces of paper.
