
posted by n1 on Wednesday November 19 2014, @01:33AM
from the peer-reviewed-study-confirms-it dept.

Phys.org is running a story on some of the issues with modern peer review:

Once published, the quality of any particular piece of research is often measured by citations, that is, the number of times that a paper is formally mentioned in a later piece of published research. In theory, this aims to highlight how important, useful or interesting a previous piece of work is. More citations are usually better for the author, although that is not always the case.

Take, for instance, Andrew Wakefield's controversial paper on the association between the MMR jab and autism, published in leading medical journal The Lancet. This paper has received nearly two thousand citations – most authors would be thrilled to receive a hundred. However, the quality of Wakefield's research is not at all reflected by this large number. Many of these citations are a product of the storm of controversy surrounding the work, and are contained within papers which are critical of the methods used. Wakefield's research has now been robustly discredited, and the paper was retracted by The Lancet in 2010. Nevertheless, this extreme case highlights serious problems with judging a paper or an academic by number of citations.

Personally, I've been of the opinion for quite a while that peer review is all but worthless. It's nice to know I'm not the only one who has issues with the process.

 
  • (Score: 3, Informative) by dltaylor (4693) on Wednesday November 19 2014, @01:45AM (#117479)

    Peer review and citations have nothing in common. A paper is peer-reviewed PRIOR to publication to evaluate its fitness for publication. The reviewers of Wakefield's paper were incompetent and should never be allowed to review another. There is a later review, of a sort, by researchers trying to build on previous work; if they cannot re-establish the baseline from the prior work, it may be found flawed (and then there are papers like the FTL-neutrino one, where the team was hoping that someone would either find the flaw they suspected was there, or give them a Nobel).

    Citations are a post-publication figure of merit, usually associated with work on which others successfully build. A simple fix for papers like Wakefield's is to have negative citations not count as simple citations (a rough sketch of the idea follows below).

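    A minimal sketch of how such an adjusted count might work, in hypothetical Python. It assumes each citation has already been labelled as supportive, neutral, or critical (by manual review or a citation-context classifier; the labels and function name here are illustrative, not any existing system's API):

        from collections import Counter

        def adjusted_citation_count(labels):
            """Count citations, excluding ones labelled 'critical'.

            labels: iterable of 'supportive' / 'neutral' / 'critical'.
            """
            counts = Counter(labels)
            # Critical citations no longer inflate the total.
            return counts['supportive'] + counts['neutral']

        # A Wakefield-like citation profile: mostly critical citations.
        labels = ['critical'] * 1500 + ['neutral'] * 300 + ['supportive'] * 200
        print(len(labels))                      # 2000 raw citations
        print(adjusted_citation_count(labels))  # 500 once critical ones are excluded

    Under these assumptions, a heavily criticised paper stops looking like a high-impact one; the hard, unsolved part is producing the labels in the first place.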
  • (Score: 2) by cafebabe (894) on Wednesday November 19 2014, @02:29AM (#117488)

    have negative citations not count as simple citations.

    I wish that applied to hyperlinks too.

    --
    1702845791×2
  • (Score: 2) by TheRaven (270) on Wednesday November 19 2014, @09:28AM (#117573)

    Citations are a post-publication figure of merit, usually associated with work on which others successfully build. A simple fix for papers like Wakefield's is to have negative citations not count as simple citations.

    The UK's REF (the system that evaluates research output from universities) explicitly did not include bibliometrics (for computer science, at least) because they are generally pretty poor at judging research impact. Papers with catchy titles end up cited a lot when someone wants a citation for some broad area, and papers about tools and techniques are vastly over-cited. For example, the vast majority of computer architecture papers cite gem5, which is a popular simulator. That doesn't mean that gem5 is particularly interesting research (or that the numbers it generates are in any way useful - it's very easy to abuse it and come up with complete nonsense results).

    --
    sudo mod me up
  • (Score: 0) by Anonymous Coward on Wednesday November 19 2014, @09:39AM (#117577)

    This, a thousand times...

  • (Score: 1) by mathinker (3463) on Wednesday November 19 2014, @08:19PM (#117831)

    You're totally correct that the quoted passage is unrelated to the stated problem. But there's a third elephant in the room: much research never even sees a replication attempt, because replication papers (whether negative or positive) are less likely to be published and do not confer enough "academic karma" to justify the effort involved.