
SoylentNews is people

posted by n1 on Friday July 11 2014, @07:57AM   Printer-friendly
from the quantity-matters-quality-is-helpful dept.

Research into how frequently scientists publish has found that fewer than 1% of scientists who published at least one paper in the last 16 years managed to publish at least one paper in every one of those years (having what the article calls an "uninterrupted, continuous presence" [UCP]). Yet this small group published 41.7% of all papers in that period, and 87.1% of all papers with more than 1,000 citations.

While it may seem obvious that publishing more papers means collecting more citations, the research indicates that even after conditioning on the number of papers published, total citations were higher for UCP authors than for non-UCP authors.

Skipping even a single year substantially affected the average citation impact. We also studied the birth and death dynamics of membership in this influential UCP core, by imputing and estimating UCP-births and UCP-deaths. We estimated that 16,877 scientists would qualify for UCP-birth in 1997 (no publication in 1996, UCP in 1997-2012) and 9,673 scientists had their UCP-death in 2010. The relative representation of authors with UCP was enriched in Medical Research, in the academic sector and in Europe/North America, while the relative representation of authors without UCP was enriched in the Social Sciences and Humanities, in industry, and in other continents.

Authors with uninterrupted, continuous presence over all these 16 years eventually had a much higher citation impact than other authors. To some extent this higher impact is generated through a larger volume of published papers. However, the citation impact of the UCP authors goes beyond just publishing more papers. Even after conditioning on the number of papers, the total citations and h-index of their work were higher than those of non-UCP authors; the exception was authors with fewer than 3 papers per year, for whom there was no discernible difference in citation impact regardless of whether they had UCP or not.
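For readers unfamiliar with the h-index mentioned above: a scientist has index h if h of their papers each have at least h citations. A minimal sketch (the function name and sample citation counts are illustrative, not from the study):

```python
def h_index(citations):
    """h-index: largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:   # the rank-th paper still has >= rank citations
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4 papers with >= 4 citations -> 4
print(h_index([25, 8, 5, 3, 3]))  # -> 3
```

This is why the study compares h-index only after conditioning on paper count: the measure can rise both with quality (more citations per paper) and with sheer volume.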

Related Stories

SAGE Retracts 60 Scientific Papers due to Fraud Concerns 11 comments

This seems to be one of the biggest cases of scientific misconduct ever:

On July 8, scientific publisher SAGE announced that it was retracting a whopping 60 scientific papers connected to Taiwanese researcher Peter Chen, in what appears to be an elaborate work of fraud.

This case is one of what appears to be a recent spate of scientific malfeasance. So what's going on here? Is this just a uniquely bad run? Or does the recent spate of scientific misconduct point to a flaw in the peer-review process? Vox.com provides a rundown.

The Chen case is quite astounding. Publisher SAGE announced it was retracting 60 papers from 2010-2014 in the Journal of Vibration and Control, which covers acoustics, all connected to Peter Chen of National Pingtung University of Education, Taiwan.

Chen allegedly created up to 130 fake email accounts to create a 'peer review and citation ring'.

  • (Score: 3, Interesting) by jimshatt on Friday July 11 2014, @08:06AM

    by jimshatt (978) on Friday July 11 2014, @08:06AM (#67492) Journal
    I suppose the results vary heavily based on field of science. Medicine is a highly competitive branch of science with lots of publications. Not publishing often means drowning in irrelevancy. This also means that a lot of publications are meaningless cruft.
    This is not to say that there aren't any meaningless publications in physics or other harder branches of science. But not publishing for one or more years is less of a problem, especially if you want to present something meaningful.
  • (Score: 0) by Anonymous Coward on Friday July 11 2014, @08:16AM

    by Anonymous Coward on Friday July 11 2014, @08:16AM (#67497)

    Like me on facebook!
    Follow me on twitter!
    Subscribe me on youtube!

    Cite me in your papers!!

    They're all popularity contests.

    • (Score: 1, Interesting) by Anonymous Coward on Friday July 11 2014, @11:18AM

      by Anonymous Coward on Friday July 11 2014, @11:18AM (#67531)

      One hopes, though, that scientific popularity is based more on one's ability to consistently say something interesting than on one's ability to find cute cat pictures.

      It looks the same as a popularity contest. Personality is important: no one is going to hunt you down at a conference to see if you have something interesting to say; few people will come back if you swear at or insult them. And, honestly, even scientists have trouble distinguishing between "Dr. Jones is a really fun dinner companion" and "Dr. Jones does really good research."

      Research is a positive feedback loop: good results improve funding; strong funding gets better results. A year without publications is recognized by tenure review committees and by grant review committees as a question of productivity. A year without funding...

      • (Score: 2) by Oligonicella on Friday July 11 2014, @02:26PM

        by Oligonicella (4169) on Friday July 11 2014, @02:26PM (#67590)

        "strong funding gets better results." - Ha! Nice presumption based on nothing but hope.

  • (Score: 2) by aristarchus on Friday July 11 2014, @08:27AM

    by aristarchus (2645) on Friday July 11 2014, @08:27AM (#67499) Journal

Could someone estimate, please, the signal-to-noise ratio in scientific publication? Or perhaps in academic publication as a whole. I would suggest that no more than 1% are worth reading, and fewer than 0.001% are going to be influential. The problem is, we will never know, until...

    • (Score: 2) by RamiK on Friday July 11 2014, @09:21AM

      by RamiK (1813) on Friday July 11 2014, @09:21AM (#67511)

Academic publications as a whole are irrelevant here; in reviewing the lot of them you'd be needlessly reviewing the humanities and social studies as well.

      --
      compiling...
    • (Score: 3, Interesting) by Sir Garlon on Friday July 11 2014, @01:14PM

      by Sir Garlon (1264) on Friday July 11 2014, @01:14PM (#67559)

      Could someone estimate, please, the signal noise ratio in scientific publication?

      Of course not.

The boundary between a good paper and an average paper is pretty much a matter of opinion. The same goes for the difference between an average paper and a weak paper, and between a weak paper and a complete waste of ink.

      People's opinions vary.

      --
      [Sir Garlon] is the marvellest knight that is now living, for he destroyeth many good knights, for he goeth invisible.
      • (Score: 1) by Wootery on Friday July 11 2014, @02:44PM

        by Wootery (2341) on Friday July 11 2014, @02:44PM (#67596)

        pretty much a matter of opinion

        Which is why there are objective measures such as impact factor [wikipedia.org].

        • (Score: 2) by maxwell demon on Friday July 11 2014, @06:35PM

          by maxwell demon (1608) on Friday July 11 2014, @06:35PM (#67759) Journal

I hope you meant that as a joke. The impact factor of the journal a scientific work is published in does not tell you much about the scientific quality of the paper; it tells you more about its visibility. An excellent paper in an obscure journal will very likely get fewer citations than a mediocre paper in a high-visibility journal. The impact factor is mostly interesting if you want to publish your paper there: the higher the impact factor, the more visibility the paper gets (if it gets accepted).

          From the very Wikipedia article you linked to:

          Because "the impact factor is not always a reliable instrument", in November 2007 the European Association of Science Editors (EASE) issued an official statement recommending "that journal impact factors are used only—and cautiously—for measuring and comparing the influence of entire journals, but not for the assessment of single papers, and certainly not for the assessment of researchers or research programmes".
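For concreteness, the standard two-year impact factor is a journal-level average: citations received in year Y to items the journal published in years Y-1 and Y-2, divided by the number of citable items published in those two years. A toy sketch (the function name and numbers are made up for illustration):

```python
def impact_factor(citations_this_year, citable_items_prev_two_years):
    """Two-year journal impact factor: citations received this year to
    articles from the previous two years, divided by the number of
    citable articles the journal published in those two years."""
    return citations_this_year / citable_items_prev_two_years

# e.g. 500 citations in 2013 to articles from 2011-2012,
# across 200 citable articles published in 2011-2012:
print(impact_factor(500, 200))  # -> 2.5
```

Being an average over the whole journal, it says nothing about how citations are distributed across individual articles, which is exactly the objection above.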

          --
          The Tao of math: The numbers you can count are not the real numbers.
          • (Score: 3, Insightful) by aristarchus on Friday July 11 2014, @06:56PM

            by aristarchus (2645) on Friday July 11 2014, @06:56PM (#67763) Journal

Hmm, only opinion? But surely some opinions are better informed than others, and those would be the ones that should be screening the submissions to scientific journals. Otherwise we could just go by the number of "likes" on Facebook! (Which, come to think of it, is another measure of impact, no?) And indeed, I was aiming at something more substantial: the judgment of History, which, while it is just an opinion, as Hegel said, its opinion is just the truth.

            • (Score: 2) by maxwell demon on Saturday July 12 2014, @06:20AM

              by maxwell demon (1608) on Saturday July 12 2014, @06:20AM (#67994) Journal

              The impact factor doesn't tell you how well the review process works (the review process is what does the screening). The impact factor tells you how many people cite papers from that journal (which depends on a lot of factors; it definitely does not only depend on the quality of the articles).

To take up your Facebook analogy: taking the impact factor of the journal as a measure of the quality of a single article published in it would be like measuring the quality of a web site by the total likes for all web sites run by the same hosting provider.

              --
              The Tao of math: The numbers you can count are not the real numbers.
              • (Score: 2) by aristarchus on Saturday July 12 2014, @08:33AM

                by aristarchus (2645) on Saturday July 12 2014, @08:33AM (#68016) Journal

Oh, Demon of Maxwell! (And how well your user name is chosen!) We are in complete agreement! I have often said (something like) "judge no philosopher before their time." This actually means: do not judge a philosopher until they have been dead for at least a hundred years. The expiration date on scientists may be shorter, but not if we want to elevate them to the level of philosophers. There have been many "stars" in science and philosophy, but I always ask, "who reads Christian Wolff these days?" The answer is, wait for it, almost no one.

                The main take away, since we are in the age of stupid phrases like "take away", is that there is nothing to take away from current academic endeavors, since they operate on the time scales of aeons, not some stupid quarterly spreadsheet cooked up by the lesser minds of the "School of Business". School of what? That is not a discipline, nor an area of study; it is only a bunch of con-artists trying to justify their existence in an institution of higher education where they have no place, and so now they recoil upon the institution with the venom of "accountability" to finally kill their host, like the idiotic parasites that they are. Business, school, non comprehendum et stupido maximo.

Now, back to the original question: yes, it will all come out in the wash. Fraud and popularity and the Kardashians, all these will fade away, but the true purpose of science and philosophy, the pursuit of truth, will always triumph in the end. Else, there will be no end.

          • (Score: 1) by Wootery on Sunday July 20 2014, @11:11PM

            by Wootery (2341) on Sunday July 20 2014, @11:11PM (#71639)

Fair points. If astrologers started a journal, their citation patterns wouldn't make any difference to the scientific merit of the works - argumentum ad populum, and all that.

  • (Score: 0) by Anonymous Coward on Friday July 11 2014, @02:25PM

    by Anonymous Coward on Friday July 11 2014, @02:25PM (#67589)

    Who would have thunk that science would have celebrities as well.

    Yes, those that write more often will get noticed more, those that get noticed get followed, those that are followed get cited. Wow.