

posted by Fnord666 on Thursday August 22 2019, @02:07PM
from the you-scratch-my-back-and-I'll-scratch-yours dept.

Submitted via IRC for SoyCow3196

Hundreds of extreme self-citing scientists revealed in new database

The world's most-cited researchers, according to newly released data, are a curiously eclectic bunch. Nobel laureates and eminent polymaths rub shoulders with less familiar names, such as Sundarapandian Vaidyanathan from Chennai in India. What leaps out about Vaidyanathan and hundreds of other researchers is that many of the citations to their work come from their own papers, or from those of their co-authors.

Vaidyanathan, a computer scientist at the Vel Tech R&D Institute of Technology, a privately run institute, is an extreme example: he has received 94% of his citations from himself or his co-authors up to 2017, according to a study in PLoS Biology this month. He is not alone. The data set, which lists around 100,000 researchers, shows that at least 250 scientists have amassed more than 50% of their citations from themselves or their co-authors, while the median self-citation rate is 12.7%.

The study could help to flag potential extreme self-promoters, and possibly 'citation farms', in which clusters of scientists massively cite each other, say the researchers. "I think that self-citation farms are far more common than we believe," says John Ioannidis, a physician at Stanford University in California who specializes in meta-science — the study of how science is done — and who led the work. "Those with greater than 25% self-citation are not necessarily engaging in unethical behaviour, but closer scrutiny may be needed," he says.
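As a rough illustration of the metric involved (a sketch with made-up paper IDs and author names, not the study's actual method or data), a self-citation rate can be computed as the share of an author's incoming citations that come from the author or their co-authors:

    # Minimal sketch: self-citation rate = share of citations to an author's
    # papers that come from the author or their co-authors.
    # All paper IDs and author names below are hypothetical.
    def self_citation_rate(author, papers, citations):
        """papers: {paper_id: set of author names}
           citations: list of (citing_paper_id, cited_paper_id) pairs."""
        own_papers = {pid for pid, names in papers.items() if author in names}
        if not own_papers:
            return 0.0
        # everyone who has co-authored one of this author's papers (incl. the author)
        coauthors = set().union(*(papers[pid] for pid in own_papers))

        incoming = [citing for citing, cited in citations if cited in own_papers]
        if not incoming:
            return 0.0
        self_cites = sum(1 for citing in incoming if papers[citing] & coauthors)
        return self_cites / len(incoming)

    # Toy example: two of the three incoming citations come from the author's circle.
    papers = {
        "P1": {"A. Author", "B. Coauthor"},
        "P2": {"A. Author"},
        "P3": {"C. Unrelated"},
    }
    citations = [("P2", "P1"), ("P1", "P2"), ("P3", "P1")]
    print(self_citation_rate("A. Author", papers, citations))  # ~0.67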


Original Submission

 
  • (Score: 3, Insightful) by Snospar on Thursday August 22 2019, @03:21PM (7 children)

    by Snospar (5366) Subscriber Badge on Thursday August 22 2019, @03:21PM (#883668)

    250 out of 100,000 doesn't seem like a huge problem. It must be common to cite previous papers you've worked on (with or without co-authors), especially if you're working in a relatively niche field. I'd agree that 94% self-citation looks like an issue, and I guess the closer scrutiny above 25% would be a good thing.

    A bigger problem, IMHO, is those self-proclaimed scientists with almost zero citations to their name. Surely all this data should be out in the open: not to foster a blame culture or encourage finger-pointing, but so that regular review can take place and outliers can be detected and, if necessary, investigated. An awful lot of science is funded from the public purse, and data and details like this should be open and available to the taxpayers who funded it.

    --
    Huge thanks to all the Soylent volunteers without whom this community (and this post) would not be possible.
  • (Score: 1, Insightful) by Anonymous Coward on Thursday August 22 2019, @03:39PM

    by Anonymous Coward on Thursday August 22 2019, @03:39PM (#883676)

    People in very niche areas of a field may not even attract much attention. So if you have 10 citations, and 9 are from your other papers, then at least someone else read something.

    And just because it's "out there" doesn't mean anyone else gives a flying fuck about it. Hence, only self-citations. ;)

    Anyway, still much better than 90+% of Twitter users, who are mostly raging against themselves.

  • (Score: 0) by Anonymous Coward on Thursday August 22 2019, @03:58PM

    by Anonymous Coward on Thursday August 22 2019, @03:58PM (#883686)

    250 out of 100k is a big problem if citations follow a power-law distribution where 95% of researchers are essentially irrelevant.
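    A rough sketch of that point (synthetic numbers and an arbitrary tail parameter, not the actual citation data): with a heavy-tailed, power-law-like distribution, a tiny fraction of authors ends up holding a disproportionate share of all citations.

        # Rough illustration with synthetic data: under a heavy-tailed
        # (Pareto/power-law-like) distribution of citation counts, the top
        # 250 of 100,000 authors can hold a large share of all citations.
        import numpy as np

        rng = np.random.default_rng(0)
        counts = rng.pareto(a=1.5, size=100_000)   # hypothetical citation counts; a=1.5 is arbitrary
        counts.sort()

        top_share = counts[-250:].sum() / counts.sum()
        print(f"top 250 authors hold {top_share:.1%} of all citations")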

  • (Score: 5, Insightful) by ikanreed on Thursday August 22 2019, @04:21PM (4 children)

    by ikanreed (3164) Subscriber Badge on Thursday August 22 2019, @04:21PM (#883692) Journal

    The problem behind the lack of public knowledge isn't closed journals. It's that a lot of scientific papers are, well, super incremental. Most of the ones you read about in the news are actually retreads of well-studied things with a fun narrative and a better PR machine behind them.

    If you go to actual second- or third-tier journals doing real research, papers are titled things like Cloning and Expression Analysis of GmCYP78A5 Promoter [sciencepublishinggroup.com]. They transplanted a single gene into tobacco plants, analyzed how much the expression of other genes changed, and found that the gene behaves similarly in both soybeans and tobacco. It's relevant if you're a botanist thinking of working with the metabolic pathways that gene interacts with. But the implications for someone like you or me are almost nothing.

    It's of public interest, because it broadly advances science, but it's basic research. Only people super familiar with the specific field are going to make use of it. You get zero citations until that research becomes relevant to future research. Citations don't come from the ether.

    • (Score: 2) by Snospar on Thursday August 22 2019, @04:57PM (2 children)

      by Snospar (5366) Subscriber Badge on Thursday August 22 2019, @04:57PM (#883709)

      Totally agree. The rate of advance in a field can cause a flurry of activity with numerous (valid) citations between related papers. On the other hand, you may find someone referring back to a 30-year-old paper that's still valid but no longer as active... almost geological time scales.

      --
      Huge thanks to all the Soylent volunteers without whom this community (and this post) would not be possible.
      • (Score: 2) by ikanreed on Thursday August 22 2019, @05:51PM

        by ikanreed (3164) Subscriber Badge on Thursday August 22 2019, @05:51PM (#883737) Journal

        I don't know about science itself, but the number of times I get into internet arguments and someone cites shitty (and sometimes good) social psychology or criminology papers from before 1990 is beyond my ability to count.

      • (Score: 3, Insightful) by maxwell demon on Thursday August 22 2019, @05:56PM

        by maxwell demon (1608) on Thursday August 22 2019, @05:56PM (#883740) Journal

        There's also the case of papers that gather very few citations for years, until someone finds an interesting use for them, and then the citations skyrocket.

        --
        The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 3, Interesting) by SunTzuWarmaster on Thursday August 22 2019, @08:09PM

      by SunTzuWarmaster (3971) on Thursday August 22 2019, @08:09PM (#883772)

      This. My most-cited research paper is the "we made this system, everyone is using it, and more people ought to" one from 2011. One of my least-cited papers is "oh yeah, and it works as designed!" from 2015. My most-read paper is "you can use this system for all kinds of things, here are some ideas" from 2013, which has relatively few citations. Notably, the last two papers cite the first one, because they build on a body of work. Naturally, no one outside the field cares. Somewhat fundamentally, most scientific work, even when successful, is irrelevant.