
posted by Fnord666 on Thursday August 22 2019, @02:07PM
from the you-scratch-my-back-and-I'll-scratch-yours dept.

Submitted via IRC for SoyCow3196

Hundreds of extreme self-citing scientists revealed in new database

The world's most-cited researchers, according to newly released data, are a curiously eclectic bunch. Nobel laureates and eminent polymaths rub shoulders with less familiar names, such as Sundarapandian Vaidyanathan from Chennai in India. What leaps out about Vaidyanathan and hundreds of other researchers is that many of the citations to their work come from their own papers, or from those of their co-authors.

Vaidyanathan, a computer scientist at the Vel Tech R&D Institute of Technology, a privately run institute, is an extreme example: he has received 94% of his citations from himself or his co-authors up to 2017, according to a study in PLoS Biology this month. He is not alone. The data set, which lists around 100,000 researchers, shows that at least 250 scientists have amassed more than 50% of their citations from themselves or their co-authors, while the median self-citation rate is 12.7%.

The study could help to flag potential extreme self-promoters, and possibly 'citation farms', in which clusters of scientists massively cite each other, say the researchers. "I think that self-citation farms are far more common than we believe," says John Ioannidis, a physician at Stanford University in California who specializes in meta-science — the study of how science is done — and who led the work. "Those with greater than 25% self-citation are not necessarily engaging in unethical behaviour, but closer scrutiny may be needed," he says.
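For a concrete sense of the metric involved: a self-citation rate of this kind can be computed as the fraction of citations to a researcher's papers that come from papers written by the researcher or any of their co-authors. Below is a minimal Python sketch; the data model and function are hypothetical illustrations, not the PLoS Biology study's actual code or data:

    def self_citation_rate(researcher, authorship, citations):
        """Fraction of citations to `researcher`'s papers coming from
        the researcher or their co-authors.

        authorship: dict mapping paper_id -> set of author names (assumed format)
        citations:  iterable of (citing_paper_id, cited_paper_id) pairs
        """
        own_papers = {p for p, authors in authorship.items() if researcher in authors}
        # The co-author network: everyone who shares a paper with the
        # researcher, including the researcher themself.
        network = set().union(*(authorship[p] for p in own_papers)) if own_papers else set()
        total = from_network = 0
        for citing, cited in citations:
            if cited in own_papers:
                total += 1
                if authorship.get(citing, set()) & network:
                    from_network += 1
        return from_network / total if total else 0.0

    # Toy example: P1 is cited once by the researcher's own P2 and once by
    # an unrelated P3, giving a self/co-author citation rate of 50%.
    authorship = {"P1": {"vaidyanathan", "colleague"},
                  "P2": {"vaidyanathan"},
                  "P3": {"someone_else"}}
    citations = [("P2", "P1"), ("P3", "P1")]
    print(self_citation_rate("vaidyanathan", authorship, citations))  # 0.5

On this measure, the reported 94% means that only about 6% of the citations to Vaidyanathan's work came from outside his own co-author network.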


Original Submission

 
  • (Score: 4, Insightful) by Thexalon on Thursday August 22 2019, @02:19PM (3 children)

    by Thexalon (636) on Thursday August 22 2019, @02:19PM (#883622)

    There's no objective way of determining the performance of academics. Tracking citations has been one way of attempting this, but like all other attempts to track people via metrics (e.g. evaluating coders via lines of code written), it doesn't work because academics are smart enough to figure out how to make the metric say what they want.

    One reason there's no objective way to measure performance is that genuinely useful and creative research might not make a huge splash right away but become important slowly over time, while well-hyped but useless research involving some celebrity in the field might attract a lot of attention and ultimately prove unimportant. That's one of many reasons neither human institutions nor metrics have been consistently able to tell the difference between an actual crackpot and a shunned researcher who happens to be right but won't be corroborated for decades or even centuries.

    --
    The only thing that stops a bad guy with a compiler is a good guy with a compiler.
  • (Score: 1) by hellcat on Thursday August 22 2019, @09:31PM (2 children)

    by hellcat (2832) Subscriber Badge on Thursday August 22 2019, @09:31PM (#883789) Homepage

    Citations are objective, by definition: I can count them; they exist.
    Perhaps you mean that they are not directly relevant to an assumed purpose of research?
    In that case you are absolutely correct.
    The problem is that we need to:
    1) define the purpose ahead of time, and
    2) try to develop objective measures that do a better job.

    • (Score: 3, Interesting) by Thexalon on Thursday August 22 2019, @10:32PM (1 child)

      by Thexalon (636) on Thursday August 22 2019, @10:32PM (#883806)

      Citations aren't objective. Far from it. For instance, let's say that there are 4 studies that happened around the same time and more-or-less agree with each other as to their results. Which one gets cited when somebody needs to reference that result? There are a lot of factors that can go into this that have absolutely nothing to do with the quality or correctness of the work, like:
      1. One of the authors of one of the studies was a buddy of an author on the new paper, and they knew that citing their buddy was likely to result in a cross-citation from said buddy. Or one of the authors of one of the studies is on a hiring or tenure committee and the new author is trying to butter them up.
      2. They were the first result in JSTOR due to factors like the name of the journal they were published in, the first author's last name (seriously, someone named "Aaronson" can get more attention than someone named "Zbotovic" solely because results get displayed in alphabetical order), or publication date.
      3. The new author happened to meet up with one of the authors of one of the studies at a conference and had a beer with them. The authors of the other papers were at a different prestigious conference held the next month.

      I think you're making the mistake of assuming that taking subjective things and counting them will necessarily lead to objective results.

      --
      The only thing that stops a bad guy with a compiler is a good guy with a compiler.
      • (Score: 2) by AthanasiusKircher on Thursday August 22 2019, @10:56PM

        by AthanasiusKircher (5291) on Thursday August 22 2019, @10:56PM (#883821) Journal

        I think you're making the mistake of assuming that taking subjective things and counting them will necessarily lead to objective results.

        I don't think that's what GP was saying at all. GP was simply pointing out that the number of citations is an objective fact, i.e., the count exists. But it's often a bad metric because of all the subjectivity that goes into why and how things get cited (which often have nothing to do with quality of research, as you rightly confirm). You actually were, I think, agreeing with GP's point.

        I would, however, disagree slightly with GP's point: not all journals and other research sources are indexed in the citation count, and (as noted in a previous SN story [soylentnews.org]) the ways citations are counted often aren't robust enough to catch every occurrence of a paper title, which makes the counts potentially inaccurate or dependent on the exact software doing the counting. So yes, in theory an objective count of citations exists, but the mechanisms we have for producing it are flawed and sometimes subjective (which sources count as "academic" enough for their citations to be tallied?).
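        To make the counting problem concrete: whether two slightly different renderings of a title are tallied as the same paper depends entirely on the matching software. Here is a toy Python sketch with an arbitrary normalization and similarity threshold; no real citation index necessarily works this way:

            from difflib import SequenceMatcher

            def normalize(title):
                # Drop case and punctuation and collapse whitespace -- naive
                # cleanup a citation-counting tool might or might not apply.
                kept = "".join(ch.lower() if ch.isalnum() else " " for ch in title)
                return " ".join(kept.split())

            def probably_same_paper(a, b, threshold=0.9):
                # Whether these "count" as one paper depends on the threshold,
                # which is exactly why different tools report different totals.
                return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

            t1 = "Hundreds of extreme self-citing scientists revealed in new database"
            t2 = "Hundreds of Extreme Self-Citing Scientists Revealed in a New Database"
            print(probably_same_paper(t1, t2))         # True  (ratio ~0.99)
            print(probably_same_paper(t1, t2, 0.999))  # False under a stricter matcher

        Two tools differing only in that threshold would already report different citation counts for the same corpus.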