Submitted via IRC for SoyCow3196
Hundreds of extreme self-citing scientists revealed in new database
The world's most-cited researchers, according to newly released data, are a curiously eclectic bunch. Nobel laureates and eminent polymaths rub shoulders with less familiar names, such as Sundarapandian Vaidyanathan from Chennai in India. What leaps out about Vaidyanathan and hundreds of other researchers is that many of the citations to their work come from their own papers, or from those of their co-authors.
Vaidyanathan, a computer scientist at the Vel Tech R&D Institute of Technology, a privately run institute, is an extreme example: he had received 94% of his citations from himself or his co-authors up to 2017, according to a study in PLoS Biology this month. He is not alone. The data set, which lists around 100,000 researchers, shows that at least 250 scientists have amassed more than 50% of their citations from themselves or their co-authors, while the median self-citation rate is 12.7%.
The study could help to flag potential extreme self-promoters, and possibly 'citation farms', in which clusters of scientists massively cite each other, say the researchers. "I think that self-citation farms are far more common than we believe," says John Ioannidis, a physician at Stanford University in California who specializes in meta-science — the study of how science is done — and who led the work. "Those with greater than 25% self-citation are not necessarily engaging in unethical behaviour, but closer scrutiny may be needed," he says.
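The self-citation rate the study reports is conceptually simple: the fraction of a researcher's incoming citations that come from papers authored by the researcher or their co-authors. A minimal sketch of that calculation, assuming hypothetical inputs (a researcher's name, a list of co-author names, and one author set per citing paper — none of these structures come from the PLoS Biology study itself):

```python
def self_citation_rate(researcher, coauthors, citing_papers):
    """Fraction of citations coming from the researcher or their co-authors.

    researcher    -- name of the cited researcher
    coauthors     -- iterable of the researcher's co-author names
    citing_papers -- one set of author names per citation received
    """
    own_circle = {researcher} | set(coauthors)
    if not citing_papers:
        return 0.0
    # A citation counts as a self/co-author citation if the citing
    # paper shares at least one author with the researcher's circle.
    self_cites = sum(1 for authors in citing_papers if own_circle & authors)
    return self_cites / len(citing_papers)
```

For example, a researcher "A" with co-author "B" who is cited by three papers authored by {A}, {B, C}, and {D} would have a rate of 2/3 — two of the three citing papers include someone from the circle. Real bibliometric pipelines must additionally disambiguate author names, which is a much harder problem than this sketch suggests.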
(Score: 2) by AthanasiusKircher on Thursday August 22 2019, @10:56PM
I don't think that's what GP was saying at all. GP was simply pointing out that the number of citations is an objective fact, i.e., the count exists. But it's often a bad metric because of all the subjectivity that goes into why and how things get cited (reasons which often have nothing to do with the quality of the research, as you rightly confirm). You actually were, I think, agreeing with GP's point.
I would, however, disagree slightly with GP's point, in that not all journals and other research sources are indexed in the citation count, and (as noted in a previous SN story [soylentnews.org]) the ways citations are counted often aren't robust enough to actually catch all occurrences of a paper title, making the counts potentially inaccurate or dependent on the exact software doing the counting. So yes, an objective count of citations theoretically exists, but we have flawed and sometimes subjective mechanisms for producing it (which sources count as "academic" enough to be included when they cite another academic source?).