Social science papers that failed to replicate racked up 153 more citations, on average, than papers that replicated successfully. [sciencemag.org]
This latest result is “pretty damning,” says University of Maryland, College Park, cognitive scientist Michael Dougherty, who was not involved with the research. “Citation counts have long been treated as a proxy for research quality,” he says, so the finding that less reliable research is cited more points to a “fundamental problem” with how such work is evaluated.
...
Citation counts on Google Scholar were significantly higher for the papers that failed to replicate, they report today in Science Advances, with an average of 16 extra citations per year. That's a big number, Serra-Garcia and Gneezy say: papers in high-impact journals over the same period amassed about 40 citations per year on average.
And when the researchers examined citations in papers published after the landmark replication projects, they found that the citing papers rarely acknowledged the failed replication, mentioning it only 12% of the time.
Well, nobody likes a Debbie Downer, do they?