Not everything that can be counted counts, and not everything that counts can be counted. – William Bruce Cameron
Australian universities have been in the media in recent weeks for the dubious treatment of overseas students and the problem of plagiarism. But they are in serious trouble for another reason: their reliance on "bibliometrics" for major decision making.
Two international companies, Thomson Reuters and Elsevier, rate the apparent prestige of the journals in which academics' publications appear, and the frequency with which other authors refer to them, i.e. their citations. Two of the key summary measures are the Hirsch index (or h-index), the largest number h such that an author has h papers each cited at least h times, and the journal impact factor (JIF), which is claimed to reflect the importance of journals.
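As an illustration of how mechanical these metrics are (a sketch, not from the article), the h-index reduces an entire career to one number computed from per-paper citation counts:

```python
def h_index(citations):
    """Return the largest h such that h papers each have at least h citations."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:  # this paper still clears the threshold
            h = rank
        else:
            break
    return h

# e.g. papers cited [10, 8, 5, 4, 3] times give an h-index of 4:
# four papers have at least 4 citations each, but not five with at least 5.
print(h_index([10, 8, 5, 4, 3]))
```

Note what the number ignores: a single landmark paper with thousands of citations moves the h-index no more than a modestly cited one, which is part of the critique the article raises.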
Ratings such as these dominate decisions on academic promotions, tenure, grant funding and the status of departments and universities. They have been widely adopted by Australian universities because of perceived benefits of speed, cost-effectiveness and alleged objectivity. They underpin the government's Excellence in Research for Australia (ERA).
This is of immediate national interest because of the links between these metrics, academic rankings and government funding of science and the universities, and because of the potential harm to careers and to the very way research is carried out.
Internationally, opposition has taken the form of the San Francisco Declaration on Research Assessment (DORA) [PDF]. Institutions are urged to acknowledge that the scientific content of a paper is more important than publication metrics or the identity of the journal in which it was published.
Content rather than metrics is what ought to count.
http://theconversation.com/our-obsession-with-metrics-is-corrupting-science-39378
(Score: 2) by VLM on Tuesday June 02 2015, @01:09PM
Institutions are urged to acknowledge that the scientific content of a paper is more important than publication metrics
So what's the new improved solution to "We graduate 100 PhDs into a job market of stable or decreasing size with only 5 academic job slots"?
The existing system of metrics seems to work pretty well at deciding which 95 of the 100 aren't getting jobs in "their" field. If it ain't broken you aren't going to get traction to fix it.
Not just at the postdoc position level, but prof jobs, and tenure level. And probably higher level like those administrators who do nothing but collect huge salaries, I'm guessing they get a few applicants for each position LOL.
An excellent analogy is real estate. That house in that neighborhood in that school district... well I'm sure she's the best kindergarten teacher ever, but there's no getting past the metric result that the district graduation rate is only 60%, so I end up in the burbs where maybe she's not the best kindergarten teacher ever, but the district graduation rate rounds to 99%. It's not like the world is running out of land or houses or school districts.