Impact Metrics

Controversies

The world has entered an era of systematic statistical evaluation, and academia is no exception. This is the Publish or Perish (POP) mindset.

Keep in mind that the misuse of metrics can produce effects such as:

  • Researchers split their work into smaller publications to increase their chances of being cited
  • Researchers focus on fashionable topics to attract more funding
  • Researchers can manipulate indicators through self-citation or by partnering with other researchers (see the sketch after this list)
  • A researcher who excels in their field, and whose research advances knowledge, can be excluded from a scientific department for being judged unproductive in terms of quantity
  • Metrics can be a way to evaluate visibility, but no metric can replace human judgment in assessing a researcher's work
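To make the self-citation point concrete, below is a minimal sketch in Python of how padding papers with self-citations can inflate the h-index (Hirsch, 2005), the largest number h such that h of a researcher's papers have at least h citations each. All citation counts are hypothetical, invented purely for illustration.

    # Minimal sketch: how self-citation can inflate the h-index.
    # The h-index (Hirsch, 2005) is the largest h such that a researcher
    # has h papers with at least h citations each. All counts below are
    # hypothetical.

    def h_index(citations):
        """Return the h-index for a list of per-paper citation counts."""
        h = 0
        for rank, count in enumerate(sorted(citations, reverse=True), start=1):
            if count < rank:
                break
            h = rank
        return h

    # Six papers, counting only citations from other researchers.
    external = [9, 7, 4, 4, 2, 1]
    print(h_index(external))                    # 4

    # Adding three self-citations to every paper pushes the borderline
    # papers over the threshold and raises the index.
    print(h_index([c + 3 for c in external]))   # 5

The same padding trick applies to most citation-based indicators, which is one reason the initiatives described below argue that metrics should inform, not replace, expert judgment.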

"In this short video, Dr Kevin Lalor, School of Social Sciences and Law, Dublin Institute of Technology, highlights some of the limitations of use of journal impact data in the social sciences and humanities and all the types of publication that are missed"

The publication pressure placed on academics has led them to call for better evaluation criteria and more appropriate indicators. Academics advocate weighing the quality of the work alongside the quantity produced.

  • The San Francisco Declaration on Research Assessment (DORA) was initiated at the 2012 Annual Meeting of the American Society for Cell Biology by a group of editors and publishers of scholarly journals, and "recognizes the need to improve the ways in which the outputs of scientific research are evaluated". It covers all scholarly disciplines and criticizes the growing reliance on journal-based metrics (see the poster presenting the objectives of DORA).
  • The Metric Tide report (UK, 2015) sets out 20 recommendations to help UK research institutions use "responsible metrics".

Other initiatives:

  • Snowball Metrics ("Standardized research metrics – by the sector for the sector") is a collaboration between Elsevier and several universities in the UK, the US, and Australia/New Zealand.

Links

ACUMEN is "a European research collaboration aimed at understanding the ways in which researchers are evaluated by their peers and by institutions, and at assessing how the science system can be improved and enhanced".

The Declaration on Research Assessment (DORA) "recognises the need to improve the ways in which the outputs of scholarly research are evaluated". Consult the list of organizations and individuals that have signed the declaration.

The LSE Impact Blog is "a hub for researchers, administrative staff, librarians, students, think tanks, government, and anyone else interested in maximising the impact of academic work in the social sciences and other disciplines".

The Metrics Toolkit is "a resource for researchers and evaluators that provides guidance for demonstrating and evaluating claims of research impact".

The ORCID Blog provides updates on ORCID activities.

Readings

Abbott, A., Cyranoski, D., Jones, N., Maher, B., Schiermeier, Q., & Van Noorden, R. (2010). Metrics: Do metrics matter? Nature News, 465(7300), 860-862.

Barnes, C. (2015). The use of altmetrics as a tool for measuring research impact. Australian Academic & Research Libraries, 46(2), 121-134.

Biagioli, M., & Lippman, A. (Eds.). (2020). Gaming the metrics: Misconduct and manipulation in academic research. Cambridge, MA: The MIT Press.

Brown, M. (2014). Is altmetrics an acceptable replacement for citation counts and the impact factor? The Serials Librarian, 67(1), 27-30.

Costas, R., & Bordons, M. (2008). Is g-index better than h-index? An exploratory study at the individual level. Scientometrics, 77(2), 267-288.

Erdt, M., Nagarajan, A., Sin, S. C. J., & Theng, Y. L. (2016). Altmetrics: an analysis of the state-of-the-art in measuring research impact on social media. Scientometrics, 109(2), 1117-1166.

Gingras, Y. (2015). Dérives et effets pervers de l'évaluation quantitative de la recherche: sur les mauvais usages de la bibliométrie [Excesses and perverse effects of the quantitative evaluation of research: On the misuses of bibliometrics]. Revue internationale PME, 28(2), 7-14.

Gruber, T. (2014). Academic sell-out: how an obsession with metrics and rankings is damaging academia. Journal of Marketing for Higher Education, 24(2), 165-177.

Hirsch, J. E. (2005). An index to quantify an individual's scientific research output. Proceedings of the National Academy of Sciences, 102(46), 16569-16572.

Muller, J. Z. (2018). The tyranny of metrics. Princeton, NJ: Princeton University Press.

University of Waterloo Working Group on Bibliometrics. (2016). Measuring Research Outputs through Bibliometrics.