The world has embraced systematic statistical evaluation, and academia is no exception: research careers are increasingly shaped by the "publish or perish" (POP) mindset.
Keep in mind that the misuse of metrics can produce perverse effects.
"In this short video, Dr Kevin Lalor, School of Social Sciences and Law, Dublin Institute of Technology, highlights some of the limitations of use of journal impact data in the social sciences and humanities and all the types of publication that are missed"
Publication pressure has led academics to call for improved evaluation criteria and more appropriate indicators. They argue that the quality of the work should be weighed alongside the quantity produced.
ACUMEN is "a European research collaboration aimed at understanding the ways in which researchers are evaluated by their peers and by institutions, and at assessing how the science system can be improved and enhanced".
The Declaration on Research Assessment (DORA) "recognises the need to improve the ways in which the outputs of scholarly research are evaluated". Consult the list of organizations and individuals that have signed the declaration.
The LSE Impact Blog is "a hub for researchers, administrative staff, librarians, students, think tanks, government, and anyone else interested in maximising the impact of academic work in the social sciences and other disciplines".
The Metrics Toolkit is "a resource for researchers and evaluators that provides guidance for demonstrating and evaluating claims of research impact".
The ORCID Blog provides updates on ORCID activities.
Abbott, A., Cyranoski, D., Jones, N., Maher, B., Schiermeier, Q., & Van Noorden, R. (2010). Metrics: Do metrics matter? Nature, 465(7300), 860-862.
Barnes, C. (2015). The use of altmetrics as a tool for measuring research impact. Australian Academic & Research Libraries, 46(2), 121-134.
Biagioli, M., & Lippman, A. (Eds.). (2020). Gaming the metrics: Misconduct and manipulation in academic research. The MIT Press.
Brown, M. (2014). Is altmetrics an acceptable replacement for citation counts and the impact factor? The Serials Librarian, 67(1), 27-30.
Costas, R., & Bordons, M. (2008). Is g-index better than h-index? An exploratory study at the individual level. Scientometrics, 77(2), 267-288.
Erdt, M., Nagarajan, A., Sin, S. C. J., & Theng, Y. L. (2016). Altmetrics: an analysis of the state-of-the-art in measuring research impact on social media. Scientometrics, 109(2), 1117-1166.
Gingras, Y. (2015). Dérives et effets pervers de l’évaluation quantitative de la recherche: sur les mauvais usages de la bibliométrie. Revue internationale PME, 28(2), 7-14.
Gruber, T. (2014). Academic sell-out: how an obsession with metrics and rankings is damaging academia. Journal of Marketing for Higher Education, 24(2), 165-177.
Hirsch, J. E. (2005). An index to quantify an individual's scientific research output. Proceedings of the National Academy of Sciences, 102(46), 16569-16572.
Muller, J. Z. (2018). The tyranny of metrics. Princeton University Press.
University of Waterloo Working Group on Bibliometrics. (2016). Measuring Research Outputs through Bibliometrics.
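Two of the indicators discussed in the readings above have compact definitions: the h-index (Hirsch, 2005) is the largest h such that h of a researcher's papers each have at least h citations, while the g-index (compared with it by Costas & Bordons, 2008) is the largest g such that the g most-cited papers together have at least g² citations. As a minimal illustrative sketch (not code from any of the works cited; the function names and the sample citation counts are invented for illustration):

```python
def h_index(citations):
    """h-index: largest h such that h papers each have >= h citations."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break  # citations are sorted, so no later paper can
    return h

def g_index(citations):
    """g-index: largest g such that the top g papers total >= g**2 citations."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, start=1):
        total += c  # cumulative citations of the top `rank` papers
        if total >= rank * rank:
            g = rank
    return g

# Hypothetical record of five papers and their citation counts:
papers = [10, 8, 5, 4, 3]
print(h_index(papers))  # 4 (four papers have at least 4 citations each)
print(g_index(papers))  # 5 (the top five papers total 30 >= 25 citations)
```

Note that the g-index rewards a few highly cited papers more than the h-index does, which is precisely the kind of behavioural difference the comparative studies above examine.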