
SCIENTOMETRIC AND EXPERT ASSESSMENT: DISCUSSION ISSUES

T.M. KARMADONOVA
https://orcid.org/0000-0002-3384-6067
Dobrov Institute for Scientific and Technological Potential and Science History Studies of the NAS of Ukraine

Nauka naukozn. 2022, 2(116): 65–84
https://doi.org/10.15407/sofs2022.02.065

Section: Scientometrics
Language: Ukrainian
Abstract: The article is devoted to topical issues of research performance evaluation. Applications of scientometric and expert assessment methods in today’s research system and the problems related to them are analyzed. The texts of the San Francisco Declaration on Research Assessment and the Leiden Manifesto for research metrics were used as documents at the focus of scientific and public attention. It is emphasized that one of their main conclusions concerns the inappropriateness of overestimating the significance of quantitative indicators and the need to take account of the multidimensionality of research objectives.

Advantages and drawbacks of scientometric evaluation of research performance are analyzed. The potentials and limitations of citation indices, the Hirsch index, the impact factor and other indices in assessing the performance of researchers and research institutions and in ranking scientific journals are shown.
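
For clarity, the two indices most discussed here have simple formal definitions: a researcher has Hirsch index h if h of his or her publications have each been cited at least h times, while a journal’s impact factor for a given year is the number of citations received in that year by items the journal published over the two preceding years, divided by the number of citable items published over the same period. A minimal sketch of the h-index computation (in Python, with invented citation counts, not data from the article) may look as follows:

    # Minimal illustrative sketch: computing the Hirsch index (h-index)
    # from per-publication citation counts. The counts are invented examples.
    def h_index(citations):
        # Rank papers by citation count in descending order; h is the largest
        # rank at which the paper still has at least that many citations.
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    print(h_index([10, 8, 5, 4, 3]))  # prints 4: four papers have at least 4 citations each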

An interpretation of “expert opinion” is given, and the main methods of expert assessment are discussed: the Delphi method, brainstorming, and synectics. Possible errors and shortcomings involved in their application are highlighted. Models of expert assessment are reviewed: single-blind peer review, double-blind peer review, and open peer review. It is shown that Internet technologies have become determinants of the transition from the model of closed pre-publication review to the model of open post-publication online review.

Based on S. Funtowicz and J. Ravetz’s concept of post-normal science, the concepts of “extended expert community” and “extended expert assessment” are elaborated, with emphasis on the fact that they reflect public participation in science and technology assessment. It is argued that extended expert assessment has to be arranged in a way that avoids ideological, political or religious bias. The rise of the new phenomenon of the “beta reader” in the evaluation of research texts is analyzed, and several online platforms for “beta readers” are discussed.

It is concluded that qualitative and quantitative methods of assessment should be used as complementary analytical tools.

Keywords: expert assessment, citation index, Hirsch index, impact factor, expert opinion, peer review, expert review, extended expert assessment, “beta reader”.

References

  1. Horovy, V. (2015). Criteria for the quality of scientific achievements in the context of the security of national interests. Visn. Nac. Akad. Nauk Ukr., 6, 74–80 [in Ukrainian].
  2. Kirichenko, I.V., & Shelyubskaya, N.V. (2019). The system for assessing the quality of scientific research in European countries. University Management: Practice and Analysis, 23(4), 9–20. https://doi.org/10.15826/umpa.2019.04.025 [in Russian].
  3. Kostenko, L., & Simonenko, T. (2016). Scientometrics: from numerology to the Leiden Manifesto. Scientific Periodicals of Vernadsky National Library of Ukraine, 43, 285–295. https://doi.org/10.15407/np.43.285 [in Ukrainian].
  4. Malitsky, B., Ribachuk, V., Koretsky, A., & Popovich, A. (2013). Scientometrics: new functions and problems of adequacy. Nauka innov., 1(119), 11–17 [in Russian].
  5. Didenko, Yu.V., & Radchenko, A.I. (2017). Publication activity as a way of scientific communication and pursuit of ratings. Visn. Nac. Akad. Nauk Ukr., 9, 82–98. https://doi.org/10.15407/visn2017.09.082 [in Ukrainian].
  6. Rybachuk, V.P. (2013). Bibliometric portrait of academician Vladimir Ivanovich Vernadsky: fame in the world. Libraries of National Academies of Sciences: Problems of Functioning, Development Trends, 11, 22–33 [in Russian].
  7. Kavunenko, L.P., Khorevin, V.I., Kostrytsya, O.P., & Levchenko, O.G. (2010). Scientometric monitoring of scientific periodicals in the socio-humanitarian sphere of Ukraine. Science of Ukraine in the Global Information Space, 3, 71–79 [in Ukrainian].
  8. Malitsky, B.A. (2017). Who and how should evaluate the scientific results of a scientist. Science and Science of Science, 3, 34–53. https://doi.org/10.15407/sofs2017.03.034 [in Russian].
  9. Moskaleva, O.V. (2013). The use of scientometric indicators for the evaluation of scientific activity. Science Policy Research, 13, 85–109 [in Russian].
  10. Vyalkov, A.I., & Glukhova, E.A. (2013). Evaluation of the quality of scientific research activities of a medical organization using scientometric indicators. Healthcare of the Russian Federation, 3, 3–6 [in Russian].
  11. Kostenko, L.I., Simonenko, T.V., Grachev, O.A., & Rybachuk, V.P. (2017). Bibliometrics of domestic science: opportunities and limitations of the application of the Google Scholar web system. Science and Science of Science, 3, 87–96. https://doi.org/10.15407/sofs2017.03.087 [in Russian].
  12. Aleksinska, M. (2016). Beware the index! Why international ratings cannot properly assess the Ukrainian labor market. Economic policy. URL: https://voxukraine.org/osterigaitesreityngiv-ua/ (last accessed: 21.01.2022) [in Ukrainian].
  13. Balatsky, E.V., Ekimova, N.A., & Tretyakova, O.V. (2021). Methods for assessing the quality of scientific economic journals. Journal of Institutional Studies, 13(2), 27–52. https://doi.org/10.17835/2076-6297.2021.13.2.027-052 [in Russian].
  14. Obolkina, S.V., & Popova, N.G. (2019). Expertise vs peer review: conceptual foundations of competencies. Sociology of Science and Technology, 4, 38–50. https://doi.org/10.24411/2079-0910-2019-14003 [in Russian].
  15. Maslennikov, E.V. (2017). The possibility of using expert knowledge as a source of concepts for the development of organizations. Bulletin of the Moscow University, 2, 229–249 [in Russian]. https://doi.org/10.24290/1029-3736-2017-23-2-229-249
  16. Zhuravleva, V.A. (2012). Expert assessment method: historical explication and modern model. Bulletin of Peoples’ Friendship University of Russia. Series: Sociology, 2, 28–38 [in Russian].
  17. Pavliuk, K.V. (2019). Problems of evaluation of scientific activity. RFI Scientific Papers, 4, 5–19. https://doi.org/10.33763/npndfi2019.04.005 [in Ukrainian].
  18. Zhenchenko, M.I. (2016). New model of reviewing scientific publications in the digital environment. State and Regions. Series: Social Communications, 1, 169–172 [in Ukrainian].
  19. Gerasimenko, A.G., Mazaraki, N.A., & Duginets, G.V. (2019). Reviewing as a tool for promoting social and economic achievements. Economic Space, 141, 25–35. https://doi.org/10.30838/P.ES.2224.100119.25.343 [in Ukrainian].
  20. Funtowicz, S.O., & Ravetz, J.R. (1991). A new scientific methodology for global environmental issues. In Ecological Economics: The Science and Management of Sustainability. New York: Columbia Univ. Press.
  21. Bucchi, M., & Trench, B. (2018). Handbook of Public Communication of Science and Technology. Trans. from English. Moscow: Alpina non-fiction [in Russian].
  22. Jenkins, H. (2006). Convergence Culture: Where Old and New Media Collide. NYU Press.
  23. Hellekson, K., & Busse, K. (2006). Fan Fiction and Fan Communities in the Age of the Internet: New Essays. McFarland.
  24. Grigoriev, V.E. (2018). Sociology of science. Moscow: Prospekt [in Russian].
  25. Bornmann, L., & Daniel, H.-D. (2005). Does the h-index for ranking of scientists really work? Scientometrics, 65(3), 391–392. https://doi.org/10.1007/s11192-005-0281-4
  26. Shostak, A.V., Lukach, V.S., Boris, M.M., & Kupchuk, I.M. (2012). Hirsch index and impact factor as a tool for scientometrics at the previous university. Collection of Scientific Works of the Vinnitsa National Agrarian University, 11(65), 375–380 [in Ukrainian].
  27. Chaikovskiy, Yu., Silkina, Yu., & Pototska, O. (2013). Scientometric bases and their quantitative indicators (Part I. Comparative characteristics of scientometric bases). Visn. Nac. Akad. Nauk Ukr., 8, 90–94 [in Ukrainian].
  28. Popovich, O. (2020). Scientometric ignorance (the mania of bureaucracies in implementing digital assessment of R&D). Granite of Science. URL: https://un-sci.com/ru/2020/02/06/naukometrichne-neviglastvo-shhodo-maniї-byurokratij-zaprovaditi-czifrove-oczinyuvannya-nauki/ (last accessed: 21.01.2022) [in Ukrainian].
  29. Currie, R.R., & Pandher, G.S. (2020). Finance journal rankings: Active scholar assessment revisited. Journal of Banking & Finance, 111, 1–14. https://doi.org/10.1016/j.jbankfin.2019.105717
  30. Kelly, J., Sadeghieh, T., & Adeli, K. (2014). Peer Review in Scientific Publications: Benefits, Critiques, & A Survival Guide. The Journal of the International Federation of Clinical Chemistry and Laboratory Medicine, 25(3), 227–243. URL: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4975196/ (last accessed: 21.01.2022).
  31. Funtowicz, S., & Ravetz, J. (1993). Science for the post-normal age. Futures, 25(7), 739–755. https://doi.org/10.1016/0016-3287(93)90022-L
