How do we recognize quality scientific research: is what lies before us an imitation, an empty page of beautiful phrases, or real science? Scientists can usually tell, of course, but the question is how to demonstrate it objectively, and ideally in a way that lets the result be "touched", with something concrete standing behind it.
The question matters greatly, because the assessment of a study's significance determines a) its funding (whether the country or a grantor considers it worth supporting) and b) the reputation of individual researchers and of the entire institutions and teams working on projects.
In Ukraine, there has long been a debate about what yardstick should be used to measure and evaluate scientific research.
There are several points of view. The first: rely mainly on formal indicators, including quantitative scientometric ones, and on publication requirements for defending dissertations or obtaining funding. The opposite view is that everything should be decided by experts, and publication requirements should be abandoned altogether. But we understand that peer review, where the human factor is present, depends not only on the experts' professionalism but also on quite subjective things, from personal relationships to resistance to corruption. Therefore, without special measures to ensure independence and impartiality, adequate assessment is impossible.
And how is this solved elsewhere in the world?
Recently, an international assembly of stakeholders, which included more than 350 universities and organizations from more than 40 countries, prepared and adopted the Agreement on Reforming Research Assessment (hereinafter, the Agreement). Ukraine was represented at this assembly by Lutsk National Technical University, the Council of Young Scientists of the Ministry of Education and Science, and the public organization "Innovative University", co-founded by the Council of Rectors, and "Growth Pole".
What do they want to change in the evaluation of scientific research?
I will outline the main ideas of the Agreement.
The first and main idea: the evaluation of scientific research should focus on quality, not on formal indicators.
The Agreement proposes abandoning the inappropriate use of journal impact factors, h-indices, and other indicators that do not reflect the quality of a particular study when evaluating research. At first glance, these ideas echo the demands of the opponents of Scopus, who want a return to the times when it was enough for a scientist to have a) articles in local journals without independent peer review and b) good relations with superiors. But this is only at first glance.
The Agreement proposes relying primarily on peer review. The term as used here means more than a literal "review by peers": it involves not just the evaluation of research by people considered experts in a given field, but also specific procedures for selecting independent experts. It is important that scientific research be evaluated not by officials but by scientists themselves. The document also calls for allocating adequate financial resources to reform the procedures for evaluating scientific research.
The second idea, without which the first is impossible, is zero tolerance for plagiarism.
If we in Ukraine genuinely want to introduce world standards for evaluating scientific results, then we must actually demonstrate absolute intolerance of plagiarism and other types of dishonesty. All plagiarists must be immediately removed from office and stripped of their titles. Among the Ukrainian signatories is the NGO "Innovative University", which is connected with the Council of Rectors, so it holds all the cards.
And let me remind you: comparative tables demonstrating improper textual borrowings are sufficient evidence of academic dishonesty. Demanding a court decision to establish the fact of plagiarism should be seen as manipulation and as support for dishonesty.
Without an effective fight against the dishonesty of specific individuals, all documents and declarations will be meaningless.
Third idea. Research evaluation should apply to all purposes: allocating research funding, accountability for the use of public funds, hiring, promotion and remuneration of researchers, decisions on setting research priorities, and implementation of research development strategies; and it should apply to all decision-makers, including donors and organizations that award prizes for scientific activity.
The Agreement does not cover the evaluation of scientific activity at the country level, nor the evaluation of scientific institutions and universities outside of research itself; it establishes a general direction for reforming research evaluation while respecting the autonomy of individual organizations.
Fourth idea. The freedom of scientific research and the autonomy of research organizations must be ensured.
Several aspects are meant here: freedom in choosing research topics and methods; recognition of results that enrich knowledge and may influence the development of knowledge, science, and humanity (not necessarily immediately); and freedom for diversity in research activities.
The Agreement recommends avoiding the ranking of research organizations in the process of evaluating scientific research.
What will slow down real implementation of the Agreement in Ukraine?
There are three important things here.
The first is the war. It means that research for defense is funded and supported first and foremost. There are even calls to fire scientists who cannot quickly be turned into defense-technology engineers. And this affects the evaluation of scientific activity.
Of course, supporting the country's defense capability is extremely important. But there are a few "buts". If we think about the future and about recovery, we need to preserve scientific research, scientific personnel, and scientific schools. Besides, who will distinguish a genuine technology developer from an imitator who promises a miracle weapon by tomorrow, or who simply adds a few beautiful paragraphs about defense value to the topic of empty articles? The scientists capable of judging this are themselves devalued and in danger of being fired as unnecessary.
The second circumstance: imitation of science is thriving here.
The discussion about methods for evaluating scientific research has always been not just sharp but, I would say, heated, because interests often unrelated to real science show through it.
I'll explain with an example. Many years ago, one of the authors of ZN.UA proposed dividing people who are considered scientists in Ukraine into two categories, "singers" and "globalists". A person's belonging to one category or the other does not depend on having publications in indexed journals, on h-index values, or on their field.
"Singers" are people who sincerely believe that writing any text, even a pseudoscientific and manipulative one, and publishing it in a collection that calls itself scientific, is science.
"Globalists" are people integrated, in one way or another, into world science, who regard scientific publications exclusively as presentations of the results of their own research.
In my opinion, there is a third category: "buyers", who do not even imitate science but simply buy a degree or receive it as a gift, often without even knowing what the publication requirements actually are.
Obviously, the system for evaluating scientific research must be changed so that the above-mentioned types of "scientists" have not a single chance of being called scientists. Not everyone will like this, and there will be resistance.
The third circumstance is playing games with formal requirements that imitate real change. Here is a recent example. The Ministry of Education and Science (MES) made a small change to the rules for publishing scientific results required for dissertation defenses. "The Ministry of Education has canceled the requirement for publications in Scopus!" social networks reacted.
In fact, no significant change occurred; the rules were merely relaxed: the requirement of five publications in a Scopus-indexed journal for defending a doctoral dissertation was postponed for a certain period, while the previous requirement of three such publications remains.
The MES's motivation is telling: there is a war, research is difficult or impossible, so let's relax the requirements. But who needs imitation articles and dissertations without research, and why? Did those who manage science in our country mean to say that doctoral dissertations without research suit them perfectly well?
What to do?
Awareness has been growing around the world that the principles of assessment need to change. Our institutions and organizations, having signed the Agreement, have taken on that obligation too. The signatories agreed to exchange reform experience. But this does not mean we should blindly copy someone else's experience, if only because, under conditions of mass tolerance for the dishonesty and incompetence of the officials on whom science is completely dependent, such copying can lead to devastating consequences.
Expert evaluation of scientific research is possible under several conditions. First of all, academic integrity must be genuinely guaranteed. It is also necessary to develop procedures for independent expert evaluation that avoid conflicts of interest; different assessment procedures for "fast" and "slow" fields ("fast" fields are those where new knowledge changes quickly, and new results appear frequently but remain relevant for a relatively short period); and training in the analysis of scientometric indicators, so that it does not come down to a mere comparison of quantitative data.
To implement the new principles, we will have to think and learn, and not only Ukrainians.
Other articles by Irina Egorchenko are available at the link.