Letters from your editor
An Open Letter to the owners of the Impact Factor
Ladies and gentlemen,
I am writing this open letter to you, the people who produce the Impact Factor reports, in goodwill and because I believe you can improve it.
I first learned about citation as an “indication of scientific impact” in the short articles that Eugene Garfield used to publish in the Current Contents booklets back in the 1980s, when I was a student. There he mentioned the idea of “core journals”, i.e., those that were allegedly more important because they were more cited; this idea may be one answer to the question: “How can I, as a librarian who knows nothing about Physics or Chemistry, decide which journal subscriptions to buy with my limited budget?” (Marder, Kettenmann, & Grillner, 2010).
As I advanced in my career, I learned that citations were also being used to calculate an “Impact Factor”, and saw how that factor was used to decide who got job positions, which research teams got funding, and which institutions received more financial support from the authorities (as denounced by Salazar-Vallejo & Carrera-Parra, 1998). This practice is wrong from several points of view, but bureaucrats continue to follow it, often in ignorance of the methodological and ethical problems involved (Monge-Nájera, 2002; Schekman, 2013; Ferguson, 2016).
When I became an editor, authors started asking me: “What is the impact factor of your journal?”, and university authorities asked me to increase my journal’s factor because it would be taken into account when deciding which journals would continue to be published. All of this made sense: they need to know which articles actually have some impact on the advancement of science and technology; it would be unethical to waste funds on publications that no one reads, and when you find them cited you assume they were actually read and used.
But even since my student days I felt there was something wrong with this approach. The first realization came when, as a teacher, I got a letter from the university library asking which journals I would need the next year, so they could subscribe. Why were they not using the list of top cited journals to decide? Then I had my eureka moment: the journals I needed for my work as a tropical biologist had little to do with the top cited journals; just the opposite, being tropical, they were mostly absent from the list of American and European journals that top the Impact Factor charts. But our librarians were too smart to fall for that “just buy the most cited journals” sales trick. More recently, the position defended by Garfield has been called an “obsolete principle of selecting journals to create a fake-representative sample of ‘journals that matter’” (Fernández-Llimos, 2016).
The second realization came later, when a friend showed me more than 80 citations of the Revista de Biología Tropical in a herpetology book and commented: “None of these are counted for the Impact Factor; that is, 80 citations give you an Impact Factor of zero!” Finally, that uncertain feeling became a certainty: to count citations, you must see them, and the Impact Factor, as currently calculated, can be diagnosed as “legally blind”. Others realized that the index was not only wrong in itself, but seriously misused by bureaucrats and even by some scientists, a problem that continues to the present (e.g., Salazar-Vallejo & Carrera-Parra, 1998; Curry, 2012; Alberts, 2013; Nature Materials, 2013; Bohannon, 2016). Furthermore, the Science Citation Index (SCI) and the Impact Factor are relics of older times; today, being indexed in Google Scholar is what matters most for a paper to be visible, read and cited (e.g., Syme, 2013, and my own experience).
Our research has found that most citations are not counted by the Impact Factor, even in journals covered by the SCI, because the vast majority of citations occur after you stop counting (e.g. Monge-Nájera and Ho, 2012, 2015, 2017a, 2017b).
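To make the arithmetic behind that blindness explicit, recall the standard definition of the two-year Journal Impact Factor for a year y:

\[
\mathrm{IF}_{y} = \frac{C_{y}(y-1) + C_{y}(y-2)}{N_{y-1} + N_{y-2}}
\]

where \(C_{y}(x)\) is the number of citations received in year \(y\), from indexed sources only, by items the journal published in year \(x\), and \(N_{x}\) is the number of citable items the journal published in year \(x\). A citation that appears in a book, in a non-indexed journal, or more than two years after publication simply does not exist for this formula; that is how 80 real citations can add exactly zero to a journal’s factor.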
Please let me remind you (if I remember his articles correctly after more than 30 years) that Garfield’s original idea was to “computerize” all the scientific literature in the world. His goal was abandoned when the task proved huge and the people in charge had to settle for a fraction of the journals; instead of a statistically valid sample, they chose a commercially viable list of journals from the richest markets (i.e., the USA and Western Europe). These journals still dominate the Impact Factor reports (http://scientific.thomsonreuters.com/imgblast/JCRFullCovlist-2016.pdf).
I have personally seen how, in recent years, commercial reasons led for-profit science companies to expand into the regions they disdained in the past, like Latin America, because growth in the USA and Europe has reached a plateau. This is an opportunity for you to cure the Impact Factor of its blindness by doing two things:
• Add more journals to your database, starting with those in Latin America and other tropical countries: this is of key importance because the bulk of biodiversity is precisely in those countries whose journals you do not cover at all. One example is enough to show the seriousness of the problem: Central America produces 800 indexed journals (Latindex.org), yet you cover only one of them (the Revista de Biología Tropical), missing thousands of citations every year.
• Start counting all citations, not only those made within two years of an article’s original publication; again to illustrate, over 90% of citations of Central American articles occur after you stop counting, which is like ignoring the impact of Mendel’s 1865 work after 1867 (it had no citation impact until 1900, and most of its effect on science took place afterwards: Carlson, 2004); a short sketch after this list shows how much a two-year window can discard.
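Here is a minimal sketch in Python of the point made in the second item. The year-by-year citation counts are hypothetical, chosen only to mirror the pattern we have measured for Central American articles, where over 90% of citations arrive after the window closes; they are not real data for any article:

    # Illustrative only: hypothetical citation counts for one article,
    # keyed by year after publication (not real data).
    citations_by_year = {1: 1, 2: 2, 3: 5, 4: 9, 5: 14, 10: 20, 20: 12}

    WINDOW = 2  # the Impact Factor counts citations only in years 1 and 2

    total = sum(citations_by_year.values())
    counted = sum(c for year, c in citations_by_year.items() if year <= WINDOW)

    print(f"Counted by the Impact Factor window: {counted}/{total} "
          f"({100 * counted / total:.0f}%)")
    print(f"Invisible to the Impact Factor: {total - counted}/{total} "
          f"({100 * (total - counted) / total:.0f}%)")

With these hypothetical numbers the window sees 3 of 63 citations (5%) and discards 60 of them (95%), the same order of magnitude as the real pattern described above.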
I believe that this radical correction of the Impact Factor’s fatal flaws is possible because the costs of digital analysis have plummeted and the capacity for computerized study has grown exponentially. I suspect that your limitation in journals and years covered is a case of cultural inertia from the old days, when computers could not do the job. I have used your database for some basic analyses of scientific output: I value the data you have made available to those of us interested in scientometrics, and I sincerely hope you can leave that inertia behind. There are new players in this game, and some, such as Google Scholar or SciELO, could pick up the torch; but independently of who does it, human knowledge will benefit when we reach the urgent goal of a fair and reliable Impact Factor.
Julián Monge-Nájera
Editor-in-Chief
REFERENCES
Alberts, B. (2013). Impact factor distortions. Science, 340(6134), 787. doi: 10.1126/science.1240319
Bohannon, J. (2016). Hate journal impact factors? New study gives you one more reason. Science. doi: 10.1126/science.aag0643
Carlson, E. A. (Ed.). (2004). Mendel’s legacy: The origin of classical genetics. Cold Spring Harbor, New York: Cold Spring Harbor Laboratory Press.
Curry, S. (2012). Sick of Impact Factors. London: Occam’s Typewriter. Retrieved from http://occamstypewriter.org/scurry/2012/08/13/sick-of-impact-factors
Ferguson, M. W. J. (2016). Treat metrics only as surrogates. Nature, 538, 453-455. doi: 10.1038/538453a
Fernández-Llimos, F. (2016). Bradford’s law, the long tail principle, and transparency in Journal Impact Factor calculations. Pharmacy Practice, 14(3), 842.
Marder, E., Kettenmann, H., & Grillner, S. (2010). Impacting our young. PNAS, 107(50), 21233. doi: 10.1073/pnas.1016516107
Monge-Nájera, J. (2002). How to be a tropical scientist. Revista de Biología Tropical, 50(3-4).
Monge-Nájera, J., & Ho, Y. S. (2012). Costa Rica publications in the Science Citation Index Expanded: A bibliometric analysis for 1981-2010. Revista de Biología Tropical, 60(4), 1649-1661. doi: 10.15517/rbt.v60i4.2158
Monge-Nájera, J., & Ho, Y. S. (2015). Bibliometry of Panama publications in the Science Citation Index Expanded: publication type, language, fields, authors and institutions. Revista de Biología Tropical, 63(4), 1255-1266. doi: 10.15517/rbt.v63i4.21112
Monge-Nájera, J., & Ho, Y. S. (2017a). Bibliometrics of Nicaraguan publications in the Science Citation Index Expanded. Revista de Biología Tropical, 65(2), 643-655.
Monge-Nájera, J., & Ho, Y. S. (2017b). Honduras publications in the Science Citation Index Expanded: institutions, fields and authors. Revista de Biología Tropical, 65(2), 657-668.
Nature Materials. (2013). Editorial: Beware the Impact Factor. Nature Materials, 12, 89.
Salazar-Vallejo, S., & Carrera-Parra, L. (1998). Taxonomía biológica, Factor de Impacto y evaluación curricular para el siglo XXI. Interciencia, 23(5), 293-298.
Schekman, R. (2013, December 9). How journals like Nature, Cell and Science are damaging science. The Guardian.
Syme, C. (2013). I hate missing crucial papers! Retrieved from http://www.caitlinsyme.com/news-and-updates/i-hate-missing-crucial-papers