e-Ciencias de la Información. Electronic ISSN: 1659-4142

OAI: https://revistas.ucr.ac.cr/index.php/eciencias/oai
Web of Science as a research tool and support for scientific activity: lights and shadows of its collections, products and indicators

Keywords

Web of Science
Bibliometrics
Bibliometric indicators
Scientific evaluation
Journals
Bibliographic databases

How to Cite

Gregorio Chaviano, O., López Mesa, E. K., & Limaymanta, C. H. (2022). Web of Science as a research tool and support for scientific activity: lights and shadows of its collections, products and indicators. E-Ciencias De La Información, 12(1). https://doi.org/10.15517/eci.v12i1.46660

Abstract

This article describes the collections, products, and bibliometric indicators of Web of Science, with particular emphasis on their usefulness and importance in scientific evaluation. It discusses the main limitations of the database's coverage and indicators, which affect analyses of scientific production in peripheral countries and regions and in fields of knowledge that are underrepresented in the source. It also examines the specific contributions of the database to the different activities and phases of scientific research for various actors, such as researchers, journals, publishing groups, and libraries. Specifically, the volume of data is presented; the collections, products, and indicators are detailed; and some of their strengths and weaknesses are assessed. Comparisons are drawn with other sources on the scientific information market that also support bibliometric research, giving the reader a substantial characterization of the tool and its competitors and clarifying its prospects for use in research settings. The ideas developed and systematized in the text lead to the conclusion that, despite the database's relevance for scientific activity at different levels of aggregation, the biases of its indicators, the impossibility of accessing the source at many institutions, and the existence of other tools with similar features and ease of use are aspects that must be taken into account, because they affect its application, future use, and permanence in the research ecosystem.

https://doi.org/10.15517/eci.v12i1.46660




This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Copyright (c) 2022 Orlando Gregorio Chaviano, Evony Katherine López Mesa, Cesar H. Limaymanta
