Factor de impacto agregado según campos científicos [Aggregate impact factor by scientific fields]

  1. María Isabel Dorta González
  2. Pablo Dorta González
Journal: Investigación Bibliotecológica

ISSN: 0187-358X (print), 2448-8321 (online)

Year of publication: 2014

Volume: 28

Issue: 62

Pages: 15-28

Type: Article

DOI: 10.1016/S0187-358X(14)72563-8 (open access)

Abstract

Science journal impact indicators are not comparable across disciplines because of inherent differences in publication and citation behavior from field to field. A decomposition of the aggregate impact factor of the fields in the database shows that, for the 22 fields and four areas considered by Thomson Reuters, the leading provider of science indicators, five variables largely explain the variance in the impact factor of a given field. It is therefore necessary to take all of these sources of variance into account when standardizing impact indicators. A Principal Component Analysis is employed to identify the sources of variance, and a Cluster Analysis is used to detect similarities among fields.
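
For context, the field aggregate impact factor analyzed here follows the Journal Citation Reports convention: a two-year impact factor computed over all journals of a field rather than over a single journal. A sketch of that definition, with notation introduced here for illustration only:

    % Aggregate impact factor of field f in census year y (two-year window).
    % C_j(y,t): citations received in year y by journal j to items published in year t
    % P_j(t):   citable items (articles and reviews) published by journal j in year t
    \[
      \mathrm{AIF}_f(y) \;=\;
      \frac{\sum_{j \in f} \bigl( C_j(y,\, y-1) + C_j(y,\, y-2) \bigr)}
           {\sum_{j \in f} \bigl( P_j(y-1) + P_j(y-2) \bigr)}
    \]

Because both the numerator (citation counts) and the denominator (publication counts) vary strongly between fields, the resulting values are not directly comparable across fields, which is the comparability problem the abstract refers to.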
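
The methodological pipeline named in the abstract, a Principal Component Analysis to identify the sources of variance followed by a Cluster Analysis to detect similar fields, can be sketched as follows. This is a minimal illustration only: the input matrix, the number of clusters, and the clustering method are assumptions, not the article's actual data or settings.

    # Minimal sketch of the PCA + cluster analysis workflow described in the abstract.
    # The 22 x 5 input matrix below is a random placeholder standing in for the real
    # fields-by-variables table; it is not the article's data.
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(0)
    X = rng.random((22, 5))              # one row per field, one column per indicator

    # Standardize so each variable contributes comparably to the components.
    Z = StandardScaler().fit_transform(X)

    # Principal Component Analysis: how much variance each component explains.
    pca = PCA()
    scores = pca.fit_transform(Z)
    print("Explained variance ratio:", pca.explained_variance_ratio_)

    # Hierarchical cluster analysis (Ward linkage) on the component scores,
    # cutting the tree into an assumed four groups of similar fields.
    tree = linkage(scores, method="ward")
    labels = fcluster(tree, t=4, criterion="maxclust")
    print("Cluster label per field:", labels)

With the real data, the explained-variance ratios indicate how many components are needed to account for most of the between-field variance, and the cluster labels group fields with similar publication and citation behavior.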

Bibliographic References

  • Althouse, B. M., West, J. D., Bergstrom, C. T., Bergstrom, T. (2009). Differences in impact factor across fields and over time. Journal of the American Society for Information Science and Technology. 60. 27-34
  • Bensman, S. J. (2007). Garfield and the impact factor. Annual Review of Information Science and Technology. 41. 93-155
  • Bornmann, L., Daniel, H. D. (2008). What do citation counts measure? A review of studies on citing behavior. Journal of Documentation. 64. 45-80
  • Dorta-González, P., Dorta-González, M. I. (2010). Indicador bibliométrico basado en el índice h. Revista Española de Documentación Científica. 33. 225
  • Dorta-González, P. (2011). Aplicación empírica de un indicador bibliométrico basado en el índice h. Cultura y Educación. 23. 297-313
  • Dorta-González, P. (2011). Central indexes to the citation distribution: A complement to the h-index. Scientometrics. 88. 729
  • Dorta-González, P. (2013). Comparing journals from different fields of science and social science through a JCR subject categories normalized impact factor. Scientometrics. 95. 645
  • Dorta-González, P. (2013). Impact maturity times and citation time windows: The 2-year maximum journal impact factor. Journal of Informetrics. 7. 593-602
  • Egghe, L., Rousseau, R. (2002). A general framework for relative impact indicators. Canadian Journal of Information and Library Science. 27. 29-48
  • Garfield, E. (1972). Citation analysis as a tool in journal evaluation. Science. 178. 471
  • Garfield, E. (1979). Citation indexing: Its theory and application in science, technology, and humanities. John Wiley. New York.
  • Garfield, E. (1979). Is citation analysis a legitimate evaluation tool? Scientometrics. 1. 359
  • Leydesdorff, L. (2006). Can scientific journals be classified in terms of aggregated journal-journal citation relations using the Journal Citation Reports? Journal of the American Society for Information Science and Technology. 57. 601
  • Leydesdorff, L., Bornmann, L. (2011). How fractional counting of citations affects the Impact Factor: Normalization in terms of differences in citation potentials among fields of science. Journal of the American Society for Information Science and Technology. 62. 217
  • Leydesdorff, L., Opthof, T. (2010). Normalization at the field level: Fractional counting of citations. Journal of Informetrics. 4. 644
  • Leydesdorff, L., Rafols, I. (2011). Indicators of the interdisciplinarity of journals: Diversity, centrality, and citations. Journal of Informetrics. 5. 87-100
  • Moed, H. F. (2010). Measuring contextual citation impact of scientific journals. Journal of Informetrics. 4. 265
  • Opthof, T., Leydesdorff, L. (2010). Caveats for the journal and field normalizations in the CWTS ('Leiden') evaluations of research performance. Journal of Informetrics. 4. 423
  • Pudovkin, A. I., Garfield, E. (2002). Algorithmic procedure for finding semantically related journals. Journal of the American Society for Information Science and Technology. 53. 1113
  • Rafols, I., Leydesdorff, L. (2009). Content-based and algorithmic classifications of journals: Perspectives on the dynamics of scientific communication and indexer effects. Journal of the American Society for Information Science and Technology. 60. 1823
  • Rosvall, M., Bergstrom, C. T. (2008). Maps of random walks on complex networks reveal community structure. Proceedings of the National Academy of Sciences. 105. 1118
  • Van Raan, A. F. J., Van Leeuwen, T. N., Visser, M. S., Van Eck, N. J., Waltman, L. (2010). Rivals for the crown: Reply to Opthof and Leydesdorff. Journal of Informetrics. 4. 431
  • Wagner, C., Roessner, J. D., Bobb, K., Klein, J., Boyack, K., Keyton, J., Rafols, I., Börner, K. (2011). Approaches to understanding and measuring interdisciplinary scientific research (IDR): A review of the literature. Journal of Informetrics. 5. 14-26
  • Zitt, M., Small, H. (2008). Modifying the journal impact factor by fractional citation weighting: The audience factor. Journal of the American Society for Information Science and Technology. 59. 1856