The operationalization of “fields” as WoS subject categories (WCs) in evaluative bibliometrics: The cases of “library and information science” and “science & technology studies”
Loet Leydesdorff
Amsterdam School of Communication Research (ASCoR), University of Amsterdam, P.O. Box 15793, NL-1001 NG, Amsterdam, The Netherlands
Lutz Bornmann
Division for Science and Innovation Studies, Administrative Headquarters of the Max Planck Society, Hofgartenstr. 8, 80539 Munich, Germany
Abstract
Normalization of citation scores using reference sets based on Web of Science subject categories (WCs) has become an established ("best") practice in evaluative bibliometrics. For example, the Times Higher Education World University Rankings are, among other things, based on this operationalization. However, WCs were developed decades ago for the purpose of information retrieval and have evolved incrementally with the database; the classification is machine-based and only partially corrected manually. Using the WC "information science & library science" and the WCs attributed to journals in the field of "science and technology studies," we show that WCs do not provide sufficient analytical clarity to carry bibliometric normalization in evaluation practices because of "indexer effects." Can compliance with "best practices" be replaced with an ambition to develop "best possible practices"? New research questions can then be envisaged.
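The normalization the abstract refers to can be illustrated with a minimal sketch. The example below implements the common mean-based ("item-oriented") variant — each paper's citation count is divided by the mean citation count of its reference set, here operationalized as the WC of the publishing journal. The data and the grouping are illustrative assumptions, not the authors' implementation; real evaluations also condition on publication year and document type.

```python
from collections import defaultdict

def field_normalized_scores(papers):
    """Mean-based field normalization (illustrative sketch).

    papers: list of (wc, citations) pairs, where wc is the Web of
    Science subject category (WC) assigned to the paper's journal.
    Returns citations / mean(citations of papers in the same WC),
    so a score of 1.0 means "cited as the field average."
    """
    by_wc = defaultdict(list)
    for wc, cites in papers:
        by_wc[wc].append(cites)
    means = {wc: sum(v) / len(v) for wc, v in by_wc.items()}
    return [cites / means[wc] if means[wc] > 0 else 0.0
            for wc, cites in papers]

# Hypothetical toy data: two WCs with very different citation densities.
papers = [("LIS", 10), ("LIS", 2), ("PHYS", 40), ("PHYS", 8)]
print(field_normalized_scores(papers))
```

The point of the paper is that the denominator is only as meaningful as the reference set: if the WC mixes heterogeneous literatures (as shown for "information science & library science"), the mean in the denominator does not represent any field's citation practice.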