Search (3 results, page 1 of 1)

  • author_ss:"Luwel, M."
  • author_ss:"Moed, H.F."
  1. Noyons, E.C.M.; Moed, H.F.; Luwel, M.: Combining mapping and citation analysis for evaluative bibliometric purposes : a bibliometric study (1999) 0.00
    0.0028703054 = product of:
      0.005740611 = sum of:
        0.005740611 = product of:
          0.011481222 = sum of:
            0.011481222 = weight(_text_:a in 2941) [ClassicSimilarity], result of:
              0.011481222 = score(doc=2941,freq=16.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.2161963 = fieldWeight in 2941, product of:
                  4.0 = tf(freq=16.0), with freq of:
                    16.0 = termFreq=16.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2941)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
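    The explain tree above is Lucene's ClassicSimilarity (TF-IDF) breakdown: the term weight is tf × idf² × queryNorm × fieldNorm, wrapped in two coord(1/2) factors. The arithmetic can be checked with a minimal sketch; the constants are copied from the tree, and the function name is illustrative, not part of any Lucene API:

    ```python
    import math

    # Reproduce the ClassicSimilarity score from the explain tree for
    # result 1 (doc 2941, term "a", freq=16). Constants are taken
    # verbatim from the tree above.
    def classic_similarity_score(freq, idf, query_norm, field_norm, coord=1.0):
        """Single-term TF-IDF score: coord * (idf * queryNorm) * (tf * idf * fieldNorm)."""
        tf = math.sqrt(freq)                  # 4.0 = tf(freq=16.0)
        query_weight = idf * query_norm       # 0.053105544 = queryWeight
        field_weight = tf * idf * field_norm  # 0.2161963  = fieldWeight
        return coord * query_weight * field_weight

    score = classic_similarity_score(
        freq=16.0,
        idf=1.153047,            # idf(docFreq=37942, maxDocs=44218)
        query_norm=0.046056706,
        field_norm=0.046875,
        coord=0.5 * 0.5,         # the two coord(1/2) factors in the tree
    )
    print(score)                 # ≈ 0.0028703054, the ranking score of hit 1
    ```

    The same pattern, with the freq, fieldNorm, and doc-id values substituted, reproduces the scores of hits 2 and 3 as well.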
    
    Abstract
    The general aim of the article is to demonstrate how the results of both a structural analysis and a research performance assessment of a research field can be enriched by combining elements of both into one integrated analysis. In addition, a procedure is discussed to select and analyse candidate benchmark institutes in order to assess the position of a particular research institute, in terms of both its cognitive orientation and its scientific production and impact at the international research front. The combined method is applied in an evaluation of the research scope and performance of the Interuniversity Centre for Micro-Electronics (IMEC) in Leuven, Belgium. On the basis of the comments of an international panel of experts in micro-electronics, the method was discussed in detail. We concluded that the method provides a detailed and useful picture of the position of the institute from an international perspective. Moreover, we found that the results of each of the two parts add value to the other.
    Type
    a
  2. Matia, K.; Nunes Amaral, L.A.; Luwel, M.; Moed, H.F.; Stanley, H.E.: Scaling phenomena in the growth dynamics of scientific output (2005) 0.00
    0.0026849252 = product of:
      0.0053698504 = sum of:
        0.0053698504 = product of:
          0.010739701 = sum of:
            0.010739701 = weight(_text_:a in 3677) [ClassicSimilarity], result of:
              0.010739701 = score(doc=3677,freq=14.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.20223314 = fieldWeight in 3677, product of:
                  3.7416575 = tf(freq=14.0), with freq of:
                    14.0 = termFreq=14.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3677)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    We analyze a set of three databases at different levels of aggregation: (a) a database of approximately 10^6 publications from 247 countries published from 1980-2001, (b) a database of 508 academic institutions from the European Union (EU) and 408 institutes from the United States for the 11-year period 1991-2001, and (c) a database of 2,330 Flemish authors publishing in the period 1980-2000. At all levels of aggregation we find that the mean annual growth rate of publications is independent of the number of publications of the various units involved. We also find that the standard deviation of the distribution of annual growth rates decays with the number of publications as a power law with exponent 0.3. These findings are consistent with those of recent studies of systems such as the size of research and development funding budgets of countries, the research publication volumes of U.S. universities, and the size of business firms.
    Type
    a
  3. Moed, H.F.; Luwel, M.; Nederhof, A.J.: Towards research performance in the humanities (2002) 0.00
    0.0023435948 = product of:
      0.0046871896 = sum of:
        0.0046871896 = product of:
          0.009374379 = sum of:
            0.009374379 = weight(_text_:a in 820) [ClassicSimilarity], result of:
              0.009374379 = score(doc=820,freq=24.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.17652355 = fieldWeight in 820, product of:
                  4.8989797 = tf(freq=24.0), with freq of:
                    24.0 = termFreq=24.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.03125 = fieldNorm(doc=820)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This paper describes a general methodology for developing bibliometric performance indicators. Such a description provides a framework or paradigm for application-oriented research in the field of evaluative quantitative science and technology studies, particularly in the humanities and social sciences. It is based on our study of scholarly output in the field of Law at the four major universities in Flanders, the Dutch-speaking part of Belgium. The study illustrates that bibliometrics is much more than conducting citation analyses based on the indexes produced by the Institute for Scientific Information (ISI), since citation data do not play a role in the study. Interaction with scholars in the fields under consideration and openness in the presentation of the quantitative outcomes are the basic features of the methodology. Bibliometrics should be used as an instrument to create a mirror. While not a direct reflection, this study provides a thorough analysis of how scholars in the humanities and social sciences structure their activities and their research output. This structure can be examined empirically from the point of view of its consistency and the degree of consensus among scholars. Relevant issues can be raised that are worth considering in more detail in follow-up studies, and conclusions from our empirical materials may illuminate such issues. We argue that the principal aim of the development and application of bibliometric indicators is to stimulate a debate among scholars in the field under investigation on the nature of scholarly quality, its principal dimensions, and operationalizations. This aim provides a criterion of "productivity" of the development process.
We further contend that librarians are not infrequently asked to provide assistance in collecting data for research performance assessments, and that the methodology described in this paper aims to offer a general framework for such activities, one that librarians can adopt as a line of action whenever they become involved.
    Type
    a