Search (3 results, page 1 of 1)

  • author_ss:"Mingers, J."
  • theme_ss:"Informetrie"
  1. Mingers, J.; Burrell, Q.L.: Modeling citation behavior in Management Science journals (2006)
    
    Date
    26.12.2007 19:22:05
    Footnote
    Contribution in a "Special Issue on Informetrics"
  2. Leydesdorff, L.; Bornmann, L.; Mingers, J.: Statistical significance and effect sizes of differences among research universities at the level of nations and worldwide based on the Leiden rankings (2019)
    
    Abstract
    The Leiden Rankings can be used for grouping research universities by treating universities that are not statistically significantly different as homogeneous sets. The groups and intergroup relations can be analyzed and visualized using tools from network analysis. Using the so-called "excellence indicator" PPtop-10% (the proportion of the top-10% most highly cited papers assigned to a university), we pursue a classification using (a) overlapping stability intervals, (b) statistical-significance tests, and (c) effect sizes of differences among 902 universities in 54 countries; we focus on the UK, Germany, Brazil, and the USA as national examples. Although the groupings remain largely the same using different statistical-significance levels or overlapping stability intervals, these classifications are uncorrelated with those based on effect sizes. Effect sizes for the differences between universities are small (w < .2). A more detailed analysis of universities at the country level suggests that distinctions beyond three or perhaps four groups of universities (high, middle, low) may not be meaningful. Given similar institutional incentives, isomorphism within each ecosystem of universities should not be underestimated. Our results suggest that networks based on overlapping stability intervals can provide a first impression of the relevant groupings among universities. However, the clusters are not well-defined divisions between groups of universities.
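    As a rough illustration of the PPtop-10% indicator described above, a toy sketch (assumptions: the function name and sample data are invented here, the percentile cut is a simple index-based one, and the actual Leiden Rankings use field-normalized, fractionally counted citation data rather than raw counts):

    ```python
    def pp_top10(university_citations, world_citations):
        """Toy PPtop-10%: share of a university's papers whose citation
        counts reach the global 90th percentile (simplified cut-off)."""
        world = sorted(world_citations)
        # Citation count at the 90th percentile of all papers (index-based).
        cutoff = world[int(0.9 * len(world))]
        in_top = sum(1 for c in university_citations if c >= cutoff)
        return in_top / len(university_citations)
    ```

    For example, against a world set with citation counts 0..99, a university whose four papers have 95, 50, 99, and 10 citations would score 0.5, since two of its four papers reach the cut-off of 90.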
  3. Xu, F.; Liu, W.B.; Mingers, J.: New journal classification methods based on the global h-index (2015)
    
    Abstract
    In this work we develop new journal classification methods based on the h-index. The introduction of the h-index for research evaluation has attracted much attention in bibliometric studies and research quality evaluation. The main purpose of using an h-index is to compare the index across different research units (e.g. researchers, journals) to differentiate their research performance. However, because the h-index is defined by comparing only the citation counts of a unit's own publications, it is doubtful that the h-index alone should be used for reliable comparisons among different research units, such as researchers or journals. In this paper we propose a new global h-index (Gh-index), in which the publications in the core are selected in comparison with all the publications of the units to be evaluated. Furthermore, we introduce some variants of the Gh-index to address the issue of discrimination power. We show that, together with the original h-index, they can be used to evaluate and classify academic journals with some distinct advantages, in particular that they can produce an automatic classification into a number of categories without arbitrary cut-off points. We then carry out an empirical study classifying operations research and management science (OR/MS) journals using this index, and compare the results with other well-known journal ranking lists such as the Association of Business Schools (ABS) Journal Quality Guide and the Committee of Professors in OR (COPIOR) rankings.
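    The standard h-index that the abstract starts from can be stated concretely; a minimal sketch (the function name is illustrative, and this is the ordinary h-index computed from one unit's own citation counts, not the paper's Gh-index, which additionally requires the publications of all units under comparison):

    ```python
    def h_index(citations):
        """Largest h such that h publications have at least h citations each."""
        counts = sorted(citations, reverse=True)
        h = 0
        for rank, c in enumerate(counts, start=1):
            if c >= rank:
                h = rank  # the paper at this rank still has enough citations
            else:
                break
        return h
    ```

    For example, a journal whose five papers have 10, 8, 5, 4, and 3 citations has an h-index of 4: four papers have at least 4 citations each, but not five papers with at least 5.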