Search (3 results, page 1 of 1)

  • author_ss:"Waltman, L."
  • year_i:[2000 TO 2010}
  1. Eck, N.J. van; Waltman, L.: Appropriate similarity measures for author co-citation analysis (2008) 0.04
    
    Abstract
    We provide in this article a number of new insights into the methodological discussion about author co-citation analysis. We first argue that the use of the Pearson correlation for measuring the similarity between authors' co-citation profiles is not very satisfactory. We then discuss what kind of similarity measures may be used as an alternative to the Pearson correlation. We consider three similarity measures in particular. One is the well-known cosine. The other two similarity measures have not been used before in the bibliometric literature. We show by means of an example that the choice of an appropriate similarity measure is of high practical relevance. Finally, we discuss the use of similarity measures for statistical inference.
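
    The abstract contrasts the cosine with the Pearson correlation as a similarity measure for authors' co-citation profiles. A minimal sketch of that contrast (illustrative data and helper names of my own, not taken from the paper): the Pearson correlation is exactly the cosine applied to mean-centered profiles, which makes it insensitive to additive shifts of a profile, while the cosine is not.

    ```python
    import numpy as np

    def cosine_sim(x, y):
        # Cosine: inner product of the co-citation profiles over their norms.
        return float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))

    def pearson_sim(x, y):
        # Pearson correlation: the cosine of the mean-centered profiles.
        return cosine_sim(x - x.mean(), y - y.mean())

    # Hypothetical co-citation profiles of two authors (counts with three
    # other authors); illustrative data, not from the paper.
    a = np.array([10.0, 2.0, 1.0])
    b = a + 2.0  # the same profile shifted by a constant

    print(pearson_sim(a, b))  # Pearson is blind to the additive shift: ~1.0
    print(cosine_sim(a, b))   # the cosine is not: ~0.98
    ```

    Whether such shift-invariance is desirable for co-citation profiles is exactly the kind of question the paper takes up.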
  2. Waltman, L.; Eck, N.J. van: Some comments on the question whether co-occurrence data should be normalized (2007) 0.02
    
    Abstract
    In a recent article in JASIST, L. Leydesdorff and L. Vaughan (2006) asserted that raw cocitation data should be analyzed directly, without first applying a normalization such as the Pearson correlation. In this communication, it is argued that there is nothing wrong with the widely adopted practice of normalizing cocitation data. One of the arguments put forward by Leydesdorff and Vaughan turns out to depend crucially on incorrect multidimensional scaling maps that are due to an error in the PROXSCAL program in SPSS.
  3. Eck, N.J. van; Waltman, L.: How to normalize cooccurrence data? : an analysis of some well-known similarity measures (2009) 0.01
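
    The last two results concern whether and how raw co-occurrence (e.g. cocitation) counts should be normalized. A rough sketch of three normalizations that are well known in the bibliometric literature (the matrix and function names here are illustrative, not from the papers): each divides a raw count c_ij by a quantity built from the marginal totals s_i and s_j.

    ```python
    import numpy as np

    # Hypothetical symmetric co-occurrence matrix (e.g. cocitation counts
    # among four items); illustrative data, not from the papers.
    C = np.array([
        [0, 8, 2, 1],
        [8, 0, 3, 1],
        [2, 3, 0, 4],
        [1, 1, 4, 0],
    ], dtype=float)

    s = C.sum(axis=1)  # marginal total per item

    def association_strength(C, s):
        # a_ij = c_ij / (s_i * s_j): observed count relative to what the
        # marginals alone would lead one to expect.
        return C / np.outer(s, s)

    def cosine_norm(C, s):
        # c_ij / sqrt(s_i * s_j)
        return C / np.sqrt(np.outer(s, s))

    def jaccard_norm(C, s):
        # c_ij / (s_i + s_j - c_ij)
        return C / (s[:, None] + s[None, :] - C)

    print(np.round(association_strength(C, s), 4))
    ```

    All three leave the matrix symmetric; they differ in how strongly they discount items with large marginal totals, which is the comparison the 2009 paper works out.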