Search (3 results, page 1 of 1)

  • year_i:[2000 TO 2010}
  • author_ss:"Waltman, L."
  1. Waltman, L.; Eck, N.J. van: Some comments on the question whether co-occurrence data should be normalized (2007) 0.02
    0.02400108 = product of:
      0.04800216 = sum of:
        0.04800216 = product of:
          0.09600432 = sum of:
            0.09600432 = weight(_text_:l in 583) [ClassicSimilarity], result of:
              0.09600432 = score(doc=583,freq=6.0), product of:
                0.18031335 = queryWeight, product of:
                  3.9746525 = idf(docFreq=2257, maxDocs=44218)
                  0.045365814 = queryNorm
                0.53243047 = fieldWeight in 583, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  3.9746525 = idf(docFreq=2257, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=583)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    In a recent article in JASIST, L. Leydesdorff and L. Vaughan (2006) asserted that raw cocitation data should be analyzed directly, without first applying a normalization such as the Pearson correlation. In this communication, it is argued that there is nothing wrong with the widely adopted practice of normalizing cocitation data. One of the arguments put forward by Leydesdorff and Vaughan turns out to depend crucially on incorrect multidimensional scaling maps that are due to an error in the PROXSCAL program in SPSS.
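
     A minimal sketch recomputing the explain tree above, assuming the standard ClassicSimilarity factoring that the tree itself names (tf = sqrt(freq), queryWeight = idf * queryNorm, fieldWeight = tf * idf * fieldNorm); every constant is copied verbatim from the tree, only the arithmetic is reproduced:

         from math import sqrt

         # Factors printed in the explain tree for doc 583
         idf = 3.9746525           # idf(docFreq=2257, maxDocs=44218)
         query_norm = 0.045365814  # queryNorm
         field_norm = 0.0546875    # fieldNorm(doc=583)
         freq = 6.0                # termFreq of _text_:l in doc 583

         tf = sqrt(freq)                       # 2.4494898 = tf(freq=6.0)
         query_weight = idf * query_norm       # 0.18031335 = queryWeight
         field_weight = tf * idf * field_norm  # 0.53243047 = fieldWeight
         weight = query_weight * field_weight  # 0.09600432 = weight(_text_:l in 583)

         # The tree applies coord(1/2) twice before reaching the final score,
         # apparently because one of two boolean clauses matched at each level
         score = weight * 0.5 * 0.5
         print(f"{score:.8f}")                 # ~0.02400108, the reported score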
  2. Eck, N.J. van; Waltman, L.: Appropriate similarity measures for author co-citation analysis (2008) 0.01
    0.013857029 = product of:
      0.027714059 = sum of:
        0.027714059 = product of:
          0.055428118 = sum of:
            0.055428118 = weight(_text_:l in 2008) [ClassicSimilarity], result of:
              0.055428118 = score(doc=2008,freq=2.0), product of:
                0.18031335 = queryWeight, product of:
                  3.9746525 = idf(docFreq=2257, maxDocs=44218)
                  0.045365814 = queryNorm
                0.30739886 = fieldWeight in 2008, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9746525 = idf(docFreq=2257, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2008)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
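
     The same decomposition, written out in one line for this hit (doc 2008), where freq = 2 and hence tf = sqrt(2); all values are taken from the tree above:

     \[
     0.013857029 \;\approx\; \underbrace{3.9746525 \cdot 0.045365814}_{\text{queryWeight}} \cdot \underbrace{\sqrt{2}\cdot 3.9746525 \cdot 0.0546875}_{\text{fieldWeight}} \cdot \tfrac{1}{2}\cdot\tfrac{1}{2}
     \]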
    
  3. Eck, N.J. van; Waltman, L.: How to normalize cooccurrence data? : an analysis of some well-known similarity measures (2009) 0.01
    0.011877453 = product of:
      0.023754906 = sum of:
        0.023754906 = product of:
          0.04750981 = sum of:
            0.04750981 = weight(_text_:l in 2942) [ClassicSimilarity], result of:
              0.04750981 = score(doc=2942,freq=2.0), product of:
                0.18031335 = queryWeight, product of:
                  3.9746525 = idf(docFreq=2257, maxDocs=44218)
                  0.045365814 = queryNorm
                0.26348472 = fieldWeight in 2942, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9746525 = idf(docFreq=2257, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2942)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
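
   All three trees instantiate the same ClassicSimilarity score, which can be written compactly as below; note that the printed idf agrees with 1 + ln(44218 / (2257 + 1)) ≈ 3.9746525, the usual ClassicSimilarity idf. The grouping into queryWeight and fieldWeight mirrors the trees; treat it as a reading aid for the explain output, not a full specification.

   \[
   \text{score}(q,d) \;=\; \prod \mathrm{coord} \;\cdot\; \sum_{t \in q} \underbrace{\mathrm{idf}(t)\,\mathrm{queryNorm}(q)}_{\text{queryWeight}} \cdot \underbrace{\sqrt{\mathrm{freq}(t,d)}\,\mathrm{idf}(t)\,\mathrm{fieldNorm}(d)}_{\text{fieldWeight}},
   \qquad
   \mathrm{idf}(t) = 1 + \ln\frac{\mathrm{maxDocs}}{\mathrm{docFreq}+1}
   \]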