Search (1 result, page 1 of 1)

  • year_i:[2000 TO 2010}
  • author_ss:"Eck, N.J. van"
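
The active filters above use Lucene/Solr range syntax, where the asymmetric delimiters are intentional: a square bracket marks an inclusive bound and a curly brace an exclusive one. A sketch of how the year filter reads (field name `year_i` taken from the filter above):

```
year_i:[2000 TO 2010}    matches 2000 <= year < 2010
year_i:[2000 TO 2010]    would also include 2010
year_i:{2000 TO 2010}    would exclude both endpoints
```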
  1. Waltman, L.; Eck, N.J. van: Some comments on the question whether co-occurrence data should be normalized (2007) 0.03
    0.030940626 = product of:
      0.06188125 = sum of:
        0.06188125 = product of:
          0.1237625 = sum of:
            0.1237625 = weight(_text_:maps in 583) [ClassicSimilarity], result of:
              0.1237625 = score(doc=583,freq=2.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.43459132 = fieldWeight in 583, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=583)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
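
The explain tree above can be recomputed by hand to see how ClassicSimilarity (Lucene's TF-IDF) arrives at 0.03. A minimal sketch, using only the constants printed in the explain output for doc 583 (the two `coord(1/2)` factors indicate that one of two query clauses matched):

```python
import math

# Constants copied from the Lucene explain output above (doc 583).
freq = 2.0               # termFreq of "maps" in the field
idf = 5.619245           # idf(docFreq=435, maxDocs=44218)
query_norm = 0.050679237 # queryNorm
field_norm = 0.0546875   # fieldNorm(doc=583)

tf = math.sqrt(freq)                     # 1.4142135 = tf(freq=2.0)
query_weight = idf * query_norm          # 0.28477904 = queryWeight
field_weight = tf * idf * field_norm     # 0.43459132 = fieldWeight
raw_score = query_weight * field_weight  # 0.1237625  = weight(_text_:maps)

# Two coord(1/2) factors halve the score twice.
final_score = raw_score * 0.5 * 0.5      # 0.030940626, the displayed score
print(final_score)
```

Each intermediate value matches the corresponding line in the explain tree to floating-point precision, confirming the displayed relevance score of 0.03.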
    
    Abstract
    In a recent article in JASIST, L. Leydesdorff and L. Vaughan (2006) asserted that raw cocitation data should be analyzed directly, without first applying a normalization such as the Pearson correlation. In this communication, it is argued that there is nothing wrong with the widely adopted practice of normalizing cocitation data. One of the arguments put forward by Leydesdorff and Vaughan turns out to depend crucially on incorrect multidimensional scaling maps that are due to an error in the PROXSCAL program in SPSS.