- Waltman, L.; Eck, N.J. van: Some comments on the question whether co-occurrence data should be normalized (2007)
  0.02
  0.01664811 = product of:
    0.049944326 = sum of:
      0.049944326 = product of:
        0.09988865 = sum of:
          0.09988865 = weight(_text_:van in 583) [ClassicSimilarity], result of:
            0.09988865 = score(doc=583,freq=2.0), product of:
              0.23160313 = queryWeight, product of:
                5.5765896 = idf(docFreq=454, maxDocs=44218)
                0.04153132 = queryNorm
              0.43129233 = fieldWeight in 583, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                5.5765896 = idf(docFreq=454, maxDocs=44218)
                0.0546875 = fieldNorm(doc=583)
        0.5 = coord(1/2)
    0.33333334 = coord(1/3)
- Eck, N.J. van; Waltman, L.: Appropriate similarity measures for author co-citation analysis (2008)
  0.02
  0.01664811 = product of:
    0.049944326 = sum of:
      0.049944326 = product of:
        0.09988865 = sum of:
          0.09988865 = weight(_text_:van in 2008) [ClassicSimilarity], result of:
            0.09988865 = score(doc=2008,freq=2.0), product of:
              0.23160313 = queryWeight, product of:
                5.5765896 = idf(docFreq=454, maxDocs=44218)
                0.04153132 = queryNorm
              0.43129233 = fieldWeight in 2008, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                5.5765896 = idf(docFreq=454, maxDocs=44218)
                0.0546875 = fieldNorm(doc=2008)
        0.5 = coord(1/2)
    0.33333334 = coord(1/3)
- Eck, N.J. van; Waltman, L.: How to normalize cooccurrence data? : an analysis of some well-known similarity measures (2009)
  0.01
  0.014269808 = product of:
    0.042809423 = sum of:
      0.042809423 = product of:
        0.085618846 = sum of:
          0.085618846 = weight(_text_:van in 2942) [ClassicSimilarity], result of:
            0.085618846 = score(doc=2942,freq=2.0), product of:
              0.23160313 = queryWeight, product of:
                5.5765896 = idf(docFreq=454, maxDocs=44218)
                0.04153132 = queryNorm
              0.36967915 = fieldWeight in 2942, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                5.5765896 = idf(docFreq=454, maxDocs=44218)
                0.046875 = fieldNorm(doc=2942)
        0.5 = coord(1/2)
    0.33333334 = coord(1/3)
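The explain trees above can be reassembled by hand. As a minimal sketch (variable names are ours; the formulas are Lucene's standard TF-IDF ClassicSimilarity, with the factors taken from the first hit, doc 583):

```python
# Recompute the ClassicSimilarity score for the "van" term in doc 583
# from the factors reported in the explain tree above.
import math

idf = 5.5765896          # idf(docFreq=454, maxDocs=44218)
query_norm = 0.04153132  # queryNorm
field_norm = 0.0546875   # fieldNorm(doc=583)
freq = 2.0               # termFreq of "van" in the matched field

tf = math.sqrt(freq)                  # 1.4142135 = tf(freq=2.0)
query_weight = idf * query_norm       # 0.23160313 = queryWeight
field_weight = tf * idf * field_norm  # 0.43129233 = fieldWeight
weight = query_weight * field_weight  # 0.09988865 = weight(_text_:van)

# coord factors penalize partial matches: 1 of 2 clauses, then 1 of 3.
score = weight * (1 / 2) * (1 / 3)    # 0.01664811, shown rounded as 0.02
```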