- Waltman, L.; Eck, N.J. van: Some comments on the question whether co-occurrence data should be normalized (2007)
- Eck, N.J. van; Waltman, L.: Appropriate similarity measures for author co-citation analysis (2008)
- Eck, N.J. van; Waltman, L.: How to normalize cooccurrence data? : an analysis of some well-known similarity measures (2009)