Kimmel, S.: Robot-generated databases on the World Wide Web (1996)
- Abstract
- WWW robots are programs that attempt to gather and index WWW resources. They reside on a host computer and retrieve information from sites on the WWW using standard protocols. Gives an overview of robots and robot-generated databases. Covers: WWW Worm; Lycos; WebCrawler; AliWeb; Harvest; Jumpstation II; and Open Text Index. Also discusses Yahoo and Trade Wave, which are comparable tools for resource discovery
- Type
- a