Kimmel, S.: Robot-generated databases on the World Wide Web (1996)
- Abstract
- WWW robots are programs that attempt to gather and index WWW resources. They reside on a host computer and retrieve information from sites on the WWW using standard protocols. Gives an overview of robots and robot-generated databases. Covers: WWW Worm; Lycos; WebCrawler; AliWeb; Harvest; Jumpstation II; and Open Text Index. Also discusses Yahoo and Trade Wave, which are comparable tools for resource discovery
- Object
- WWW Worm