Kimmel, S.: Robot-generated databases on the World Wide Web (1996)
- Abstract
- WWW robots are programs that attempt to gather and index WWW resources. They reside on a host computer and retrieve information from sites on the WWW using standard protocols. Gives an overview of robots and robot-generated databases. Covers: WWW Worm, Lycos, WebCrawler, AliWeb, Harvest, Jumpstation II, and Open Text Index. Also discusses Yahoo and Trade Wave, which are comparable tools for resource discovery
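Before retrieving pages, well-behaved WWW robots of this kind check a site's robots.txt exclusion rules. A minimal sketch using Python's standard `urllib.robotparser`; the agent name, URLs, and robots.txt content here are hypothetical, not taken from the article.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content a robot might fetch from
# http://example.com/robots.txt before indexing the site.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

def allowed(agent: str, url: str, robots_txt: str = ROBOTS_TXT) -> bool:
    """Return True if the exclusion rules permit `agent` to fetch `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, url)

print(allowed("MyRobot", "http://example.com/index.html"))        # True
print(allowed("MyRobot", "http://example.com/private/data.html")) # False
```

A robot that respects these rules would skip the disallowed paths and index only the permitted pages.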