Kimmel, S.: Robot-generated databases on the World Wide Web (1996)
- Abstract
- WWW robots are programs that attempt to gather and index WWW resources. They reside on a host computer and retrieve information from sites on the WWW using standard protocols. Gives an overview of robots and robot-generated databases. Covers: WWW Worm; Lycos; WebCrawler; AliWeb; Harvest; Jumpstation II; and Open Text Index. Also discusses Yahoo and Trade Wave, which are comparable tools for resource discovery.