Search (2 results, page 1 of 1)

  • author_ss:"Thelwall, M."
  • theme_ss:"Suchmaschinen"
  1. Thelwall, M.; Stuart, D.: Web crawling ethics revisited : cost, privacy, and denial of service (2006) 0.01
    0.009149151 = product of:
      0.036596604 = sum of:
        0.036596604 = product of:
          0.07319321 = sum of:
            0.07319321 = weight(_text_:aspects in 6098) [ClassicSimilarity], result of:
              0.07319321 = score(doc=6098,freq=2.0), product of:
                0.20938325 = queryWeight, product of:
                  4.5198684 = idf(docFreq=1308, maxDocs=44218)
                  0.046325076 = queryNorm
                0.3495657 = fieldWeight in 6098, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.5198684 = idf(docFreq=1308, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=6098)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Abstract
    Ethical aspects of the use of Web crawlers in information science research and other contexts are reviewed. The difference between legal and ethical uses of communications technologies is emphasized, as is the changing boundary between ethical and unethical conduct. A review of the potential impacts on Web site owners is used to underpin a new framework for ethical crawling, and it is argued that delicate human judgment is required for each individual case, with verdicts likely to change over time. Decisions can be based upon an approximate cost-benefit analysis, but it is crucial that crawler owners find out about the technological issues affecting the owners of the sites being crawled in order to produce an informed assessment.
  2. Thelwall, M.: Assessing web search engines : a webometric approach (2011) 0.01
    0.007842129 = product of:
      0.031368516 = sum of:
        0.031368516 = product of:
          0.06273703 = sum of:
            0.06273703 = weight(_text_:aspects in 10) [ClassicSimilarity], result of:
              0.06273703 = score(doc=10,freq=2.0), product of:
                0.20938325 = queryWeight, product of:
                  4.5198684 = idf(docFreq=1308, maxDocs=44218)
                  0.046325076 = queryNorm
                0.29962775 = fieldWeight in 10, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.5198684 = idf(docFreq=1308, maxDocs=44218)
                  0.046875 = fieldNorm(doc=10)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Abstract
    Information Retrieval (IR) research typically evaluates search systems in terms of the standard precision and recall measures, together with the F-measure, which weights the relative importance of precision and recall (e.g. van Rijsbergen, 1979). All of these assess the extent to which a system returns good matches for a query. In contrast, webometric measures are designed specifically for web search engines: they monitor changes in results over time and various aspects of the internal logic by which search engines select the results to be returned. This chapter introduces a range of webometric measurements and illustrates them with case studies of Google, Bing and Yahoo! This is a very fertile area for simple and complex new investigations into search engine results.
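
Note on the measures cited in the second abstract: the definitions of precision, recall, and the F-measure are standard (van Rijsbergen, 1979) and are reproduced below purely as a reference point, in LaTeX notation; this is the textbook formulation, not a quotation from the chapter itself.

    P = \frac{|\text{relevant} \cap \text{retrieved}|}{|\text{retrieved}|}
    R = \frac{|\text{relevant} \cap \text{retrieved}|}{|\text{relevant}|}
    F_\beta = \frac{(1 + \beta^2) P R}{\beta^2 P + R}

Here \beta sets the relative weight of recall against precision: \beta = 1 gives the familiar harmonic mean F_1 = 2PR/(P + R), while \beta > 1 favours recall.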
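
Note on the score breakdowns: the indented trees under each result are Lucene "explain" output for the matching term "aspects" under the classic tf-idf similarity (ClassicSimilarity, as named in the trees). The following Python sketch recomputes the final scores from the quantities shown there; the function name and signature are illustrative, not part of the Lucene API, and the idf formula is Lucene's documented one (it reproduces the 4.5198684 shown above).

    import math

    def classic_similarity_score(freq, doc_freq, max_docs, query_norm, field_norm, coords):
        # ClassicSimilarity idf: 1 + ln(maxDocs / (docFreq + 1)) = 4.5198684 here
        idf = 1.0 + math.log(max_docs / (doc_freq + 1.0))
        tf = math.sqrt(freq)                    # tf(freq=2.0) = 1.4142135
        query_weight = idf * query_norm         # 0.20938325 in both trees
        field_weight = tf * idf * field_norm    # 0.3495657 resp. 0.29962775
        score = query_weight * field_weight     # weight(_text_:aspects ...)
        for coord in coords:                    # coord(1/2) and coord(1/4) factors
            score *= coord
        return score

    # doc 6098 (fieldNorm = 0.0546875) -> 0.009149151
    print(classic_similarity_score(2.0, 1308, 44218, 0.046325076, 0.0546875, [0.5, 0.25]))
    # doc 10 (fieldNorm = 0.046875) -> 0.007842129
    print(classic_similarity_score(2.0, 1308, 44218, 0.046325076, 0.046875, [0.5, 0.25]))

The two documents differ only in fieldNorm (a length normalization of the matched field), which is why the shorter-field document ranks higher.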