Search (8 results, page 1 of 1)

  • × author_ss:"Bookstein, A."
  • × year_i:[1990 TO 2000}
  1. Bookstein, A.: Informetric distributions : I. Unified overview (1990) 0.05
    0.05037771 = product of:
      0.10075542 = sum of:
        0.10075542 = sum of:
          0.01339476 = weight(_text_:a in 6902) [ClassicSimilarity], result of:
            0.01339476 = score(doc=6902,freq=4.0), product of:
              0.053105544 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.046056706 = queryNorm
              0.25222903 = fieldWeight in 6902, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.109375 = fieldNorm(doc=6902)
          0.087360665 = weight(_text_:22 in 6902) [ClassicSimilarity], result of:
            0.087360665 = score(doc=6902,freq=2.0), product of:
              0.16128273 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046056706 = queryNorm
              0.5416616 = fieldWeight in 6902, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.109375 = fieldNorm(doc=6902)
      0.5 = coord(1/2)
    
    Date
    22. 7.2006 18:55:29
    Type
    a
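The score breakdowns above follow Lucene's ClassicSimilarity (TF-IDF) formula: tf(freq) = sqrt(termFreq), idf = 1 + ln(maxDocs / (docFreq + 1)), queryWeight = idf × queryNorm, fieldWeight = tf × idf × fieldNorm, all multiplied by a coord factor. As a minimal sketch (not the catalog's actual code; Lucene uses 32-bit floats, so the last digits can differ slightly), the numbers for result 1 can be reproduced as:

```python
import math

def classic_similarity(freq, doc_freq, max_docs, query_norm, field_norm):
    """One term's contribution under Lucene ClassicSimilarity (TF-IDF)."""
    tf = math.sqrt(freq)                              # tf(freq) = sqrt(termFreq)
    idf = 1.0 + math.log(max_docs / (doc_freq + 1))   # idf(docFreq, maxDocs)
    query_weight = idf * query_norm                   # queryWeight
    field_weight = tf * idf * field_norm              # fieldWeight
    return query_weight * field_weight

# Values taken from the explain tree for doc 6902 above:
w_a  = classic_similarity(4.0, 37942, 44218, 0.046056706, 0.109375)  # term "a"
w_22 = classic_similarity(2.0, 3622, 44218, 0.046056706, 0.109375)   # term "22"
total = (w_a + w_22) * 0.5                            # coord(1/2)
print(w_a, w_22, total)  # close to 0.01339476, 0.087360665, 0.05037771
```

The same formula, with the freq, docFreq, and fieldNorm values shown in each tree, reproduces every other score on this page; results 3 through 8 simply lack the "22" term and carry an extra coord(1/2) factor.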
  2. Bookstein, A.: Informetric distributions : II. Resilience to ambiguity (1990) 0.05
    0.05037771 = product of:
      0.10075542 = sum of:
        0.10075542 = sum of:
          0.01339476 = weight(_text_:a in 4689) [ClassicSimilarity], result of:
            0.01339476 = score(doc=4689,freq=4.0), product of:
              0.053105544 = queryWeight, product of:
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.046056706 = queryNorm
              0.25222903 = fieldWeight in 4689, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                1.153047 = idf(docFreq=37942, maxDocs=44218)
                0.109375 = fieldNorm(doc=4689)
          0.087360665 = weight(_text_:22 in 4689) [ClassicSimilarity], result of:
            0.087360665 = score(doc=4689,freq=2.0), product of:
              0.16128273 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046056706 = queryNorm
              0.5416616 = fieldWeight in 4689, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.109375 = fieldNorm(doc=4689)
      0.5 = coord(1/2)
    
    Date
    22. 7.2006 18:55:55
    Type
    a
  3. Bookstein, A.; Klein, S.T.: Compression, information theory, and grammars : a unified approach (1990) 0.00
    0.0046871896 = product of:
      0.009374379 = sum of:
        0.009374379 = product of:
          0.018748758 = sum of:
            0.018748758 = weight(_text_:a in 2970) [ClassicSimilarity], result of:
              0.018748758 = score(doc=2970,freq=6.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.3530471 = fieldWeight in 2970, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.125 = fieldNorm(doc=2970)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Type
    a
  4. Bookstein, A.: Scientometrics: new opportunities (1994) 0.00
    0.003827074 = product of:
      0.007654148 = sum of:
        0.007654148 = product of:
          0.015308296 = sum of:
            0.015308296 = weight(_text_:a in 5062) [ClassicSimilarity], result of:
              0.015308296 = score(doc=5062,freq=4.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.28826174 = fieldWeight in 5062, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.125 = fieldNorm(doc=5062)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Type
    a
  5. Bookstein, A.: Informetric distributions : III. Ambiguity (1997) 0.00
    0.00270615 = product of:
      0.0054123 = sum of:
        0.0054123 = product of:
          0.0108246 = sum of:
            0.0108246 = weight(_text_:a in 6489) [ClassicSimilarity], result of:
              0.0108246 = score(doc=6489,freq=8.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.20383182 = fieldWeight in 6489, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0625 = fieldNorm(doc=6489)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
This article examines various kinds of uncertainty. The notion of ambiguity is defined and contrasted with the more familiar notions of randomness and fuzziness. Functional forms resistant to ambiguity are defined, and it is shown how to incorporate a random component, itself also resistant to ambiguity, into a resilient but deterministic model.
    Type
    a
  6. Bookstein, A.: Bibliocryptography (1996) 0.00
    0.0026473717 = product of:
      0.0052947435 = sum of:
        0.0052947435 = product of:
          0.010589487 = sum of:
            0.010589487 = weight(_text_:a in 6502) [ClassicSimilarity], result of:
              0.010589487 = score(doc=6502,freq=10.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.19940455 = fieldWeight in 6502, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=6502)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
Because of concerns about the privacy of its patrons, it is common for libraries to systematically destroy historic information about book circulation. I argue that this information has great potential value for improving retrieval effectiveness, and give two examples of how this information can be used. Further, I show how use-data can be preserved and exploited while still giving a high degree of protection for patron privacy. The methods are analyzed and formulae are derived indicating the tradeoff between retrieval effectiveness and security. A second, contrasting application, indicating how to introduce 'fingerprints' into digitized audio-visual material in a tamper-resistant manner, is described.
    Type
    a
  7. Bookstein, A.; Klein, S.T.; Raita, T.: Clumping properties of content-bearing words (1998) 0.00
    0.0023678814 = product of:
      0.0047357627 = sum of:
        0.0047357627 = product of:
          0.009471525 = sum of:
            0.009471525 = weight(_text_:a in 442) [ClassicSimilarity], result of:
              0.009471525 = score(doc=442,freq=8.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.17835285 = fieldWeight in 442, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=442)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
Information retrieval systems identify content-bearing words, and possibly also assign weights, as part of the process of formulating requests. For optimal retrieval efficiency, it is desirable that this be done automatically. This article defines the notion of serial clustering of words in text, and explores the value of such clustering as an indicator that a word bears content. This approach is flexible in the sense that it is sensitive to context: a term may be assessed as content-bearing within one collection, but not another. Our approach, being numerical, may also be of value in assigning weights to terms in requests. Experimental support is obtained from natural text databases in three different languages.
    Type
    a
  8. Bookstein, A.; Wright, B.: Ambiguity in measurement (1997) 0.00
    0.0023435948 = product of:
      0.0046871896 = sum of:
        0.0046871896 = product of:
          0.009374379 = sum of:
            0.009374379 = weight(_text_:a in 1036) [ClassicSimilarity], result of:
              0.009374379 = score(doc=1036,freq=6.0), product of:
                0.053105544 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046056706 = queryNorm
                0.17652355 = fieldWeight in 1036, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1036)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
Gives an overview of the role of ambiguity in measurement and presents analytical methods for assessing its impact. Argues that certain functional forms are more resilient than others to problems of ambiguity, and that these should be preferred when ambiguity is a serious concern.
    Type
    a