Search (2 results, page 1 of 1)

  • × year_i:[1950 TO 1960}
  • × theme_ss:"Information"
  1. Wiener, N.: The human use of human beings : cybernetics and society (1950) 0.02
    
    Date
    8.7.2000 18:17:22
  2. Bar-Hillel, Y.; Carnap, R.: An outline of a theory of semantic information (1952) 0.01
    
    Abstract
    In distinction to current Theory of Communication which treats amount of information as a measure of the statistical rarity of a message, a Theory of Semantic Information is outlined, in which the concept of information carried by a sentence within a given language system is treated as synonymous with the content of this sentence, normalized in a certain way, and the concept of amount of semantic information is explicated by various measures of this content, all based on logical probability functions ranging over the contents. Absolute and relative measures are distinguished, so are D-functions suitable for contexts where deductive reasoning alone is relevant and I-functions suitable for contexts where inductive reasoning is adequate. Of the two major types of amount of information investigated, the one, cont, is additive with respect to sentences whose contents are exclusive, the other, inf, with respect to sentences which are inductively independent. The latter turns out to be formally analogous to the customary information measure function. Various estimate functions of amount of information are investigated leading to generalized semantic correlates of concepts and theorems of current Communication Theory. A concept of semantic noise is tentatively defined, so are efficiency and redundancy of the conceptual framework of a language system. It is suggested that semantic information is a concept more readily applicable to psychological and other investigations than its communicational counterpart.
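    The measures sketched in the abstract can be illustrated with a toy model. The following is a minimal sketch, not the paper's formal apparatus: it assumes a language with two atomic sentences, models a sentence as the set of state descriptions in which it holds, takes the logical probability m(s) as the fraction of state descriptions in that set, and defines cont(s) = 1 - m(s) and inf(s) = -log2 m(s). All names here are illustrative.

    ```python
    import math
    from itertools import product

    # Toy language: two atomic sentences P, Q give four equiprobable
    # state descriptions (truth-value assignments), each with logical
    # probability 1/4. A sentence is the set of state descriptions
    # in which it is true.
    STATES = set(product([True, False], repeat=2))  # (P, Q) valuations

    def m(sentence):
        """Logical probability: fraction of state descriptions covered."""
        return len(sentence) / len(STATES)

    def cont(sentence):
        """Content measure: cont(s) = 1 - m(s)."""
        return 1 - m(sentence)

    def inf(sentence):
        """Information measure: inf(s) = -log2 m(s)."""
        return -math.log2(m(sentence))

    P = {s for s in STATES if s[0]}
    Q = {s for s in STATES if s[1]}

    # inf is additive for inductively independent sentences:
    # m(P & Q) = m(P) * m(Q), hence inf(P & Q) = inf(P) + inf(Q).
    assert inf(P & Q) == inf(P) + inf(Q)  # 2.0 == 1.0 + 1.0

    # cont is additive for content-exclusive sentences: P-or-Q and
    # not-P-or-not-Q jointly exhaust all state descriptions.
    s = P | Q
    t = (STATES - P) | (STATES - Q)
    assert s | t == STATES                      # disjunction is logically true
    assert cont(s & t) == cont(s) + cont(t)     # 0.5 == 0.25 + 0.25
    ```

    The two assertions mirror the abstract's claims: inf behaves like the customary (communication-theoretic) information measure under independence, while cont adds up exactly when the sentences' contents exclude one another.
    
    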
