Search (2 results, page 1 of 1)

  • author_ss:"Carnap, R."
  • year_i:[1950 TO 1960}
  1. Bar-Hillel, Y.; Carnap, R.: An outline of a theory of semantic information (1953) 0.01
    0.0067306077 = product of:
      0.02692243 = sum of:
        0.02692243 = weight(_text_:information in 5512) [ClassicSimilarity], result of:
          0.02692243 = score(doc=5512,freq=4.0), product of:
            0.06134496 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.034944877 = queryNorm
            0.43886948 = fieldWeight in 5512, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.125 = fieldNorm(doc=5512)
      0.25 = coord(1/4)
    
    Theme
    Information
  2. Bar-Hillel, Y.; Carnap, R.: An outline of a theory of semantic information (1952) 0.00
    0.0047031553 = product of:
      0.018812621 = sum of:
        0.018812621 = weight(_text_:information in 3369) [ClassicSimilarity], result of:
          0.018812621 = score(doc=3369,freq=20.0), product of:
            0.06134496 = queryWeight, product of:
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.034944877 = queryNorm
            0.30666938 = fieldWeight in 3369, product of:
              4.472136 = tf(freq=20.0), with freq of:
                20.0 = termFreq=20.0
              1.7554779 = idf(docFreq=20772, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3369)
      0.25 = coord(1/4)
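    The two explain trees above can be reproduced directly. This is a sketch of Lucene's ClassicSimilarity TF-IDF scoring, using only the constants printed in the output; the formulas (tf = √freq, idf = 1 + ln(maxDocs/(docFreq+1)), score = queryWeight · fieldWeight · coord) are the standard Lucene definitions.

    ```python
    import math

    # Constants copied from the explain output above.
    DOC_FREQ = 20772      # docFreq for the term "information"
    MAX_DOCS = 44218      # maxDocs in the index
    QUERY_NORM = 0.034944877
    COORD = 0.25          # coord(1/4): 1 of 4 query clauses matched

    def idf(doc_freq=DOC_FREQ, max_docs=MAX_DOCS):
        # Lucene ClassicSimilarity idf: 1 + ln(maxDocs / (docFreq + 1))
        return 1.0 + math.log(max_docs / (doc_freq + 1))

    def score(freq, field_norm):
        i = idf()                                   # ~1.7554779
        query_weight = i * QUERY_NORM               # queryWeight = idf * queryNorm
        field_weight = math.sqrt(freq) * i * field_norm  # tf * idf * fieldNorm
        return query_weight * field_weight * COORD

    # Doc 5512: freq=4,  fieldNorm=0.125      -> ~0.0067306
    # Doc 3369: freq=20, fieldNorm=0.0390625  -> ~0.0047032
    print(score(4.0, 0.125))
    print(score(20.0, 0.0390625))
    ```

    The higher raw term frequency in the 1952 record (freq=20 vs. freq=4) is more than offset by its much smaller fieldNorm (a longer field), which is why it ranks second despite matching the term five times as often.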
    
    Abstract
    In distinction to current Theory of Communication which treats amount of information as a measure of the statistical rarity of a message, a Theory of Semantic Information is outlined, in which the concept of information carried by a sentence within a given language system is treated as synonymous with the content of this sentence, normalized in a certain way, and the concept of amount of semantic information is explicated by various measures of this content, all based on logical probability functions ranging over the contents. Absolute and relative measures are distinguished, so are D-functions suitable for contexts where deductive reasoning alone is relevant and I-functions suitable for contexts where inductive reasoning is adequate. Of the two major types of amount of information investigated, the one, cont, is additive with respect to sentences whose contents are exclusive, the other, inf, with respect to sentences which are inductively independent. The latter turns out to be formally analogous to the customary information measure function. Various estimate functions of amount of information are investigated leading to generalized semantic correlates of concepts and theorems of current Communication Theory. A concept of semantic noise is tentatively defined, so are efficiency and redundancy of the conceptual framework of a language system. It is suggested that semantic information is a concept more readily applicable to psychological and other investigations than its communicational counterpart.
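    The two additivity claims in the abstract can be checked in a toy model. This sketch assumes the paper's standard explications, cont(i) = 1 − m(i) and inf(i) = −log₂ m(i), where m is a logical probability over the state descriptions of two atomic sentences, each independently true with measure 1/2.

    ```python
    import math
    from itertools import product

    # State descriptions for two atomic sentences p, q; uniform logical measure m.
    states = list(product([True, False], repeat=2))
    m_state = 1 / len(states)

    def m(sentence):
        # Logical probability: total measure of the states where sentence holds.
        return sum(m_state for s in states if sentence(*s))

    def cont(sentence):
        return 1 - m(sentence)          # content measure

    def inf(sentence):
        return -math.log2(m(sentence))  # information measure

    # inf is additive for inductively independent sentences: m(p & q) = m(p)·m(q).
    assert math.isclose(inf(lambda p, q: p and q),
                        inf(lambda p, q: p) + inf(lambda p, q: q))

    # cont is additive for content-exclusive sentences (their disjunction is
    # logically true): take i = p ∨ q and j = ¬p ∨ ¬q.
    i = lambda p, q: p or q
    j = lambda p, q: (not p) or (not q)
    assert math.isclose(cont(lambda p, q: i(p, q) and j(p, q)),
                        cont(i) + cont(j))
    ```

    The example also shows why inf is "formally analogous to the customary information measure function": with independent sentences it behaves exactly like Shannon's −log₂ p.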
    Theme
    Information