Search (5 results, page 1 of 1)

  • classification_ss:"ST 300"
  1. Koenig, G.: Das Ende des Individuums : Reise eines Philosophen in die Welt der künstlichen Intelligenz 0.02
    0.022791695 = product of:
      0.22791694 = sum of:
        0.22791694 = weight(_text_:willensfreiheit in 503) [ClassicSimilarity], result of:
          0.22791694 = score(doc=503,freq=8.0), product of:
            0.2515577 = queryWeight, product of:
              8.200379 = idf(docFreq=32, maxDocs=44218)
              0.03067635 = queryNorm
            0.9060225 = fieldWeight in 503, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              8.200379 = idf(docFreq=32, maxDocs=44218)
              0.0390625 = fieldNorm(doc=503)
      0.1 = coord(1/10)
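The tree above is standard Lucene ClassicSimilarity (TF-IDF) explain output. As a sanity check, the arithmetic for the `willensfreiheit` clause can be reproduced directly: the constants are copied from the tree, and the formulas (idf = 1 + ln(maxDocs / (docFreq + 1)), tf = √freq) are the ClassicSimilarity defaults — a minimal sketch, not the full scorer.

```python
import math

# Constants copied from the explain tree for result 1 (doc 503).
doc_freq, max_docs = 32, 44218
query_norm = 0.03067635
freq = 8.0
field_norm = 0.0390625  # decoded length norm for the matched field

idf = 1.0 + math.log(max_docs / (doc_freq + 1))  # ≈ 8.200379
query_weight = idf * query_norm                  # ≈ 0.2515577
tf = math.sqrt(freq)                             # ≈ 2.828427
field_weight = tf * idf * field_norm             # ≈ 0.9060225
weight = query_weight * field_weight             # ≈ 0.22791694
score = weight * (1 / 10)                        # coord(1/10) ≈ 0.022791695
```

Only one of the ten query clauses matched this document, so the coord factor 1/10 scales the clause weight down to the final score (displayed rounded as 0.02).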
    
    RSWK
    Künstliche Intelligenz / Individuum / Willensfreiheit
    Koenig, Gaspard [1982-] / Künstliche Intelligenz / Eigenständigkeit / Willensfreiheit
  2. Euzenat, J.; Shvaiko, P.: Ontology matching (2010) 0.01
    0.009584397 = product of:
      0.047921985 = sum of:
        0.011513635 = product of:
          0.034540903 = sum of:
            0.034540903 = weight(_text_:problem in 168) [ClassicSimilarity], result of:
              0.034540903 = score(doc=168,freq=4.0), product of:
                0.1302053 = queryWeight, product of:
                  4.244485 = idf(docFreq=1723, maxDocs=44218)
                  0.03067635 = queryNorm
                0.2652803 = fieldWeight in 168, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  4.244485 = idf(docFreq=1723, maxDocs=44218)
                  0.03125 = fieldNorm(doc=168)
          0.33333334 = coord(1/3)
        0.03640835 = product of:
          0.054612525 = sum of:
            0.03798764 = weight(_text_:2010 in 168) [ClassicSimilarity], result of:
              0.03798764 = score(doc=168,freq=3.0), product of:
                0.14672957 = queryWeight, product of:
                  4.7831497 = idf(docFreq=1005, maxDocs=44218)
                  0.03067635 = queryNorm
                0.25889558 = fieldWeight in 168, product of:
                  1.7320508 = tf(freq=3.0), with freq of:
                    3.0 = termFreq=3.0
                  4.7831497 = idf(docFreq=1005, maxDocs=44218)
                  0.03125 = fieldNorm(doc=168)
            0.016624888 = weight(_text_:22 in 168) [ClassicSimilarity], result of:
              0.016624888 = score(doc=168,freq=2.0), product of:
                0.10742335 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03067635 = queryNorm
                0.15476047 = fieldWeight in 168, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03125 = fieldNorm(doc=168)
          0.6666667 = coord(2/3)
      0.2 = coord(2/10)
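Result 2 shows how ClassicSimilarity combines several matching clauses: each nested sum is scaled by a coord factor (matching clauses / total clauses) at its own level. Taking the clause weights straight from the tree, the final score falls out as:

```python
# Clause weights copied from the explain tree for result 2 (doc 168).
problem_clause = 0.034540903 * (1 / 3)               # "problem" subtree: coord(1/3)
date_clauses = (0.03798764 + 0.016624888) * (2 / 3)  # "2010" + "22" subtree: coord(2/3)

score = (problem_clause + date_clauses) * (2 / 10)   # top-level coord(2/10) ≈ 0.009584397
```

Two of the ten top-level query clauses matched, hence coord(2/10); the displayed 0.01 is this value rounded.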
    
    Abstract
    Ontologies tend to be found everywhere. They are viewed as the silver bullet for many applications, such as database integration, peer-to-peer systems, e-commerce, semantic web services, or social networks. However, in open or evolving systems, such as the semantic web, different parties would, in general, adopt different ontologies. Thus, merely using ontologies, like using XML, does not reduce heterogeneity: it just raises heterogeneity problems to a higher level. Euzenat and Shvaiko's book is devoted to ontology matching as a solution to the semantic heterogeneity problem faced by computer systems. Ontology matching aims at finding correspondences between semantically related entities of different ontologies. These correspondences may stand for equivalence as well as for other relations, such as consequence, subsumption, or disjointness, between ontology entities. Many different matching solutions have been proposed from various viewpoints, e.g., databases, information systems, and artificial intelligence. With this book, researchers and practitioners will find a uniform framework of reference for currently available work; the techniques presented apply equally to database schema matching, catalog integration, XML schema matching, and other related problems.
    The book aims to present (i) the state of the art and (ii) the latest research results in ontology matching, giving a detailed, systematic account of matching techniques and matching systems from theoretical, practical, and application perspectives.
    Date
    20. 6.2012 19:08:22
    Year
    2010
  3. Moravec, H.P.: Mind children : der Wettlauf zwischen menschlicher und künstlicher Intelligenz (1990) 0.00
    0.0016863694 = product of:
      0.016863694 = sum of:
        0.016863694 = product of:
          0.05059108 = sum of:
            0.05059108 = weight(_text_:1990 in 5338) [ClassicSimilarity], result of:
              0.05059108 = score(doc=5338,freq=3.0), product of:
                0.13825724 = queryWeight, product of:
                  4.506965 = idf(docFreq=1325, maxDocs=44218)
                  0.03067635 = queryNorm
                0.36591995 = fieldWeight in 5338, product of:
                  1.7320508 = tf(freq=3.0), with freq of:
                    3.0 = termFreq=3.0
                  4.506965 = idf(docFreq=1325, maxDocs=44218)
                  0.046875 = fieldNorm(doc=5338)
          0.33333334 = coord(1/3)
      0.1 = coord(1/10)
    
    Year
    1990
  4. Allman, W.F.: Menschliches Denken - Künstliche Intelligenz : von der Gehirnforschung zur nächsten Computer-Generation (1990) 0.00
    0.0014053079 = product of:
      0.014053079 = sum of:
        0.014053079 = product of:
          0.042159237 = sum of:
            0.042159237 = weight(_text_:1990 in 3948) [ClassicSimilarity], result of:
              0.042159237 = score(doc=3948,freq=3.0), product of:
                0.13825724 = queryWeight, product of:
                  4.506965 = idf(docFreq=1325, maxDocs=44218)
                  0.03067635 = queryNorm
                0.3049333 = fieldWeight in 3948, product of:
                  1.7320508 = tf(freq=3.0), with freq of:
                    3.0 = termFreq=3.0
                  4.506965 = idf(docFreq=1325, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3948)
          0.33333334 = coord(1/3)
      0.1 = coord(1/10)
    
    Year
    1990
  5. Lenzen, M.: Künstliche Intelligenz : was sie kann & was uns erwartet (2018) 0.00
    6.927037E-4 = product of:
      0.0069270367 = sum of:
        0.0069270367 = product of:
          0.02078111 = sum of:
            0.02078111 = weight(_text_:22 in 4295) [ClassicSimilarity], result of:
              0.02078111 = score(doc=4295,freq=2.0), product of:
                0.10742335 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.03067635 = queryNorm
                0.19345059 = fieldWeight in 4295, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4295)
          0.33333334 = coord(1/3)
      0.1 = coord(1/10)
    
    Date
    18. 6.2018 19:22:02
