Search (1193 results, page 1 of 60)

  • language_ss:"e"
  • year_i:[2010 TO 2020}
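The active filters use Solr field syntax: a string facet (language_ss:"e") and an integer range facet (year_i:[2010 TO 2020}, inclusive lower bound, exclusive upper bound). As a hedged sketch only, a search with the same restrictions could be issued to a Solr backend roughly as follows; the endpoint URL, core name, and free-text query are assumptions, not taken from this page.

```python
import requests  # third-party HTTP client

# Hypothetical Solr endpoint and core name -- the backend behind this page is not shown.
SOLR_SELECT = "http://localhost:8983/solr/literature/select"

params = {
    "q": "knowledge organization",      # free-text query, assumed for illustration
    "fq": ['language_ss:"e"',           # filter query mirroring the language facet
           "year_i:[2010 TO 2020}"],    # range facet: 2010 inclusive, 2020 exclusive
    "rows": 20,                         # hits per page (this page lists 20 of 1193)
    "start": 0,                         # offset for page 1
    "wt": "json",
}

resp = requests.get(SOLR_SELECT, params=params, timeout=10)
print(resp.json()["response"]["numFound"])  # total hit count, e.g. 1193
```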
  1. Kleineberg, M.: Context analysis and context indexing : formal pragmatics in knowledge organization (2014) 0.53
    Score 0.5278 = coord(5/10) × 1.0556, the sum of five term contributions for the matched terms "3a" and "2f"; each contribution is queryWeight × fieldWeight = (idf × queryNorm) × (tf × idf × fieldNorm) = (8.478 × 0.0307) × (1.414 × 8.478 × 0.078125) ≈ 0.2436, with the "3a" contribution further scaled by coord(1/3).
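A minimal sketch reproducing the Lucene ClassicSimilarity (TF-IDF) arithmetic in the breakdown above. The constants (docFreq=24, maxDocs=44218, queryNorm=0.03067635, fieldNorm=0.078125, freq=2) come from this result's scoring output; the helper names are illustrative only.

```python
import math

def idf(doc_freq: int, max_docs: int) -> float:
    # ClassicSimilarity inverse document frequency: 1 + ln(maxDocs / (docFreq + 1))
    return 1.0 + math.log(max_docs / (doc_freq + 1))

def tf(freq: float) -> float:
    # ClassicSimilarity term frequency: sqrt(freq)
    return math.sqrt(freq)

def term_contribution(freq, doc_freq, max_docs, query_norm, field_norm):
    query_weight = idf(doc_freq, max_docs) * query_norm             # query-side weight
    field_weight = tf(freq) * idf(doc_freq, max_docs) * field_norm  # document-side weight
    return query_weight * field_weight

w = term_contribution(freq=2.0, doc_freq=24, max_docs=44218,
                      query_norm=0.03067635, field_norm=0.078125)
print(round(w, 5))                      # ~0.24361, one "2f"/"3a" term contribution
print(round(0.5 * (4 * w + w / 3), 4))  # ~0.5278, coord(5/10) times the summed contributions
```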
    
    Source
    http://digbib.ubka.uni-karlsruhe.de/volltexte/documents/3131107
  2. Farazi, M.: Faceted lightweight ontologies : a formalization and some experiments (2010) 0.33
    
    Content
    PhD Dissertation at International Doctorate School in Information and Communication Technology. Cf.: https://core.ac.uk/download/pdf/150083013.pdf.
    Year
    2010
  3. Zeng, Q.; Yu, M.; Yu, W.; Xiong, J.; Shi, Y.; Jiang, M.: Faceted hierarchy : a new graph type to organize scientific concepts and a construction method (2019) 0.32
    
    Content
    Cf.: https://aclanthology.org/D19-5317.pdf.
  4. Huo, W.: Automatic multi-word term extraction and its application to Web-page summarization (2012) 0.30
    
    Content
    A Thesis presented to The University of Guelph in partial fulfilment of requirements for the degree of Master of Science in Computer Science. Cf.: http://www.inf.ufrgs.br/~ceramisch/download_files/publications/2009/p01.pdf.
    Date
    10. 1.2013 19:22:47
  5. Xiong, C.: Knowledge based text representations for information retrieval (2016) 0.29
    
    Content
    Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Language and Information Technologies. Cf.: https://www.cs.cmu.edu/~cx/papers/knowledge_based_text_representation.pdf.
  6. Koch, C.: Consciousness : confessions of a romantic reductionist (2012) 0.24
    
    Content
    In which I introduce the ancient mind-body problem, explain why I am on a quest to use reason and empirical inquiry to solve it, acquaint you with Francis Crick, explain how he relates to this quest, make a confession, and end on a sad note -- In which I write about the wellsprings of my inner conflict between religion and reason, why I grew up wanting to be a scientist, why I wear a lapel pin of Professor Calculus, and how I acquired a second mentor late in life -- In which I explain why consciousness challenges the scientific view of the world, how consciousness can be investigated empirically with both feet firmly planted on the ground, why animals share consciousness with humans, and why self-consciousness is not as important as many people think it is -- In which you hear tales of scientist-magicians that make you look but not see, how they track the footprints of consciousness by peering into your skull, why you don't see with your eyes, and why attention and consciousness are not the same -- In which you learn from neurologists and neurosurgeons that some neurons care a great deal about celebrities, that cutting the cerebral cortex in two does not reduce consciousness by half, that color is leached from the world by the loss of a small cortical region, and that the destruction of a sugar cube-sized chunk of brain stem or thalamic tissue leaves you undead -- In which I defend two propositions that my younger self found nonsense--you are unaware of most of the things that go on in your head, and zombie agents control much of your life, even though you confidently believe that you are in charge -- In which I throw caution to the wind, bring up free will, Der ring des Nibelungen, and what physics says about determinism, explain the impoverished ability of your mind to choose, show that your will lags behind your brain's decision, and that freedom is just another word for feeling -- In which I argue that consciousness is a fundamental property of complex things, rhapsodize about integrated information theory, how it explains many puzzling facts about consciousness and provides a blueprint for building sentient machines -- In which I outline an electromagnetic gadget to measure consciousness, describe efforts to harness the power of genetic engineering to track consciousness in mice, and find myself building cortical observatories -- In which I muse about final matters considered off-limits to polite scientific discourse: to wit, the relationship between science and religion, the existence of God, whether this God can intervene in the universe, the death of my mentor, and my recent tribulations.
    Footnote
    Review in: The New York Review of Books, 10.01.2013 (J. Searle): "The problem of consciousness remains with us. What exactly is it and why is it still with us? The single most important question is: How exactly do neurobiological processes in the brain cause human and animal consciousness? Related problems are: How exactly is consciousness realized in the brain? That is, where is it and how does it exist in the brain? Also, how does it function causally in our behavior? To answer these questions we have to ask: What is it? Without attempting an elaborate definition, we can say the central feature of consciousness is that for any conscious state there is something that it feels like to be in that state, some qualitative character to the state. For example, the qualitative character of drinking beer is different from that of listening to music or thinking about your income tax. This qualitative character is subjective in that it only exists as experienced by a human or animal subject. It has a subjective or first-person existence (or "ontology"), unlike mountains, molecules, and tectonic plates that have an objective or third-person existence. Furthermore, qualitative subjectivity always comes to us as part of a unified conscious field. At any moment you do not just experience the sound of the music and the taste of the beer, but you have both as part of a single, unified conscious field, a subjective awareness of the total conscious experience. So the feature we are trying to explain is qualitative, unified subjectivity.
    RSWK
    Bewusstsein / Willensfreiheit / Leib-Seele-Problem / Neurowissenschaftler / Erlebnisbericht 1990-2010
    Koch, Christof / Autobiographie 1990-2010
    Koch, Christof *1956-* / Bewusstsein / Willensfreiheit / Leib-Seele-Problem / Neurowissenschaften / Autobiographie
    Subject
    Bewusstsein / Willensfreiheit / Leib-Seele-Problem / Neurowissenschaftler / Erlebnisbericht 1990-2010
    Koch, Christof / Autobiographie 1990-2010
    Koch, Christof *1956-* / Bewusstsein / Willensfreiheit / Leib-Seele-Problem / Neurowissenschaften / Autobiographie
  7. Tononi, G.: Phi : a voyage from the brain to the soul (2012) 0.03
    
    RSWK
    Bewusstsein / Gehirn / Physiologie / Leib-Seele-Problem
    Subject
    Bewusstsein / Gehirn / Physiologie / Leib-Seele-Problem
  8. Friedman, A.; Smiraglia, R.P.: Nodes and arcs : concept map, semiotics, and knowledge organization (2013) 0.02
    
    Abstract
    Purpose - The purpose of the research reported here is to improve comprehension of the socially-negotiated identity of concepts in the domain of knowledge organization. Because knowledge organization as a domain has as its focus the order of concepts, both from a theoretical perspective and from an applied perspective, it is important to understand how the domain itself understands the meaning of a concept. Design/methodology/approach - The paper provides an empirical demonstration of how the domain itself understands the meaning of a concept. The paper employs content analysis to demonstrate the ways in which concepts are portrayed in KO concept maps as signs, and they are subjected to evaluative semiotic analysis as a way to understand their meaning. The frame was the entire population of formal proceedings in knowledge organization - all proceedings of the International Society for Knowledge Organization's international conferences (1990-2010) and those of the annual classification workshops of the Special Interest Group for Classification Research of the American Society for Information Science and Technology (SIG/CR). Findings - A total of 344 concept maps were analyzed. There was no discernible chronological pattern. Most concept maps were created by authors who were professors from the USA, Germany, France, or Canada. Roughly half were judged to contain semiotic content. Peirceian semiotics predominated, and tended to convey greater granularity and complexity in conceptual terminology. Nodes could be identified as anchors of conceptual clusters in the domain; the arcs were identifiable as verbal relationship indicators. Saussurian concept maps were more applied than theoretical; Peirceian concept maps had more theoretical content. Originality/value - The paper demonstrates important empirical evidence about the coherence of the domain of knowledge organization. Core values are conveyed across time through the concept maps in this population of conference papers.
    Content
    See also: Treude, L.: Das Problem der Konzeptdefinition in der Wissensorganisation: über einen missglückten Versuch der Klärung. In: LIBREAS: Library ideas. no.22, 2013, S.xx-xx.
  9. Bawden, D.: Encountering on the road to serendip? : Browsing in new information environments (2011) 0.01
    
    Abstract
    This chapter considers the continuing relevance of the ideas of browsing, serendipity, information encountering and literature discovery in the context of the information retrieval (IR) environment of 2010, though its scope extends to the ideas in the broader contexts of information seeking and information-related behaviour. It is based around a selective review of the literature since 1990 and reflection and speculation on the results. The central focus is on questions of how the concepts of browsing, serendipity and related ideas have changed in the new IR environment of the web and whether, indeed, they are still meaningful concepts.
    Pages
    S.1-22
  10. Norris, M.; Oppenheim, C.: ¬The h-index : a broad review of a new bibliometric indicator (2010) 0.01
    
    Abstract
    Purpose - This review aims to show, broadly, how the h-index has become a subject of widespread debate, how it has spawned many variants and diverse applications since first introduced in 2005, and some of the issues in its use. Design/methodology/approach - The review drew on a range of material published in 1,990 or so sources published since 2005. From these sources, a number of themes were identified and discussed, ranging from the h-index's advantages to which citation database might be selected for its calculation. Findings - The analysis shows how the h-index has quickly established itself as a major subject of interest in the field of bibliometrics. Study of the index ranges from its mathematical underpinning to a range of variants perceived to address the index's shortcomings. The review illustrates how widely the index has been applied but also how care must be taken in its application. Originality/value - The use of bibliometric indicators to measure research performance continues, with the h-index as its latest addition. The use of the h-index, its variants and the many applications to which it has been put are still at the exploratory stage. The review shows the breadth and diversity of this research and the need to verify the veracity of the h-index by more studies.
    Date
    8. 1.2011 19:22:13
    Source
    Journal of documentation. 66(2010) no.5, S.681-705
    Year
    2010
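The abstract of the preceding entry (Norris & Oppenheim) surveys the h-index. For orientation, the indicator itself is simple: a set of publications has index h if h of them have at least h citations each. A minimal sketch with invented citation counts:

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

# Invented example: seven papers with these citation counts.
print(h_index([12, 9, 7, 5, 4, 1, 0]))  # -> 4 (four papers cited at least 4 times)
```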
  11. Hudon, M.: Teaching Classification, 1990-2010 (2010) 0.01
    
    Source
    Cataloging and classification quarterly. 48(2010) no.1, S.64-82
    Year
    2010
  12. Karpuk, S.: Cataloging seventeenth- and eighteenth-century German dissertations : guidelines and observations (2010) 0.01
    
    Abstract
    The author provides historical background useful in understanding the title pages of seventeenth- and eighteenth-century German dissertations. Images of title pages are included, with details of bibliographic description, and Machine Readable Cataloging (MARC) coding, as well as links to examples of catalog records in the Yale Law Library catalog, MORRIS. This article also includes comments on Anglo-American Cataloguing Rules, Second Edition (AACR2) Rule 21.27 regarding the problem of authorship in early dissertations.
    Source
    Cataloging and classification quarterly. 48(2010) no.4, S.330-342
    Year
    2010
  13. Euzenat, J.; Shvaiko, P.: Ontology matching (2010) 0.01
    
    Abstract
    Ontologies are viewed as the silver bullet for many applications, but in open or evolving systems, different parties can adopt different ontologies. This increases heterogeneity problems rather than reducing heterogeneity. This book proposes ontology matching as a solution to the problem of semantic heterogeneity, offering researchers and practitioners a uniform framework of reference to currently available work. The techniques presented apply to database schema matching, catalog integration, XML schema matching and more. Ontologies tend to be found everywhere. They are viewed as the silver bullet for many applications, such as database integration, peer-to-peer systems, e-commerce, semantic web services, or social networks. However, in open or evolving systems, such as the semantic web, different parties would, in general, adopt different ontologies. Thus, merely using ontologies, like using XML, does not reduce heterogeneity: it just raises heterogeneity problems to a higher level. Euzenat and Shvaiko's book is devoted to ontology matching as a solution to the semantic heterogeneity problem faced by computer systems. Ontology matching aims at finding correspondences between semantically related entities of different ontologies. These correspondences may stand for equivalence as well as other relations, such as consequence, subsumption, or disjointness, between ontology entities. Many different matching solutions have been proposed so far from various viewpoints, e.g., databases, information systems, artificial intelligence. With Ontology Matching, researchers and practitioners will find a reference book which presents currently available work in a uniform framework. In particular, the work and the techniques presented in this book can equally be applied to database schema matching, catalog integration, XML schema matching and other related problems. The objectives of the book include presenting (i) the state of the art and (ii) the latest research results in ontology matching by providing a detailed account of matching techniques and matching systems in a systematic way from theoretical, practical and application perspectives.
    Date
    20. 6.2012 19:08:22
    Year
    2010
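The preceding entry describes ontology matching as finding correspondences (equivalence and other relations) between entities of different ontologies. As a toy illustration only, not the authors' method, a string-based matcher over class labels might look like this; the labels and threshold are invented.

```python
from difflib import SequenceMatcher

def label_similarity(a: str, b: str) -> float:
    """Normalized edit-based similarity between two entity labels."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match(onto_a: list[str], onto_b: list[str], threshold: float = 0.8):
    """Return candidate equivalence correspondences above a similarity threshold."""
    return [(a, b, round(label_similarity(a, b), 2))
            for a in onto_a for b in onto_b
            if label_similarity(a, b) >= threshold]

# Invented class labels from two small ontologies.
print(match(["Journal Article", "Conference Paper", "Author"],
            ["JournalArticle", "Proceedings Paper", "Writer"]))
```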
  14. Khalifa, M.; Shen, K.N.: Applying semantic networks to hypertext design : effects on knowledge structure acquisition and problem solving (2010) 0.01
    
    Abstract
    One of the key objectives of knowledge management is to transfer knowledge quickly and efficiently from experts to novices, who are different in terms of the structural properties of domain knowledge or knowledge structure. This study applies experts' semantic networks to hypertext navigation design and examines the potential of the resulting design, i.e., semantic hypertext, in facilitating knowledge structure acquisition and problem solving. Moreover, we argue that the level of sophistication of the knowledge structure acquired by learners is an important mediator influencing the learning outcomes (in this case, problem solving). The research model was empirically tested with a situated experiment involving 80 business professionals. The results of the empirical study provided strong support for the effectiveness of semantic hypertext in transferring knowledge structure and reported a significant full mediating effect of knowledge structure sophistication. Both theoretical and practical implications of this research are discussed.
    Source
    Journal of the American Society for Information Science and Technology. 61(2010) no.8, S.1673-1685
    Year
    2010
  15. Hwang, S.-Y.; Yang, W.-S.; Ting, K.-D.: Automatic index construction for multimedia digital libraries (2010) 0.01
    
    Abstract
    Indexing remains one of the most popular tools provided by digital libraries to help users identify and understand the characteristics of the information they need. Despite extensive studies of the problem of automatic index construction for text-based digital libraries, the construction of multimedia digital libraries continues to represent a challenge, because multimedia objects usually lack sufficient text information to ensure reliable index learning. This research attempts to tackle the problem of automatic index construction for multimedia objects by employing Web usage logs and limited keywords pertaining to multimedia objects. The tests of two proposed algorithms use two different data sets with different amounts of textual information. Web usage logs offer precious information for building indexes of multimedia digital libraries with limited textual information. The proposed methods generally yield better indexes, especially for the artwork data set.
    Source
    Information processing and management. 46(2010) no.3, S.295-307
    Year
    2010
  16. Halpin, H.; Hayes, P.J.: When owl:sameAs isn't the same : an analysis of identity links on the Semantic Web (2010) 0.01
    
    Abstract
    In Linked Data, the use of owl:sameAs is ubiquitous in 'inter-linking' data-sets. However, there is a lurking suspicion within the Linked Data community that this use of owl:sameAs may be somehow incorrect, in particular with regards to its interactions with inference. In fact, owl:sameAs can be considered just one type of 'identity link', a link that declares two items to be identical in some fashion. After reviewing the definitions and history of the problem of identity in philosophy and knowledge representation, we outline four alternative readings of owl:sameAs, showing with examples how it is being (ab)used on the Web of data. Then we present possible solutions to this problem by introducing alternative identity links that rely on named graphs.
    Source
    Linked Data on the Web (LDOW2010). Proceedings of the WWW2010 Workshop on Linked Data on the Web. Raleigh, USA, April 27, 2010. Edited by Christian Bizer et al
    Year
    2010
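The preceding entry analyses owl:sameAs identity links in Linked Data. A minimal sketch of asserting such a link with rdflib; the two resource URIs are invented.

```python
from rdflib import Graph, URIRef
from rdflib.namespace import OWL

g = Graph()
# Invented URIs: two descriptions of (supposedly) the same real-world entity.
a = URIRef("http://example.org/dataset1/resource/Rome")
b = URIRef("http://example.org/dataset2/city/roma")

# owl:sameAs asserts strict identity: every property of a also holds of b.
g.add((a, OWL.sameAs, b))
print(g.serialize(format="turtle"))
```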
  17. Nguyen, S.-H.; Chowdhury, G.: Interpreting the knowledge map of digital library research (1990-2010) (2013) 0.01
    
    Abstract
    A knowledge map of digital library (DL) research shows the semantic organization of DL research topics and also the evolution of the field. The research reported in this article aims to find the core topics and subtopics of DL research in order to build a knowledge map of the DL domain. The methodology is comprised of a four-step research process, and two knowledge organization methods (classification and thesaurus building) were used. A knowledge map covering 21 core topics and 1,015 subtopics of DL research was created and provides a systematic overview of DL research during the last two decades (1990-2010). We argue that the map can work as a knowledge platform to guide, evaluate, and improve the activities of DL research, education, and practices. Moreover, it can be transformed into a DL ontology for various applications. The research methodology can be used to map any human knowledge domain; it is a novel and scientific method for producing comprehensive and systematic knowledge maps based on literary warrant.
  18. Williamson, N.J.: Paradigms and conceptual systems in knowledge organization, the Eleventh International ISKO Conference, Rome, 2010 (2013) 0.01
    
    Abstract
    The eleventh International ISKO Conference on "Paradigms and Conceptual Systems in Knowledge Organization" was held in Rome, February 23-26, 2010. The proceedings were edited by Claudio Gnoli and Fulvio Mazzocchi and published by Ergon Verlag in 2010. This analysis follows the order of the text of the proceedings, an order prescribed by the abridged scheme for KO literature published in Knowledge Organization, 25, 1998, no. 4, p. 226. Some invited papers, marked with [LR], have been included and are labelled as such in the table of contents. In all, 64 papers were published.
    Date
    22. 2.2013 12:09:50
  19. Deokattey, S.; Neelameghan, A.; Kumar, V.: ¬A method for developing a domain ontology : a case study for a multidisciplinary subject (2010) 0.01
    
    Date
    22. 7.2010 19:41:16
    Source
    Knowledge organization. 37(2010) no.3, S.173-184
    Year
    2010
  20. Perugini, S.: Supporting multiple paths to objects in information hierarchies : faceted classification, faceted search, and symbolic links (2010) 0.01
    
    Source
    Information processing and management. 46(2010) no.1, S.22-43
    Year
    2010

Types

  • a 1083
  • el 79
  • m 67
  • s 23
  • x 6
  • b 4
  • i 3
  • r 3
  • n 1