Search (2232 results, page 1 of 112)

  • year_i:[2010 TO 2020}
  1. Kleineberg, M.: Context analysis and context indexing : formal pragmatics in knowledge organization (2014) 0.40
    0.3955221 = product of:
      0.7910442 = sum of:
        0.07573764 = product of:
          0.22721292 = sum of:
            0.22721292 = weight(_text_:3a in 1826) [ClassicSimilarity], result of:
              0.22721292 = score(doc=1826,freq=2.0), product of:
                0.24256827 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.028611459 = queryNorm
                0.93669677 = fieldWeight in 1826, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.078125 = fieldNorm(doc=1826)
          0.33333334 = coord(1/3)
        0.22721292 = weight(_text_:2f in 1826) [ClassicSimilarity], result of:
          0.22721292 = score(doc=1826,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.93669677 = fieldWeight in 1826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.078125 = fieldNorm(doc=1826)
        0.22721292 = weight(_text_:2f in 1826) [ClassicSimilarity], result of:
          0.22721292 = score(doc=1826,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.93669677 = fieldWeight in 1826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.078125 = fieldNorm(doc=1826)
        0.033667825 = weight(_text_:web in 1826) [ClassicSimilarity], result of:
          0.033667825 = score(doc=1826,freq=2.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.36057037 = fieldWeight in 1826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.078125 = fieldNorm(doc=1826)
        0.22721292 = weight(_text_:2f in 1826) [ClassicSimilarity], result of:
          0.22721292 = score(doc=1826,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.93669677 = fieldWeight in 1826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.078125 = fieldNorm(doc=1826)
      0.5 = coord(5/10)
    
    Source
    http://www.google.de/url?sa=t&rct=j&q=&esrc=s&source=web&cd=5&ved=0CDQQFjAE&url=http%3A%2F%2Fdigbib.ubka.uni-karlsruhe.de%2Fvolltexte%2Fdocuments%2F3131107&ei=HzFWVYvGMsiNsgGTyoFI&usg=AFQjCNE2FHUeR9oQTQlNC4TPedv4Mo3DaQ&sig2=Rlzpr7a3BLZZkqZCXXN_IA&bvm=bv.93564037,d.bGg&cad=rja
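    The relevance number after each hit is a Lucene ClassicSimilarity (TF-IDF) score, and the indented tree beneath it is the engine's "explain" output for that score. As a reading aid, the short sketch below recomputes one clause of entry 1 from the quantities printed in its tree (tf, idf, queryNorm, fieldNorm); it is a hand calculation of that arithmetic only, not the search engine's actual code. The per-entry score is the sum of such clause weights, scaled by the coord() factors shown.

      import math

      # Quantities copied from the first clause of entry 1 (term "3a"):
      freq       = 2.0          # termFreq within the matched field
      idf        = 8.478011     # idf(docFreq=24, maxDocs=44218)
      query_norm = 0.028611459  # queryNorm
      field_norm = 0.078125     # fieldNorm(doc=1826)

      # ClassicSimilarity uses tf = sqrt(freq) and idf = 1 + ln(maxDocs / (docFreq + 1)).
      tf = math.sqrt(freq)                            # 1.4142135
      assert abs(idf - (1 + math.log(44218 / 25))) < 1e-5

      query_weight = idf * query_norm                 # 0.24256827 = queryWeight
      field_weight = tf * idf * field_norm            # 0.93669677 = fieldWeight in 1826
      print(query_weight * field_weight)              # ~0.22721, matching the clause weight above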
  2. Verwer, K.: Freiheit und Verantwortung bei Hans Jonas (2011) 0.36
    0.36354065 = product of:
      0.9088516 = sum of:
        0.09088516 = product of:
          0.2726555 = sum of:
            0.2726555 = weight(_text_:3a in 973) [ClassicSimilarity], result of:
              0.2726555 = score(doc=973,freq=2.0), product of:
                0.24256827 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.028611459 = queryNorm
                1.1240361 = fieldWeight in 973, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.09375 = fieldNorm(doc=973)
          0.33333334 = coord(1/3)
        0.2726555 = weight(_text_:2f in 973) [ClassicSimilarity], result of:
          0.2726555 = score(doc=973,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            1.1240361 = fieldWeight in 973, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.09375 = fieldNorm(doc=973)
        0.2726555 = weight(_text_:2f in 973) [ClassicSimilarity], result of:
          0.2726555 = score(doc=973,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            1.1240361 = fieldWeight in 973, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.09375 = fieldNorm(doc=973)
        0.2726555 = weight(_text_:2f in 973) [ClassicSimilarity], result of:
          0.2726555 = score(doc=973,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            1.1240361 = fieldWeight in 973, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.09375 = fieldNorm(doc=973)
      0.4 = coord(4/10)
    
    Content
     Cf.: http%3A%2F%2Fcreativechoice.org%2Fdoc%2FHansJonas.pdf&usg=AOvVaw1TM3teaYKgABL5H9yoIifA&opi=89978449.
  3. Suchenwirth, L.: Sacherschliessung in Zeiten von Corona : neue Herausforderungen und Chancen (2019) 0.25
    0.2495329 = product of:
      0.6238322 = sum of:
        0.04544258 = product of:
          0.13632774 = sum of:
            0.13632774 = weight(_text_:3a in 484) [ClassicSimilarity], result of:
              0.13632774 = score(doc=484,freq=2.0), product of:
                0.24256827 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.028611459 = queryNorm
                0.56201804 = fieldWeight in 484, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=484)
          0.33333334 = coord(1/3)
        0.19279654 = weight(_text_:2f in 484) [ClassicSimilarity], result of:
          0.19279654 = score(doc=484,freq=4.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.7948135 = fieldWeight in 484, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=484)
        0.19279654 = weight(_text_:2f in 484) [ClassicSimilarity], result of:
          0.19279654 = score(doc=484,freq=4.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.7948135 = fieldWeight in 484, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=484)
        0.19279654 = weight(_text_:2f in 484) [ClassicSimilarity], result of:
          0.19279654 = score(doc=484,freq=4.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.7948135 = fieldWeight in 484, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=484)
      0.4 = coord(4/10)
    
    Footnote
    https%3A%2F%2Fjournals.univie.ac.at%2Findex.php%2Fvoebm%2Farticle%2Fdownload%2F5332%2F5271%2F&usg=AOvVaw2yQdFGHlmOwVls7ANCpTii.
  4. Herb, U.; Beucke, D.: Die Zukunft der Impact-Messung : Social Media, Nutzung und Zitate im World Wide Web (2013) 0.23
    0.2288981 = product of:
      0.57224524 = sum of:
        0.18177032 = weight(_text_:2f in 2188) [ClassicSimilarity], result of:
          0.18177032 = score(doc=2188,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.7493574 = fieldWeight in 2188, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0625 = fieldNorm(doc=2188)
        0.18177032 = weight(_text_:2f in 2188) [ClassicSimilarity], result of:
          0.18177032 = score(doc=2188,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.7493574 = fieldWeight in 2188, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0625 = fieldNorm(doc=2188)
        0.026934259 = weight(_text_:web in 2188) [ClassicSimilarity], result of:
          0.026934259 = score(doc=2188,freq=2.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.2884563 = fieldWeight in 2188, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.0625 = fieldNorm(doc=2188)
        0.18177032 = weight(_text_:2f in 2188) [ClassicSimilarity], result of:
          0.18177032 = score(doc=2188,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.7493574 = fieldWeight in 2188, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0625 = fieldNorm(doc=2188)
      0.4 = coord(4/10)
    
    Content
     Cf.: https://www.leibniz-science20.de%2Fforschung%2Fprojekte%2Faltmetrics-in-verschiedenen-wissenschaftsdisziplinen%2F&ei=2jTgVaaXGcK4Udj1qdgB&usg=AFQjCNFOPdONj4RKBDf9YDJOLuz3lkGYlg&sig2=5YI3KWIGxBmk5_kv0P_8iQ.
  5. Huo, W.: Automatic multi-word term extraction and its application to Web-page summarization (2012) 0.23
    0.22856878 = product of:
      0.45713755 = sum of:
        0.13632774 = weight(_text_:2f in 563) [ClassicSimilarity], result of:
          0.13632774 = score(doc=563,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.56201804 = fieldWeight in 563, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=563)
        0.13632774 = weight(_text_:2f in 563) [ClassicSimilarity], result of:
          0.13632774 = score(doc=563,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.56201804 = fieldWeight in 563, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=563)
        0.040401388 = weight(_text_:web in 563) [ClassicSimilarity], result of:
          0.040401388 = score(doc=563,freq=8.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.43268442 = fieldWeight in 563, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.046875 = fieldNorm(doc=563)
        0.13632774 = weight(_text_:2f in 563) [ClassicSimilarity], result of:
          0.13632774 = score(doc=563,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.56201804 = fieldWeight in 563, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=563)
        0.0077529154 = product of:
          0.023258746 = sum of:
            0.023258746 = weight(_text_:22 in 563) [ClassicSimilarity], result of:
              0.023258746 = score(doc=563,freq=2.0), product of:
                0.10019246 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.028611459 = queryNorm
                0.23214069 = fieldWeight in 563, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=563)
          0.33333334 = coord(1/3)
      0.5 = coord(5/10)
    
    Abstract
    In this thesis we propose three new word association measures for multi-word term extraction. We combine these association measures with LocalMaxs algorithm in our extraction model and compare the results of different multi-word term extraction methods. Our approach is language and domain independent and requires no training data. It can be applied to such tasks as text summarization, information retrieval, and document classification. We further explore the potential of using multi-word terms as an effective representation for general web-page summarization. We extract multi-word terms from human written summaries in a large collection of web-pages, and generate the summaries by aligning document words with these multi-word terms. Our system applies machine translation technology to learn the aligning process from a training set and focuses on selecting high quality multi-word terms from human written summaries to generate suitable results for web-page summarization.
    Content
     A thesis presented to The University of Guelph in partial fulfilment of the requirements for the degree of Master of Science in Computer Science. Cf.: http://www.inf.ufrgs.br%2F~ceramisch%2Fdownload_files%2Fpublications%2F2009%2Fp01.pdf.
    Date
    10. 1.2013 19:22:47
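    The thesis abstract above combines three word-association measures with the LocalMaxs algorithm to extract multi-word terms. As a loose illustration of the general idea only (scoring adjacent word pairs with plain pointwise mutual information and keeping the strongest candidates, rather than the thesis's own measures or its LocalMaxs selection step), a sketch could look like this:

      import math
      from collections import Counter

      def candidate_terms(tokens, min_count=2, top_k=10):
          """Rank adjacent word pairs by pointwise mutual information (PMI)."""
          unigrams = Counter(tokens)
          bigrams = Counter(zip(tokens, tokens[1:]))
          n = float(len(tokens))
          scored = []
          for (w1, w2), c in bigrams.items():
              if c < min_count:
                  continue
              pmi = math.log((c / n) / ((unigrams[w1] / n) * (unigrams[w2] / n)))
              scored.append((pmi, w1 + " " + w2))
          return [term for _, term in sorted(scored, reverse=True)[:top_k]]

      # Toy usage on a tiny token list; real input would be a tokenized corpus.
      tokens = "multi word term extraction finds multi word terms in web pages".split()
      print(candidate_terms(tokens, min_count=2))   # ['multi word']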
  6. Gödert, W.; Lepsky, K.: Informationelle Kompetenz : ein humanistischer Entwurf (2019) 0.21
    0.21206538 = product of:
      0.53016347 = sum of:
        0.05301635 = product of:
          0.15904905 = sum of:
            0.15904905 = weight(_text_:3a in 5955) [ClassicSimilarity], result of:
              0.15904905 = score(doc=5955,freq=2.0), product of:
                0.24256827 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.028611459 = queryNorm
                0.65568775 = fieldWeight in 5955, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5955)
          0.33333334 = coord(1/3)
        0.15904905 = weight(_text_:2f in 5955) [ClassicSimilarity], result of:
          0.15904905 = score(doc=5955,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.65568775 = fieldWeight in 5955, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5955)
        0.15904905 = weight(_text_:2f in 5955) [ClassicSimilarity], result of:
          0.15904905 = score(doc=5955,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.65568775 = fieldWeight in 5955, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5955)
        0.15904905 = weight(_text_:2f in 5955) [ClassicSimilarity], result of:
          0.15904905 = score(doc=5955,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.65568775 = fieldWeight in 5955, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0546875 = fieldNorm(doc=5955)
      0.4 = coord(4/10)
    
    Footnote
     Reviewed in: Philosophisch-ethische Rezensionen, 09.11.2019 (Jürgen Czogalla), at: https://philosophisch-ethische-rezensionen.de/rezension/Goedert1.html. In: B.I.T. online 23(2020) H.3, S.345-347 (W. Sühl-Strohmenger) [at: https%3A%2F%2Fwww.b-i-t-online.de%2Fheft%2F2020-03-rezensionen.pdf&usg=AOvVaw0iY3f_zNcvEjeZ6inHVnOK]. In: Open Password Nr. 805, 14.08.2020 (H.-C. Hobohm) [at: https://www.password-online.de/?mailpoet_router&endpoint=view_in_browser&action=view&data=WzE0MywiOGI3NjZkZmNkZjQ1IiwwLDAsMTMxLDFd].
  7. Farazi, M.: Faceted lightweight ontologies : a formalization and some experiments (2010) 0.20
    0.20392269 = product of:
      0.40784538 = sum of:
        0.03786882 = product of:
          0.11360646 = sum of:
            0.11360646 = weight(_text_:3a in 4997) [ClassicSimilarity], result of:
              0.11360646 = score(doc=4997,freq=2.0), product of:
                0.24256827 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.028611459 = queryNorm
                0.46834838 = fieldWeight in 4997, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4997)
          0.33333334 = coord(1/3)
        0.11360646 = weight(_text_:2f in 4997) [ClassicSimilarity], result of:
          0.11360646 = score(doc=4997,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.46834838 = fieldWeight in 4997, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4997)
        0.11360646 = weight(_text_:2f in 4997) [ClassicSimilarity], result of:
          0.11360646 = score(doc=4997,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.46834838 = fieldWeight in 4997, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4997)
        0.029157192 = weight(_text_:web in 4997) [ClassicSimilarity], result of:
          0.029157192 = score(doc=4997,freq=6.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.3122631 = fieldWeight in 4997, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4997)
        0.11360646 = weight(_text_:2f in 4997) [ClassicSimilarity], result of:
          0.11360646 = score(doc=4997,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.46834838 = fieldWeight in 4997, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4997)
      0.5 = coord(5/10)
    
    Abstract
    While classifications are heavily used to categorize web content, the evolution of the web foresees a more formal structure - ontology - which can serve this purpose. Ontologies are core artifacts of the Semantic Web which enable machines to use inference rules to conduct automated reasoning on data. Lightweight ontologies bridge the gap between classifications and ontologies. A lightweight ontology (LO) is an ontology representing a backbone taxonomy where the concept of the child node is more specific than the concept of the parent node. Formal lightweight ontologies can be generated from their informal ones. The key applications of formal lightweight ontologies are document classification, semantic search, and data integration. However, these applications suffer from the following problems: the disambiguation accuracy of the state of the art NLP tools used in generating formal lightweight ontologies from their informal ones; the lack of background knowledge needed for the formal lightweight ontologies; and the limitation of ontology reuse. In this dissertation, we propose a novel solution to these problems in formal lightweight ontologies; namely, faceted lightweight ontology (FLO). FLO is a lightweight ontology in which terms, present in each node label, and their concepts, are available in the background knowledge (BK), which is organized as a set of facets. A facet can be defined as a distinctive property of the groups of concepts that can help in differentiating one group from another. Background knowledge can be defined as a subset of a knowledge base, such as WordNet, and often represents a specific domain.
    Content
     PhD dissertation at the International Doctorate School in Information and Communication Technology. Cf.: https%3A%2F%2Fcore.ac.uk%2Fdownload%2Fpdf%2F150083013.pdf&usg=AOvVaw2n-qisNagpyT0lli_6QbAQ.
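    The abstract above defines a lightweight ontology as a backbone taxonomy in which each child concept is more specific than its parent, and a faceted lightweight ontology as one whose node-label terms are covered by background knowledge organized into facets. The following minimal sketch (hypothetical names and toy background knowledge, not the dissertation's formalization) is meant only to make that structure concrete:

      from dataclasses import dataclass, field

      @dataclass
      class Node:
          """Backbone taxonomy node: children denote more specific concepts."""
          label: str
          children: list = field(default_factory=list)

          def add(self, label):
              child = Node(label)
              self.children.append(child)
              return child

      # Background knowledge organized as facets: facet name -> concepts it groups.
      background_knowledge = {
          "discipline": {"computer science", "medicine"},
          "place": {"europe", "italy"},
      }

      def facets_for(label):
          """Return the facets whose concepts cover a term of the node label."""
          terms = label.lower().split()
          return [name for name, concepts in background_knowledge.items()
                  if any(t in c for c in concepts for t in terms)]

      root = Node("science")
      node = root.add("computer science")
      print(facets_for(node.label))   # ['discipline'] with this toy background knowledge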
  8. Shala, E.: Die Autonomie des Menschen und der Maschine : gegenwärtige Definitionen von Autonomie zwischen philosophischem Hintergrund und technologischer Umsetzbarkeit (2014) 0.20
    0.19776104 = product of:
      0.3955221 = sum of:
        0.03786882 = product of:
          0.11360646 = sum of:
            0.11360646 = weight(_text_:3a in 4388) [ClassicSimilarity], result of:
              0.11360646 = score(doc=4388,freq=2.0), product of:
                0.24256827 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.028611459 = queryNorm
                0.46834838 = fieldWeight in 4388, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4388)
          0.33333334 = coord(1/3)
        0.11360646 = weight(_text_:2f in 4388) [ClassicSimilarity], result of:
          0.11360646 = score(doc=4388,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.46834838 = fieldWeight in 4388, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4388)
        0.11360646 = weight(_text_:2f in 4388) [ClassicSimilarity], result of:
          0.11360646 = score(doc=4388,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.46834838 = fieldWeight in 4388, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4388)
        0.016833913 = weight(_text_:web in 4388) [ClassicSimilarity], result of:
          0.016833913 = score(doc=4388,freq=2.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.18028519 = fieldWeight in 4388, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4388)
        0.11360646 = weight(_text_:2f in 4388) [ClassicSimilarity], result of:
          0.11360646 = score(doc=4388,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.46834838 = fieldWeight in 4388, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4388)
      0.5 = coord(5/10)
    
    Footnote
     Cf.: https://www.google.de/url?sa=t&rct=j&q=&esrc=s&source=web&cd=2&cad=rja&uact=8&ved=2ahUKEwizweHljdbcAhVS16QKHXcFD9QQFjABegQICRAB&url=https%3A%2F%2Fwww.researchgate.net%2Fpublication%2F271200105_Die_Autonomie_des_Menschen_und_der_Maschine_-_gegenwartige_Definitionen_von_Autonomie_zwischen_philosophischem_Hintergrund_und_technologischer_Umsetzbarkeit_Redigierte_Version_der_Magisterarbeit_Karls&usg=AOvVaw06orrdJmFF2xbCCp_hL26q.
  9. Zeng, Q.; Yu, M.; Yu, W.; Xiong, J.; Shi, Y.; Jiang, M.: Faceted hierarchy : a new graph type to organize scientific concepts and a construction method (2019) 0.18
    0.18177032 = product of:
      0.4544258 = sum of:
        0.04544258 = product of:
          0.13632774 = sum of:
            0.13632774 = weight(_text_:3a in 400) [ClassicSimilarity], result of:
              0.13632774 = score(doc=400,freq=2.0), product of:
                0.24256827 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.028611459 = queryNorm
                0.56201804 = fieldWeight in 400, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=400)
          0.33333334 = coord(1/3)
        0.13632774 = weight(_text_:2f in 400) [ClassicSimilarity], result of:
          0.13632774 = score(doc=400,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.56201804 = fieldWeight in 400, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=400)
        0.13632774 = weight(_text_:2f in 400) [ClassicSimilarity], result of:
          0.13632774 = score(doc=400,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.56201804 = fieldWeight in 400, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=400)
        0.13632774 = weight(_text_:2f in 400) [ClassicSimilarity], result of:
          0.13632774 = score(doc=400,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.56201804 = fieldWeight in 400, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=400)
      0.4 = coord(4/10)
    
    Content
     Cf.: https%3A%2F%2Faclanthology.org%2FD19-5317.pdf&usg=AOvVaw0ZZFyq5wWTtNTvNkrvjlGA.
  10. Xiong, C.: Knowledge based text representations for information retrieval (2016) 0.17
    0.16635525 = product of:
      0.41588813 = sum of:
        0.030295055 = product of:
          0.09088516 = sum of:
            0.09088516 = weight(_text_:3a in 5820) [ClassicSimilarity], result of:
              0.09088516 = score(doc=5820,freq=2.0), product of:
                0.24256827 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.028611459 = queryNorm
                0.3746787 = fieldWeight in 5820, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.03125 = fieldNorm(doc=5820)
          0.33333334 = coord(1/3)
        0.12853102 = weight(_text_:2f in 5820) [ClassicSimilarity], result of:
          0.12853102 = score(doc=5820,freq=4.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.5298757 = fieldWeight in 5820, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=5820)
        0.12853102 = weight(_text_:2f in 5820) [ClassicSimilarity], result of:
          0.12853102 = score(doc=5820,freq=4.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.5298757 = fieldWeight in 5820, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=5820)
        0.12853102 = weight(_text_:2f in 5820) [ClassicSimilarity], result of:
          0.12853102 = score(doc=5820,freq=4.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.5298757 = fieldWeight in 5820, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=5820)
      0.4 = coord(4/10)
    
    Content
     Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Language and Information Technologies. Cf.: https%3A%2F%2Fwww.cs.cmu.edu%2F~cx%2Fpapers%2Fknowledge_based_text_representation.pdf&usg=AOvVaw0SaTSvhWLTh__Uz_HtOtl3.
  11. Piros, A.: Az ETO-jelzetek automatikus interpretálásának és elemzésének kérdései (2018) 0.15
    0.15147528 = product of:
      0.3786882 = sum of:
        0.03786882 = product of:
          0.11360646 = sum of:
            0.11360646 = weight(_text_:3a in 855) [ClassicSimilarity], result of:
              0.11360646 = score(doc=855,freq=2.0), product of:
                0.24256827 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.028611459 = queryNorm
                0.46834838 = fieldWeight in 855, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=855)
          0.33333334 = coord(1/3)
        0.11360646 = weight(_text_:2f in 855) [ClassicSimilarity], result of:
          0.11360646 = score(doc=855,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.46834838 = fieldWeight in 855, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=855)
        0.11360646 = weight(_text_:2f in 855) [ClassicSimilarity], result of:
          0.11360646 = score(doc=855,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.46834838 = fieldWeight in 855, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=855)
        0.11360646 = weight(_text_:2f in 855) [ClassicSimilarity], result of:
          0.11360646 = score(doc=855,freq=2.0), product of:
            0.24256827 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.028611459 = queryNorm
            0.46834838 = fieldWeight in 855, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=855)
      0.4 = coord(4/10)
    
    Content
     Cf. also: New automatic interpreter for complex UDC numbers. At: <https%3A%2F%2Fudcc.org%2Ffiles%2FAttilaPiros_EC_36-37_2014-2015.pdf&usg=AOvVaw3kc9CwDDCWP7aArpfjrs5b>
  12. Kaden, B.; Kindling, M.: Kommunikation und Kontext : Überlegungen zur Entwicklung virtueller Diskursräume für die Wissenschaft (2010) 0.04
    0.03955717 = product of:
      0.13185723 = sum of:
        0.09449192 = weight(_text_:kommunikation in 4271) [ClassicSimilarity], result of:
          0.09449192 = score(doc=4271,freq=4.0), product of:
            0.14706601 = queryWeight, product of:
              5.140109 = idf(docFreq=703, maxDocs=44218)
              0.028611459 = queryNorm
            0.64251363 = fieldWeight in 4271, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.140109 = idf(docFreq=703, maxDocs=44218)
              0.0625 = fieldNorm(doc=4271)
        0.026934259 = weight(_text_:web in 4271) [ClassicSimilarity], result of:
          0.026934259 = score(doc=4271,freq=2.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.2884563 = fieldWeight in 4271, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.0625 = fieldNorm(doc=4271)
        0.010431055 = product of:
          0.031293165 = sum of:
            0.031293165 = weight(_text_:29 in 4271) [ClassicSimilarity], result of:
              0.031293165 = score(doc=4271,freq=2.0), product of:
                0.10064617 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.028611459 = queryNorm
                0.31092256 = fieldWeight in 4271, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0625 = fieldNorm(doc=4271)
          0.33333334 = coord(1/3)
      0.3 = coord(3/10)
    
    Abstract
     The paper deals with communication within science, more precisely with the discursive dissemination and reception of scholarly communicative utterances in virtual communication spaces that are supported by semantic technologies. The aim of these considerations is to set out requirements for, and possibilities of, designing virtual discourse spaces for professional and scholarly communities.
    Date
    2. 2.2011 18:29:45
    Source
    Semantic web & linked data: Elemente zukünftiger Informationsinfrastrukturen ; 1. DGI-Konferenz ; 62. Jahrestagung der DGI ; Frankfurt am Main, 7. - 9. Oktober 2010 ; Proceedings / Deutsche Gesellschaft für Informationswissenschaft und Informationspraxis. Hrsg.: M. Ockenfeld
  13. Mandl, T.; Schulz, J.M.; Marholz, N.; Werner, K.: Benutzerforschung anhand von Log-Dateien : Chancen, Grenzen und aktuelle Trends (2011) 0.04
    0.03806568 = product of:
      0.19032839 = sum of:
        0.17989734 = weight(_text_:log in 4304) [ClassicSimilarity], result of:
          0.17989734 = score(doc=4304,freq=6.0), product of:
            0.18335998 = queryWeight, product of:
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.028611459 = queryNorm
            0.98111564 = fieldWeight in 4304, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.0625 = fieldNorm(doc=4304)
        0.010431055 = product of:
          0.031293165 = sum of:
            0.031293165 = weight(_text_:29 in 4304) [ClassicSimilarity], result of:
              0.031293165 = score(doc=4304,freq=2.0), product of:
                0.10064617 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.028611459 = queryNorm
                0.31092256 = fieldWeight in 4304, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0625 = fieldNorm(doc=4304)
          0.33333334 = coord(1/3)
      0.2 = coord(2/10)
    
    Abstract
     Analysing the behaviour of users of information systems is a core concern of information science. With today's technical means, collecting extensive behavioural data is easy. The article summarizes the possibilities and opportunities of analysing log files. It introduces the LogCLEF track, which for the first time allows researchers to work with the same log files and therefore to work comparatively. The data basis and some results of LogCLEF are presented.
    Source
    Information - Wissenschaft und Praxis. 62(2011) H.1, S.29-35
  14. Digital research confidential : the secrets of studying behavior online (2015) 0.04
    0.036751084 = product of:
      0.12250361 = sum of:
        0.04724596 = weight(_text_:kommunikation in 3702) [ClassicSimilarity], result of:
          0.04724596 = score(doc=3702,freq=4.0), product of:
            0.14706601 = queryWeight, product of:
              5.140109 = idf(docFreq=703, maxDocs=44218)
              0.028611459 = queryNorm
            0.32125682 = fieldWeight in 3702, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.140109 = idf(docFreq=703, maxDocs=44218)
              0.03125 = fieldNorm(doc=3702)
        0.023325754 = weight(_text_:web in 3702) [ClassicSimilarity], result of:
          0.023325754 = score(doc=3702,freq=6.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.24981049 = fieldWeight in 3702, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.03125 = fieldNorm(doc=3702)
        0.05193189 = weight(_text_:log in 3702) [ClassicSimilarity], result of:
          0.05193189 = score(doc=3702,freq=2.0), product of:
            0.18335998 = queryWeight, product of:
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.028611459 = queryNorm
            0.2832237 = fieldWeight in 3702, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.03125 = fieldNorm(doc=3702)
      0.3 = coord(3/10)
    
    Abstract
    The realm of the digital offers both new methods of research and new objects of study. Because the digital environment for scholarship is constantly evolving, researchers must sometimes improvise, change their plans, and adapt. These details are often left out of research write-ups, leaving newcomers to the field frustrated when their approaches do not work as expected. Digital Research Confidential offers scholars a chance to learn from their fellow researchers' mistakes -- and their successes. The book -- a follow-up to Eszter Hargittai's widely read Research Confidential -- presents behind-the-scenes, nuts-and-bolts stories of digital research projects, written by established and rising scholars. They discuss such challenges as archiving, Web crawling, crowdsourcing, and confidentiality. They do not shrink from specifics, describing such research hiccups as an ethnographic interview so emotionally draining that afterward the researcher retreated to a bathroom to cry, and the seemingly simple research question about Wikipedia that mushroomed into years of work on millions of data points. Digital Research Confidential will be an essential resource for scholars in every field.
    BK
    05.20 Kommunikation und Gesellschaft
    Classification
    05.20 Kommunikation und Gesellschaft
    Content
    Preface How to think about digital research / Christian Sandvig and Eszter Hargittai -- "How local is user-generated content" : a 9,000+ word essay on answering a five-word research question" : or how we learned to stop worrying (or worry less) and love the diverse challenges of our fast-moving, geographically-flavored interdisciplinary research area / Darren Gergle and Brent Hecht -- Flash mobs and the social life of public spaces : analyzing online visual data to study new forms of sociability / Virag Molnar and Aron Hsiao -- Social software as social science / Eric Gilbert and Karrie Karahalios -- Hired hands and dubious guesses : adventures in crowdsourced data collection / Aaron Shaw -- Making sense of teen life : strategies for capturing ethnographic data in a networked era / Danah Boyd -- When should we use real names in published accounts of internet research? / Amy Bruckman, Kurt Luther, and Casey Fiesler -- The art of web crawling for social science research / Michelle Shumate and Matthew Weber -- The ethnographic study of visual culture in the age of digitization / Paul Leonardi -- Read/write the digital archive: strategies for historical web research / Megan Sapnar Ankerson -- Big data, big problems, big opportunities : using internet log data to conduct social network analysis research / Brooke Foucault Welles -- Contributors -- References -- Index.
  15. Kruschwitz, U.; Lungley, D.; Albakour, M-D.; Song, D.: Deriving query suggestions for site search (2013) 0.04
    0.036563005 = product of:
      0.18281503 = sum of:
        0.023806747 = weight(_text_:web in 1085) [ClassicSimilarity], result of:
          0.023806747 = score(doc=1085,freq=4.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.25496176 = fieldWeight in 1085, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1085)
        0.15900828 = weight(_text_:log in 1085) [ClassicSimilarity], result of:
          0.15900828 = score(doc=1085,freq=12.0), product of:
            0.18335998 = queryWeight, product of:
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.028611459 = queryNorm
            0.86719185 = fieldWeight in 1085, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1085)
      0.2 = coord(2/10)
    
    Abstract
    Modern search engines have been moving away from simplistic interfaces that aimed at satisfying a user's need with a single-shot query. Interactive features are now integral parts of web search engines. However, generating good query modification suggestions remains a challenging issue. Query log analysis is one of the major strands of work in this direction. Although much research has been performed on query logs collected on the web as a whole, query log analysis to enhance search on smaller and more focused collections has attracted less attention, despite its increasing practical importance. In this article, we report on a systematic study of different query modification methods applied to a substantial query log collected on a local website that already uses an interactive search engine. We conducted experiments in which we asked users to assess the relevance of potential query modification suggestions that have been constructed using a range of log analysis methods and different baseline approaches. The experimental results demonstrate the usefulness of log analysis to extract query modification suggestions. Furthermore, our experiments demonstrate that a more fine-grained approach than grouping search requests into sessions allows for extraction of better refinement terms from query log files.
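    The abstract above derives query-modification suggestions from a local site's query log, including grouping requests into sessions and extracting refinement terms from them. A deliberately small sketch of that general workflow (simple idle-gap sessionization plus counting of newly added terms; the paper's own methods and baselines are not reproduced here):

      from collections import defaultdict

      # Toy query log entries: (user_id, timestamp_in_seconds, query_string).
      log = [
          ("u1", 0,   "library opening hours"),
          ("u1", 40,  "library opening hours weekend"),
          ("u2", 10,  "exam timetable"),
          ("u2", 700, "campus map"),   # idle gap > 300 s starts a new session
      ]

      def sessions(log, gap=300):
          """Group each user's consecutive queries into sessions split on idle gaps."""
          by_user = defaultdict(list)
          for user, ts, query in sorted(log):
              by_user[user].append((ts, query))
          for entries in by_user.values():
              current = [entries[0][1]]
              for (prev_ts, _), (ts, query) in zip(entries, entries[1:]):
                  if ts - prev_ts > gap:
                      yield current
                      current = []
                  current.append(query)
              yield current

      def refinement_terms(log):
          """Count terms users add when reformulating a query within a session."""
          counts = defaultdict(int)
          for session in sessions(log):
              for earlier, later in zip(session, session[1:]):
                  for term in set(later.split()) - set(earlier.split()):
                      counts[term] += 1
          return sorted(counts, key=counts.get, reverse=True)

      print(refinement_terms(log))   # ['weekend'] for the toy log above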
  16. Falchi, F.; Lucchese, C.; Orlando, S.; Perego, R.; Rabitti, F.: Similarity caching in large-scale image retrieval (2012) 0.03
    0.034547042 = product of:
      0.1151568 = sum of:
        0.016833913 = weight(_text_:web in 2729) [ClassicSimilarity], result of:
          0.016833913 = score(doc=2729,freq=2.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.18028519 = fieldWeight in 2729, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2729)
        0.091803476 = weight(_text_:log in 2729) [ClassicSimilarity], result of:
          0.091803476 = score(doc=2729,freq=4.0), product of:
            0.18335998 = queryWeight, product of:
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.028611459 = queryNorm
            0.5006735 = fieldWeight in 2729, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2729)
        0.00651941 = product of:
          0.019558229 = sum of:
            0.019558229 = weight(_text_:29 in 2729) [ClassicSimilarity], result of:
              0.019558229 = score(doc=2729,freq=2.0), product of:
                0.10064617 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.028611459 = queryNorm
                0.19432661 = fieldWeight in 2729, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2729)
          0.33333334 = coord(1/3)
      0.3 = coord(3/10)
    
    Abstract
     Feature-rich data, such as audio-video recordings, digital images, and results of scientific experiments, nowadays constitute the largest fraction of the massive data sets produced daily in the e-society. Content-based similarity search systems working on such data collections are rapidly growing in importance. Unfortunately, similarity search is in general very expensive and hardly scalable. In this paper we study the case of content-based image retrieval (CBIR) systems, and focus on the problem of increasing the throughput of a large-scale CBIR system that indexes a very large collection of digital images. By analyzing the query log of a real CBIR system available on the Web, we characterize the behavior of users who experience a novel search paradigm, where content-based similarity queries and text-based ones can easily be interleaved. We show that locality and self-similarity are present even in the stream of queries submitted to such a CBIR system. According to these results, we propose an effective way to exploit this locality by means of a similarity caching system, which stores recently/frequently submitted queries and their associated results. Unlike traditional caching, the proposed cache can manage not only exact hits but also approximate ones that are solved by similarity with respect to the result sets of past queries present in the cache. We evaluate the proposed solution extensively by using the real query stream recorded in the log and a collection of 100 million digital photographs. The high hit ratios and small average approximation error figures obtained demonstrate the effectiveness of the approach.
    Date
    27. 1.2016 18:30:29
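    The abstract above proposes a similarity cache for content-based image retrieval: result sets of recent or frequent queries are kept, and a new query may be answered approximately from the cached results of a sufficiently similar past query. A bare-bones illustration of that idea (Euclidean distance over toy feature vectors, a fixed threshold, and LRU eviction; these choices are assumptions, not the paper's design):

      from collections import OrderedDict
      import math

      class SimilarityCache:
          """Cache query results; serve approximate hits for nearby feature vectors."""
          def __init__(self, threshold=0.25, capacity=1000):
              self.threshold = threshold
              self.capacity = capacity
              self.entries = OrderedDict()   # query feature vector (tuple) -> result list

          def lookup(self, query):
              best, best_dist = None, float("inf")
              for cached_query in self.entries:
                  dist = math.dist(query, cached_query)
                  if dist < best_dist:
                      best, best_dist = cached_query, dist
              if best is not None and best_dist <= self.threshold:
                  self.entries.move_to_end(best)       # refresh LRU position
                  return self.entries[best]            # exact or approximate hit
              return None                              # miss: caller runs the full search

          def store(self, query, results):
              self.entries[tuple(query)] = results
              if len(self.entries) > self.capacity:
                  self.entries.popitem(last=False)     # evict least recently used entry

      cache = SimilarityCache()
      cache.store((0.1, 0.9), ["img42.jpg", "img7.jpg"])
      print(cache.lookup((0.12, 0.88)))   # approximate hit -> ['img42.jpg', 'img7.jpg']
      print(cache.lookup((0.9, 0.1)))     # miss -> None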
  17. Doran, D.; Gokhale, S.S.: A classification framework for web robots (2012) 0.03
    0.031546462 = product of:
      0.15773231 = sum of:
        0.053868517 = weight(_text_:web in 505) [ClassicSimilarity], result of:
          0.053868517 = score(doc=505,freq=8.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.5769126 = fieldWeight in 505, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.0625 = fieldNorm(doc=505)
        0.10386378 = weight(_text_:log in 505) [ClassicSimilarity], result of:
          0.10386378 = score(doc=505,freq=2.0), product of:
            0.18335998 = queryWeight, product of:
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.028611459 = queryNorm
            0.5664474 = fieldWeight in 505, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.0625 = fieldNorm(doc=505)
      0.2 = coord(2/10)
    
    Abstract
    The behavior of modern web robots varies widely when they crawl for different purposes. In this article, we present a framework to classify these web robots from two orthogonal perspectives, namely, their functionality and the types of resources they consume. Applying the classification framework to a year-long access log from the UConn SoE web server, we present trends that point to significant differences in their crawling behavior.
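    The abstract above classifies web robots along two orthogonal axes: what they crawl for (functionality) and which kinds of resources they consume. A toy sketch in that spirit, applied to access-log lines (the user-agent patterns and resource buckets are hypothetical, not the authors' framework):

      import re
      from collections import Counter

      # Matches the request path and user agent in a minimal Apache-style log line.
      LOG_LINE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[\d.]+".*"(?P<agent>[^"]*)"$')

      FUNCTION_PATTERNS = {                      # hypothetical functionality classes
          "search-engine indexer": re.compile(r"googlebot|bingbot", re.I),
          "link checker":          re.compile(r"linkchecker", re.I),
      }

      def resource_type(path):
          if path.endswith((".png", ".jpg", ".gif")):
              return "image"
          if path.endswith((".css", ".js")):
              return "layout/script"
          return "page/text"

      def classify(lines):
          """Yield (agent, functionality, resource-type histogram) per user agent."""
          per_agent = {}
          for line in lines:
              m = LOG_LINE.search(line)
              if not m:
                  continue
              agent, path = m.group("agent"), m.group("path")
              per_agent.setdefault(agent, Counter())[resource_type(path)] += 1
          for agent, histogram in per_agent.items():
              function = next((name for name, pat in FUNCTION_PATTERNS.items()
                               if pat.search(agent)), "unknown")
              yield agent, function, dict(histogram)

      sample = ['1.2.3.4 - - [01/Jan/2012] "GET /index.html HTTP/1.1" 200 512 "-" "Googlebot/2.1"']
      print(list(classify(sample)))   # [('Googlebot/2.1', 'search-engine indexer', {'page/text': 1})]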
  18. Li, C.; Sugimoto, S.: Provenance description of metadata application profiles for long-term maintenance of metadata schemas (2018) 0.03
    0.028572304 = product of:
      0.09524101 = sum of:
        0.023806747 = weight(_text_:web in 4048) [ClassicSimilarity], result of:
          0.023806747 = score(doc=4048,freq=4.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.25496176 = fieldWeight in 4048, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4048)
        0.06491486 = weight(_text_:log in 4048) [ClassicSimilarity], result of:
          0.06491486 = score(doc=4048,freq=2.0), product of:
            0.18335998 = queryWeight, product of:
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.028611459 = queryNorm
            0.3540296 = fieldWeight in 4048, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4048)
        0.00651941 = product of:
          0.019558229 = sum of:
            0.019558229 = weight(_text_:29 in 4048) [ClassicSimilarity], result of:
              0.019558229 = score(doc=4048,freq=2.0), product of:
                0.10064617 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.028611459 = queryNorm
                0.19432661 = fieldWeight in 4048, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4048)
          0.33333334 = coord(1/3)
      0.3 = coord(3/10)
    
    Abstract
     Purpose: Provenance information is crucial for consistent maintenance of metadata schemas over time. The purpose of this paper is to propose a provenance model named DSP-PROV to keep track of structural changes of metadata schemas. Design/methodology/approach: The DSP-PROV model is developed by applying the general provenance description standard PROV of the World Wide Web Consortium to the Dublin Core Application Profile. The Metadata Application Profile of the Digital Public Library of America is selected as a case study to apply the DSP-PROV model. Finally, this paper evaluates the proposed model by comparing formal provenance description in DSP-PROV with semi-formal change log description in English. Findings: Formal provenance description in the DSP-PROV model has advantages over semi-formal provenance description in English for keeping metadata schemas consistent over time. Research limitations/implications: The DSP-PROV model is applicable for keeping track of the structural changes of a metadata schema over time. Provenance description of other features of a metadata schema, such as vocabulary and encoding syntax, is not covered. Originality/value: This study proposes a simple model for provenance description of structural features of metadata schemas based on a few standards widely accepted on the Web and shows the advantage of the proposed model over conventional semi-formal provenance description.
    Date
    15. 1.2018 19:13:29
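    The abstract above applies the W3C provenance standard PROV to describe how a metadata application profile changes between versions. As a rough, hedged illustration of what such a description records (plain PROV-O terms written out by hand as N-Triples over a hypothetical example.org profile; DSP-PROV itself is the authors' model and is not reproduced here):

      RDF_TYPE = "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"
      PROV = "http://www.w3.org/ns/prov#"
      EX = "http://example.org/"   # hypothetical namespace for this illustration

      def triple(s, p, o):
          """Serialize one triple of IRIs in N-Triples syntax."""
          return "<{}> <{}> <{}> .".format(s, p, o)

      # Version 2 of a profile, derived from version 1 by a revision activity
      # that used a change-request document.
      triples = [
          triple(EX + "profile/v2", RDF_TYPE, PROV + "Entity"),
          triple(EX + "profile/v2", PROV + "wasDerivedFrom", EX + "profile/v1"),
          triple(EX + "profile/v2", PROV + "wasGeneratedBy", EX + "revision-2018-01"),
          triple(EX + "revision-2018-01", RDF_TYPE, PROV + "Activity"),
          triple(EX + "revision-2018-01", PROV + "used", EX + "change-request-42"),
      ]
      print("\n".join(triples))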
  19. Siever, C.M.: Multimodale Kommunikation im Social Web : Forschungsansätze und Analysen zu Text-Bild-Relationen (2015) 0.03
    0.028384332 = product of:
      0.14192165 = sum of:
        0.1181149 = weight(_text_:kommunikation in 4056) [ClassicSimilarity], result of:
          0.1181149 = score(doc=4056,freq=16.0), product of:
            0.14706601 = queryWeight, product of:
              5.140109 = idf(docFreq=703, maxDocs=44218)
              0.028611459 = queryNorm
            0.8031421 = fieldWeight in 4056, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              5.140109 = idf(docFreq=703, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4056)
        0.023806747 = weight(_text_:web in 4056) [ClassicSimilarity], result of:
          0.023806747 = score(doc=4056,freq=4.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.25496176 = fieldWeight in 4056, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4056)
      0.2 = coord(2/10)
    
    Abstract
     Multimodality is a typical feature of communication in the social web. This volume focuses on communication in photo communities, in particular on the two communicative practices of social tagging and of writing notes within images. For tags, semantic text-image relations are in the foreground: tags serve knowledge representation, so an adequate verbalization of the images is indispensable. Note-image relations are of interest from a pragmatic perspective: the information of a message is distributed complementarily across text and image, which is reflected in various linguistic phenomena. A diachronic comparison with postcard communication and an excursus on communication with emojis round off the book.
    RSWK
    Social Media / Multimodalität / Kommunikation / Social Tagging (DNB)
    Text / Bild / Computerunterstützte Kommunikation / Soziale Software (SBB)
    Subject
    Social Media / Multimodalität / Kommunikation / Social Tagging (DNB)
    Text / Bild / Computerunterstützte Kommunikation / Soziale Software (SBB)
  20. Calvanese, D.; Kalayci, T.E.; Montali, M.; Santoso, A.: OBDA for log extraction in process mining (2017) 0.03
    0.028318608 = product of:
      0.14159304 = sum of:
        0.029157192 = weight(_text_:web in 3931) [ClassicSimilarity], result of:
          0.029157192 = score(doc=3931,freq=6.0), product of:
            0.0933738 = queryWeight, product of:
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.028611459 = queryNorm
            0.3122631 = fieldWeight in 3931, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.2635105 = idf(docFreq=4597, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3931)
        0.11243584 = weight(_text_:log in 3931) [ClassicSimilarity], result of:
          0.11243584 = score(doc=3931,freq=6.0), product of:
            0.18335998 = queryWeight, product of:
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.028611459 = queryNorm
            0.61319727 = fieldWeight in 3931, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              6.4086204 = idf(docFreq=197, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3931)
      0.2 = coord(2/10)
    
    Abstract
    Process mining is an emerging area that synergically combines model-based and data-oriented analysis techniques to obtain useful insights on how business processes are executed within an organization. Through process mining, decision makers can discover process models from data, compare expected and actual behaviors, and enrich models with key information about their actual execution. To be applicable, process mining techniques require the input data to be explicitly structured in the form of an event log, which lists when and by whom different case objects (i.e., process instances) have been subject to the execution of tasks. Unfortunately, in many real world set-ups, such event logs are not explicitly given, but are instead implicitly represented in legacy information systems. To apply process mining in this widespread setting, there is a pressing need for techniques able to support various process stakeholders in data preparation and log extraction from legacy information systems. The purpose of this paper is to single out this challenging, open issue, and didactically introduce how techniques from intelligent data management, and in particular ontology-based data access, provide a viable solution with a solid theoretical basis.
    Series
     Lecture Notes in Computer Science; 10370 (Information Systems and Applications, incl. Internet/Web, and HCI)
    Source
    Reasoning Web: Semantic Interoperability on the Web, 13th International Summer School 2017, London, UK, July 7-11, 2017, Tutorial Lectures. Eds.: Ianni, G. et al
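    The abstract above stresses that process mining needs an explicit event log stating, for each case, which tasks were executed, when, and by whom, and that such logs usually have to be extracted from legacy information systems. A bare-bones sketch of that extraction step (flattening two hypothetical order-handling tables into event-log rows; the paper's actual route goes through ontology-based data access, which is not shown):

      from datetime import datetime

      # Hypothetical legacy tables, already loaded as lists of dicts.
      orders    = [{"order_id": 1, "created": "2017-07-07 09:00", "clerk": "ann"}]
      shipments = [{"order_id": 1, "shipped": "2017-07-09 16:30", "clerk": "bob"}]

      def to_event_log(orders, shipments):
          """Flatten the legacy tables into (case id, activity, timestamp, resource) events."""
          events = []
          for row in orders:
              events.append((row["order_id"], "create order", row["created"], row["clerk"]))
          for row in shipments:
              events.append((row["order_id"], "ship order", row["shipped"], row["clerk"]))
          # Event logs are ordered by case and time before being handed to process mining.
          return sorted(events, key=lambda e: (e[0], datetime.strptime(e[2], "%Y-%m-%d %H:%M")))

      for case, activity, ts, resource in to_event_log(orders, shipments):
          print(case, activity, ts, resource)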

Languages

  • e 1674
  • d 533
  • f 3
  • i 2
  • a 1
  • hu 1
  • pt 1
  • sp 1

Types

  • a 1889
  • el 232
  • m 203
  • s 73
  • x 45
  • r 16
  • b 5
  • p 2
  • i 1
  • z 1
