Search (25693 results, page 1 of 1285)

  • Filter: language_ss:"e"
  1. Virtuelle Welten im Internet : Tagungsband ; [Vorträge und Diskussionen der Fachkonferenz des Münchner Kreises am 21. November 2007] / [Münchner Kreis] (2008) 0.72
    0.7229941 = product of:
      1.0844911 = sum of:
        0.077089146 = weight(_text_:760 in 2926) [ClassicSimilarity], result of:
          0.077089146 = score(doc=2926,freq=4.0), product of:
            0.1486828 = queryWeight, product of:
              8.29569 = idf(docFreq=29, maxDocs=44218)
              0.017922899 = queryNorm
            0.5184806 = fieldWeight in 2926, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.29569 = idf(docFreq=29, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        0.04516537 = weight(_text_:wirtschaftswissenschaften in 2926) [ClassicSimilarity], result of:
          0.04516537 = score(doc=2926,freq=4.0), product of:
            0.11380646 = queryWeight, product of:
              6.3497796 = idf(docFreq=209, maxDocs=44218)
              0.017922899 = queryNorm
            0.39686123 = fieldWeight in 2926, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              6.3497796 = idf(docFreq=209, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        0.101089165 = weight(_text_:gewerbepolitik in 2926) [ClassicSimilarity], result of:
          0.101089165 = score(doc=2926,freq=4.0), product of:
            0.17026149 = queryWeight, product of:
              9.499662 = idf(docFreq=8, maxDocs=44218)
              0.017922899 = queryNorm
            0.5937289 = fieldWeight in 2926, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              9.499662 = idf(docFreq=8, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        0.03880532 = weight(_text_:einzelne in 2926) [ClassicSimilarity], result of:
          0.03880532 = score(doc=2926,freq=4.0), product of:
            0.10548963 = queryWeight, product of:
              5.885746 = idf(docFreq=333, maxDocs=44218)
              0.017922899 = queryNorm
            0.36785913 = fieldWeight in 2926, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.885746 = idf(docFreq=333, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        0.09885925 = weight(_text_:wirtschaftszweige in 2926) [ClassicSimilarity], result of:
          0.09885925 = score(doc=2926,freq=4.0), product of:
            0.16837312 = queryWeight, product of:
              9.394302 = idf(docFreq=9, maxDocs=44218)
              0.017922899 = queryNorm
            0.5871439 = fieldWeight in 2926, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              9.394302 = idf(docFreq=9, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        0.025190933 = product of:
          0.050381865 = sum of:
            0.050381865 = weight(_text_:industrie in 2926) [ClassicSimilarity], result of:
              0.050381865 = score(doc=2926,freq=4.0), product of:
                0.12019911 = queryWeight, product of:
                  6.7064548 = idf(docFreq=146, maxDocs=44218)
                  0.017922899 = queryNorm
                0.41915342 = fieldWeight in 2926, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  6.7064548 = idf(docFreq=146, maxDocs=44218)
                  0.03125 = fieldNorm(doc=2926)
          0.5 = coord(1/2)
        0.04460735 = product of:
          0.0892147 = sum of:
            0.0892147 = weight(_text_:bergbau in 2926) [ClassicSimilarity], result of:
              0.0892147 = score(doc=2926,freq=4.0), product of:
                0.15994929 = queryWeight, product of:
                  8.924298 = idf(docFreq=15, maxDocs=44218)
                  0.017922899 = queryNorm
                0.55776864 = fieldWeight in 2926, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  8.924298 = idf(docFreq=15, maxDocs=44218)
                  0.03125 = fieldNorm(doc=2926)
          0.5 = coord(1/2)
        0.032372482 = product of:
          0.064744964 = sum of:
            0.064744964 = weight(_text_:handel in 2926) [ClassicSimilarity], result of:
              0.064744964 = score(doc=2926,freq=4.0), product of:
                0.1362596 = queryWeight, product of:
                  7.602543 = idf(docFreq=59, maxDocs=44218)
                  0.017922899 = queryNorm
                0.47515893 = fieldWeight in 2926, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  7.602543 = idf(docFreq=59, maxDocs=44218)
                  0.03125 = fieldNorm(doc=2926)
          0.5 = coord(1/2)
        0.020229936 = product of:
          0.04045987 = sum of:
            0.04045987 = weight(_text_:dienstleistungen in 2926) [ClassicSimilarity], result of:
              0.04045987 = score(doc=2926,freq=4.0), product of:
                0.10771505 = queryWeight, product of:
                  6.009912 = idf(docFreq=294, maxDocs=44218)
                  0.017922899 = queryNorm
                0.3756195 = fieldWeight in 2926, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  6.009912 = idf(docFreq=294, maxDocs=44218)
                  0.03125 = fieldNorm(doc=2926)
          0.5 = coord(1/2)
        0.08480903 = weight(_text_:handwerk in 2926) [ClassicSimilarity], result of:
          0.08480903 = score(doc=2926,freq=4.0), product of:
            0.15594992 = queryWeight, product of:
              8.701155 = idf(docFreq=19, maxDocs=44218)
              0.017922899 = queryNorm
            0.54382217 = fieldWeight in 2926, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.701155 = idf(docFreq=19, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        0.101089165 = weight(_text_:versorgungseinrichtungen in 2926) [ClassicSimilarity], result of:
          0.101089165 = score(doc=2926,freq=4.0), product of:
            0.17026149 = queryWeight, product of:
              9.499662 = idf(docFreq=8, maxDocs=44218)
              0.017922899 = queryNorm
            0.5937289 = fieldWeight in 2926, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              9.499662 = idf(docFreq=8, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        0.08129213 = weight(_text_:gas in 2926) [ClassicSimilarity], result of:
          0.08129213 = score(doc=2926,freq=4.0), product of:
            0.15268219 = queryWeight, product of:
              8.518833 = idf(docFreq=23, maxDocs=44218)
              0.017922899 = queryNorm
            0.5324271 = fieldWeight in 2926, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.518833 = idf(docFreq=23, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        0.0747805 = weight(_text_:wasser in 2926) [ClassicSimilarity], result of:
          0.0747805 = score(doc=2926,freq=4.0), product of:
            0.14643952 = queryWeight, product of:
              8.1705265 = idf(docFreq=33, maxDocs=44218)
              0.017922899 = queryNorm
            0.5106579 = fieldWeight in 2926, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.1705265 = idf(docFreq=33, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        0.09885925 = weight(_text_:informationsgewerbe in 2926) [ClassicSimilarity], result of:
          0.09885925 = score(doc=2926,freq=4.0), product of:
            0.16837312 = queryWeight, product of:
              9.394302 = idf(docFreq=9, maxDocs=44218)
              0.017922899 = queryNorm
            0.5871439 = fieldWeight in 2926, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              9.394302 = idf(docFreq=9, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        0.03778704 = weight(_text_:post in 2926) [ClassicSimilarity], result of:
          0.03778704 = score(doc=2926,freq=4.0), product of:
            0.10409636 = queryWeight, product of:
              5.808009 = idf(docFreq=360, maxDocs=44218)
              0.017922899 = queryNorm
            0.36300057 = fieldWeight in 2926, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              5.808009 = idf(docFreq=360, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        0.02277317 = weight(_text_:neue in 2926) [ClassicSimilarity], result of:
          0.02277317 = score(doc=2926,freq=6.0), product of:
            0.07302189 = queryWeight, product of:
              4.074223 = idf(docFreq=2043, maxDocs=44218)
              0.017922899 = queryNorm
            0.31186774 = fieldWeight in 2926, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              4.074223 = idf(docFreq=2043, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        0.035093244 = weight(_text_:medien in 2926) [ClassicSimilarity], result of:
          0.035093244 = score(doc=2926,freq=8.0), product of:
            0.084356464 = queryWeight, product of:
              4.7066307 = idf(docFreq=1085, maxDocs=44218)
              0.017922899 = queryNorm
            0.4160113 = fieldWeight in 2926, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              4.7066307 = idf(docFreq=1085, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        0.052091725 = sum of:
          0.012636391 = weight(_text_:online in 2926) [ClassicSimilarity], result of:
            0.012636391 = score(doc=2926,freq=6.0), product of:
              0.05439423 = queryWeight, product of:
                3.0349014 = idf(docFreq=5778, maxDocs=44218)
                0.017922899 = queryNorm
              0.23231125 = fieldWeight in 2926, product of:
                2.4494898 = tf(freq=6.0), with freq of:
                  6.0 = termFreq=6.0
                3.0349014 = idf(docFreq=5778, maxDocs=44218)
                0.03125 = fieldNorm(doc=2926)
          0.03945533 = weight(_text_:dienste in 2926) [ClassicSimilarity], result of:
            0.03945533 = score(doc=2926,freq=4.0), product of:
              0.106369466 = queryWeight, product of:
                5.934836 = idf(docFreq=317, maxDocs=44218)
                0.017922899 = queryNorm
              0.37092724 = fieldWeight in 2926, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                5.934836 = idf(docFreq=317, maxDocs=44218)
                0.03125 = fieldNorm(doc=2926)
        0.012010567 = weight(_text_:u in 2926) [ClassicSimilarity], result of:
          0.012010567 = score(doc=2926,freq=4.0), product of:
            0.058687534 = queryWeight, product of:
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.017922899 = queryNorm
            0.20465277 = fieldWeight in 2926, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.03125 = fieldNorm(doc=2926)
        4.9643347E-4 = product of:
          0.0014893003 = sum of:
            0.0014893003 = weight(_text_:a in 2926) [ClassicSimilarity], result of:
              0.0014893003 = score(doc=2926,freq=4.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.072065435 = fieldWeight in 2926, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.03125 = fieldNorm(doc=2926)
          0.33333334 = coord(1/3)
      0.6666667 = coord(20/30)
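    The relevance breakdown above is Lucene's ClassicSimilarity (TF-IDF) explain output: each matching term contributes queryWeight * fieldWeight, where queryWeight = idf * queryNorm and fieldWeight = tf * idf * fieldNorm with tf = sqrt(termFreq), and the summed contributions are scaled by the coord factor (here 20 of 30 query clauses match, hence 0.6666667). Below is a minimal Python sketch that recomputes the first contribution (_text_:760) and the final score from the values shown above; the helper function is ours, not part of Lucene.

      import math

      def classic_similarity_term(freq, idf, query_norm, field_norm):
          # One term's contribution, mirroring the explain output above
          tf = math.sqrt(freq)                  # 2.0 = tf(freq=4.0)
          query_weight = idf * query_norm       # 0.1486828 = queryWeight
          field_weight = tf * idf * field_norm  # 0.5184806 = fieldWeight
          return query_weight * field_weight    # 0.077089146 = weight(_text_:760 ...)

      w_760 = classic_similarity_term(freq=4.0, idf=8.29569,
                                      query_norm=0.017922899, field_norm=0.03125)
      print(w_760)                  # ~0.077089146

      # Document score = (sum over matching clauses) * coord(matching/total)
      print(1.0844911 * (20 / 30))  # ~0.7229941, the 0.72 shown next to the title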
    
    Abstract
    The 3D internet as a growth field - valuable information from high-profile experts. Virtual worlds are becoming a mass phenomenon in the industrialized world. Millions of people create a second identity in Second Life, act as artificial "beings" - avatars - in virtual environments, or play online role-playing games. The appeal of virtual worlds lies in overcoming the limits set in real life. Entirely new business opportunities can arise here, and one company after another is therefore building its island in the virtual worlds. The numerous contributions in this volume provide valuable information for operating in virtual worlds yourself:
    - Boas Betzler of IBM Research, Wappingers Falls, NY, explains the requirements for the technological infrastructure,
    - Claus Nehmzow of the PA Consulting Group, London, takes you on a business trip into Second Life,
    - Daniel Michelis of the Universität der Künste, Berlin, shows how you can send your avatar off to study,
    - Dr. Wolfram Proksch of Proksch & Fritsche Rechtsanwälte, Vienna, clarifies questions of cyberspace regulation,
    - Robert Gehorsam of Forterra Systems, NY, reports on experience with employee training in virtual worlds, and
    - Jean Miller of Linden Lab, San Francisco, highlights the development opportunities she sees for Second Life.
    Countless further aspects of these fascinating virtual worlds are covered, providing answers to the many questions that arise (or should arise) in this context.
    Classification (RVK)
    AP 18420 Allgemeines / Medien- und Kommunikationswissenschaften, Kommunikationsdesign / Arten des Nachrichtenwesens, Medientechnik / Internet
    QR 760 Wirtschaftswissenschaften / Gewerbepolitik. Einzelne Wirtschaftszweige / Industrie, Bergbau, Handel, Dienstleistungen, Handwerk / Öffentliche Versorgungseinrichtungen. Elektrizität. Gas. Wasser / Informationsgewerbe (Massenmedien). Post / Neue Medien. Online-Dienste (Internet u. a.)
  2. Kleineberg, M.: Context analysis and context indexing : formal pragmatics in knowledge organization (2014) 0.15
    0.15182036 = product of:
      0.7591018 = sum of:
        0.047443863 = product of:
          0.14233159 = sum of:
            0.14233159 = weight(_text_:3a in 1826) [ClassicSimilarity], result of:
              0.14233159 = score(doc=1826,freq=2.0), product of:
                0.15195054 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.017922899 = queryNorm
                0.93669677 = fieldWeight in 1826, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.078125 = fieldNorm(doc=1826)
          0.33333334 = coord(1/3)
        0.14233159 = weight(_text_:2f in 1826) [ClassicSimilarity], result of:
          0.14233159 = score(doc=1826,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.93669677 = fieldWeight in 1826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.078125 = fieldNorm(doc=1826)
        0.14233159 = weight(_text_:2f in 1826) [ClassicSimilarity], result of:
          0.14233159 = score(doc=1826,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.93669677 = fieldWeight in 1826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.078125 = fieldNorm(doc=1826)
        0.14233159 = weight(_text_:2f in 1826) [ClassicSimilarity], result of:
          0.14233159 = score(doc=1826,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.93669677 = fieldWeight in 1826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.078125 = fieldNorm(doc=1826)
        0.14233159 = weight(_text_:2f in 1826) [ClassicSimilarity], result of:
          0.14233159 = score(doc=1826,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.93669677 = fieldWeight in 1826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.078125 = fieldNorm(doc=1826)
        0.14233159 = weight(_text_:2f in 1826) [ClassicSimilarity], result of:
          0.14233159 = score(doc=1826,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.93669677 = fieldWeight in 1826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.078125 = fieldNorm(doc=1826)
      0.2 = coord(6/30)
    
    Source
    http://www.google.de/url?sa=t&rct=j&q=&esrc=s&source=web&cd=5&ved=0CDQQFjAE&url=http%3A%2F%2Fdigbib.ubka.uni-karlsruhe.de%2Fvolltexte%2Fdocuments%2F3131107&ei=HzFWVYvGMsiNsgGTyoFI&usg=AFQjCNE2FHUeR9oQTQlNC4TPedv4Mo3DaQ&sig2=Rlzpr7a3BLZZkqZCXXN_IA&bvm=bv.93564037,d.bGg&cad=rja
  3. Li, L.; Shang, Y.; Zhang, W.: Improvement of HITS-based algorithms on Web documents 0.15
    0.14778893 = product of:
      0.6333811 = sum of:
        0.028466314 = product of:
          0.08539894 = sum of:
            0.08539894 = weight(_text_:3a in 2514) [ClassicSimilarity], result of:
              0.08539894 = score(doc=2514,freq=2.0), product of:
                0.15195054 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.017922899 = queryNorm
                0.56201804 = fieldWeight in 2514, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2514)
          0.33333334 = coord(1/3)
        0.12077234 = weight(_text_:2f in 2514) [ClassicSimilarity], result of:
          0.12077234 = score(doc=2514,freq=4.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.7948135 = fieldWeight in 2514, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=2514)
        0.12077234 = weight(_text_:2f in 2514) [ClassicSimilarity], result of:
          0.12077234 = score(doc=2514,freq=4.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.7948135 = fieldWeight in 2514, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=2514)
        0.12077234 = weight(_text_:2f in 2514) [ClassicSimilarity], result of:
          0.12077234 = score(doc=2514,freq=4.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.7948135 = fieldWeight in 2514, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=2514)
        0.12077234 = weight(_text_:2f in 2514) [ClassicSimilarity], result of:
          0.12077234 = score(doc=2514,freq=4.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.7948135 = fieldWeight in 2514, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=2514)
        0.12077234 = weight(_text_:2f in 2514) [ClassicSimilarity], result of:
          0.12077234 = score(doc=2514,freq=4.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.7948135 = fieldWeight in 2514, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=2514)
        0.0010530944 = product of:
          0.003159283 = sum of:
            0.003159283 = weight(_text_:a in 2514) [ClassicSimilarity], result of:
              0.003159283 = score(doc=2514,freq=8.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.15287387 = fieldWeight in 2514, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2514)
          0.33333334 = coord(1/3)
      0.23333333 = coord(7/30)
    
    Abstract
    In this paper, we present two ways to improve the precision of HITS-based algorithms on Web documents. First, by analyzing the limitations of current HITS-based algorithms, we propose a new weighted HITS-based method that assigns appropriate weights to in-links of root documents. Then, we combine content analysis with HITS-based algorithms and study the effects of four representative relevance scoring methods, VSM, Okapi, TLS, and CDR, using a set of broad topic queries. Our experimental results show that our weighted HITS-based method performs significantly better than Bharat's improved HITS algorithm. When we combine our weighted HITS-based method or Bharat's HITS algorithm with any of the four relevance scoring methods, the combined methods are only marginally better than our weighted HITS-based method. Among the four relevance scoring methods, there is no significant quality difference when they are combined with a HITS-based algorithm.
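    For orientation, the textbook (unweighted) HITS iteration that the paper builds on is sketched below in Python; this is the standard baseline, not the authors' weighted variant, and the tiny link graph is invented for illustration.

      def hits(links, iterations=50):
          # links maps each page to the list of pages it points to
          pages = set(links) | {q for targets in links.values() for q in targets}
          hub = {p: 1.0 for p in pages}
          auth = {p: 1.0 for p in pages}
          for _ in range(iterations):
              # authority: sum of hub scores of pages linking to p, then L2-normalize
              auth = {p: sum(hub[q] for q in pages if p in links.get(q, ())) for p in pages}
              norm = sum(v * v for v in auth.values()) ** 0.5 or 1.0
              auth = {p: v / norm for p, v in auth.items()}
              # hub: sum of authority scores of the pages p points to, then L2-normalize
              hub = {p: sum(auth[q] for q in links.get(p, ())) for p in pages}
              norm = sum(v * v for v in hub.values()) ** 0.5 or 1.0
              hub = {p: v / norm for p, v in hub.items()}
          return hub, auth

      hub, auth = hits({"p1": ["p2", "p3"], "p2": ["p3"], "p3": ["p1"]})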
    Content
    Cf.: http%3A%2F%2Fdelab.csd.auth.gr%2F~dimitris%2Fcourses%2Fir_spring06%2Fpage_rank_computing%2Fp527-li.pdf. Cf. also: http://www2002.org/CDROM/refereed/643/.
    Type
    a
  4. Popper, K.R.: Three worlds : the Tanner lecture on human values. Delivered at the University of Michigan, April 7, 1978 (1978) 0.14
    0.14206529 = product of:
      0.60885125 = sum of:
        0.03795509 = product of:
          0.113865264 = sum of:
            0.113865264 = weight(_text_:3a in 230) [ClassicSimilarity], result of:
              0.113865264 = score(doc=230,freq=2.0), product of:
                0.15195054 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.017922899 = queryNorm
                0.7493574 = fieldWeight in 230, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0625 = fieldNorm(doc=230)
          0.33333334 = coord(1/3)
        0.113865264 = weight(_text_:2f in 230) [ClassicSimilarity], result of:
          0.113865264 = score(doc=230,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.7493574 = fieldWeight in 230, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0625 = fieldNorm(doc=230)
        0.113865264 = weight(_text_:2f in 230) [ClassicSimilarity], result of:
          0.113865264 = score(doc=230,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.7493574 = fieldWeight in 230, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0625 = fieldNorm(doc=230)
        0.113865264 = weight(_text_:2f in 230) [ClassicSimilarity], result of:
          0.113865264 = score(doc=230,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.7493574 = fieldWeight in 230, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0625 = fieldNorm(doc=230)
        0.113865264 = weight(_text_:2f in 230) [ClassicSimilarity], result of:
          0.113865264 = score(doc=230,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.7493574 = fieldWeight in 230, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0625 = fieldNorm(doc=230)
        0.113865264 = weight(_text_:2f in 230) [ClassicSimilarity], result of:
          0.113865264 = score(doc=230,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.7493574 = fieldWeight in 230, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0625 = fieldNorm(doc=230)
        0.0015698604 = product of:
          0.004709581 = sum of:
            0.004709581 = weight(_text_:a in 230) [ClassicSimilarity], result of:
              0.004709581 = score(doc=230,freq=10.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.22789092 = fieldWeight in 230, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0625 = fieldNorm(doc=230)
          0.33333334 = coord(1/3)
      0.23333333 = coord(7/30)
    
    Abstract
    In this lecture I intend to challenge those who uphold a monist or even a dualist view of the universe; and I will propose, instead, a pluralist view. I will propose a view of the universe that recognizes at least three different but interacting sub-universes.
    Source
    https%3A%2F%2Ftannerlectures.utah.edu%2F_documents%2Fa-to-z%2Fp%2Fpopper80.pdf&usg=AOvVaw3f4QRTEH-OEBmoYr2J_c7H
    Type
    a
  5. Mas, S.; Marleau, Y.: Proposition of a faceted classification model to support corporate information organization and digital records management (2009) 0.13
    0.12682456 = product of:
      0.47559205 = sum of:
        0.028466314 = product of:
          0.08539894 = sum of:
            0.08539894 = weight(_text_:3a in 2918) [ClassicSimilarity], result of:
              0.08539894 = score(doc=2918,freq=2.0), product of:
                0.15195054 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.017922899 = queryNorm
                0.56201804 = fieldWeight in 2918, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2918)
          0.33333334 = coord(1/3)
        0.08539894 = weight(_text_:2f in 2918) [ClassicSimilarity], result of:
          0.08539894 = score(doc=2918,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 2918, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=2918)
        0.08539894 = weight(_text_:2f in 2918) [ClassicSimilarity], result of:
          0.08539894 = score(doc=2918,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 2918, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=2918)
        0.08539894 = weight(_text_:2f in 2918) [ClassicSimilarity], result of:
          0.08539894 = score(doc=2918,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 2918, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=2918)
        0.08539894 = weight(_text_:2f in 2918) [ClassicSimilarity], result of:
          0.08539894 = score(doc=2918,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 2918, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=2918)
        0.0073510436 = product of:
          0.014702087 = sum of:
            0.014702087 = weight(_text_:29 in 2918) [ClassicSimilarity], result of:
              0.014702087 = score(doc=2918,freq=2.0), product of:
                0.063047156 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.017922899 = queryNorm
                0.23319192 = fieldWeight in 2918, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2918)
          0.5 = coord(1/2)
        0.08539894 = weight(_text_:2f in 2918) [ClassicSimilarity], result of:
          0.08539894 = score(doc=2918,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 2918, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=2918)
        0.012779992 = product of:
          0.019169988 = sum of:
            0.0044679004 = weight(_text_:a in 2918) [ClassicSimilarity], result of:
              0.0044679004 = score(doc=2918,freq=16.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.2161963 = fieldWeight in 2918, product of:
                  4.0 = tf(freq=16.0), with freq of:
                    16.0 = termFreq=16.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2918)
            0.014702087 = weight(_text_:29 in 2918) [ClassicSimilarity], result of:
              0.014702087 = score(doc=2918,freq=2.0), product of:
                0.063047156 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.017922899 = queryNorm
                0.23319192 = fieldWeight in 2918, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2918)
          0.6666667 = coord(2/3)
      0.26666668 = coord(8/30)
    
    Abstract
    The employees of an organization often use a personal hierarchical classification scheme to organize digital documents that are stored on their own workstations. As this may make it hard for other employees to retrieve these documents, there is a risk that the organization will lose track of needed documentation. Furthermore, the inherent boundaries of such a hierarchical structure require making arbitrary decisions about which specific criteria the classification will be based on (for instance, the administrative activity or the document type, although a document can have several attributes and require classification in several classes). A faceted classification model to support corporate information organization is proposed. Partially based on Ranganathan's facets theory, this model aims not only to standardize the organization of digital documents, but also to simplify the management of a document throughout its life cycle for both individuals and organizations, while ensuring compliance with regulatory and policy requirements.
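    To make the contrast concrete, here is a small illustrative sketch (in Python) of the difference between filing a record under one hierarchical path and describing it with independent facets; the facet names and values are invented, not taken from the authors' model.

      from dataclasses import dataclass, field

      # Hierarchical filing: one arbitrary path must be chosen up front
      hierarchical = {"path": "Administration/Contracts/2009/Signed"}

      # Faceted filing: each attribute is recorded independently and can be
      # combined freely at retrieval time (closer in spirit to Ranganathan's facets)
      @dataclass
      class Record:
          activity: str          # e.g. the administrative activity
          document_type: str     # e.g. contract, report, memo
          year: int
          status: str = "draft"
          tags: list = field(default_factory=list)

      doc = Record(activity="Administration", document_type="Contract",
                   year=2009, status="Signed", tags=["procurement"])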
    Date
    29. 8.2009 21:15:48
    Footnote
    Cf.: http://ieeexplore.ieee.org/Xplore/login.jsp?reload=true&url=http%3A%2F%2Fieeexplore.ieee.org%2Fiel5%2F4755313%2F4755314%2F04755480.pdf%3Farnumber%3D4755480&authDecision=-203.
    Type
    a
  6. Vetere, G.; Lenzerini, M.: Models for semantic interoperability in service-oriented architectures (2005) 0.12
    0.1242733 = product of:
      0.53259987 = sum of:
        0.033210702 = product of:
          0.09963211 = sum of:
            0.09963211 = weight(_text_:3a in 306) [ClassicSimilarity], result of:
              0.09963211 = score(doc=306,freq=2.0), product of:
                0.15195054 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.017922899 = queryNorm
                0.65568775 = fieldWeight in 306, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=306)
          0.33333334 = coord(1/3)
        0.09963211 = weight(_text_:2f in 306) [ClassicSimilarity], result of:
          0.09963211 = score(doc=306,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.65568775 = fieldWeight in 306, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0546875 = fieldNorm(doc=306)
        0.09963211 = weight(_text_:2f in 306) [ClassicSimilarity], result of:
          0.09963211 = score(doc=306,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.65568775 = fieldWeight in 306, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0546875 = fieldNorm(doc=306)
        0.09963211 = weight(_text_:2f in 306) [ClassicSimilarity], result of:
          0.09963211 = score(doc=306,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.65568775 = fieldWeight in 306, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0546875 = fieldNorm(doc=306)
        0.09963211 = weight(_text_:2f in 306) [ClassicSimilarity], result of:
          0.09963211 = score(doc=306,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.65568775 = fieldWeight in 306, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0546875 = fieldNorm(doc=306)
        0.09963211 = weight(_text_:2f in 306) [ClassicSimilarity], result of:
          0.09963211 = score(doc=306,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.65568775 = fieldWeight in 306, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0546875 = fieldNorm(doc=306)
        0.00122861 = product of:
          0.00368583 = sum of:
            0.00368583 = weight(_text_:a in 306) [ClassicSimilarity], result of:
              0.00368583 = score(doc=306,freq=8.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.17835285 = fieldWeight in 306, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=306)
          0.33333334 = coord(1/3)
      0.23333333 = coord(7/30)
    
    Abstract
    Although service-oriented architectures go a long way toward providing interoperability in distributed, heterogeneous environments, managing semantic differences in such environments remains a challenge. We give an overview of the issue of semantic interoperability (integration), provide a semantic characterization of services, and discuss the role of ontologies. Then we analyze four basic models of semantic interoperability that differ in respect to their mapping between service descriptions and ontologies and in respect to where the evaluation of the integration logic is performed. We also provide some guidelines for selecting one of the possible interoperability models.
    Content
    Cf.: http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=5386707&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D5386707.
    Type
    a
  7. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.11
    0.1090321 = product of:
      0.46728045 = sum of:
        0.028466314 = product of:
          0.08539894 = sum of:
            0.08539894 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
              0.08539894 = score(doc=562,freq=2.0), product of:
                0.15195054 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.017922899 = queryNorm
                0.56201804 = fieldWeight in 562, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=562)
          0.33333334 = coord(1/3)
        0.08539894 = weight(_text_:2f in 562) [ClassicSimilarity], result of:
          0.08539894 = score(doc=562,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.08539894 = weight(_text_:2f in 562) [ClassicSimilarity], result of:
          0.08539894 = score(doc=562,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.08539894 = weight(_text_:2f in 562) [ClassicSimilarity], result of:
          0.08539894 = score(doc=562,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.08539894 = weight(_text_:2f in 562) [ClassicSimilarity], result of:
          0.08539894 = score(doc=562,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.08539894 = weight(_text_:2f in 562) [ClassicSimilarity], result of:
          0.08539894 = score(doc=562,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.01181941 = product of:
          0.017729115 = sum of:
            0.003159283 = weight(_text_:a in 562) [ClassicSimilarity], result of:
              0.003159283 = score(doc=562,freq=8.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.15287387 = fieldWeight in 562, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=562)
            0.014569832 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
              0.014569832 = score(doc=562,freq=2.0), product of:
                0.06276294 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.017922899 = queryNorm
                0.23214069 = fieldWeight in 562, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=562)
          0.6666667 = coord(2/3)
      0.23333333 = coord(7/30)
    
    Abstract
    Document representations for text classification are typically based on the classical Bag-Of-Words paradigm. This approach comes with deficiencies that motivate the integration of features on a higher semantic level than single words. In this paper we propose an enhancement of the classical document representation through concepts extracted from background knowledge. Boosting is used for actual classification. Experimental evaluations on two well known text corpora support our approach through consistent improvement of the results.
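    As a point of reference, a minimal boosting setup over plain bag-of-words features could look as follows (scikit-learn AdaBoost over decision stumps); the concept features drawn from background knowledge, which are the paper's actual contribution, are not reproduced here, and the two training documents are invented.

      from sklearn.ensemble import AdaBoostClassifier
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.pipeline import make_pipeline

      docs = ["virtual worlds in the internet", "boosting weak learners for text"]
      labels = [0, 1]  # toy class labels

      # Bag-of-words features + boosted decision stumps (sklearn's default weak learner)
      model = make_pipeline(TfidfVectorizer(), AdaBoostClassifier(n_estimators=50))
      model.fit(docs, labels)
      print(model.predict(["weak learners and boosting"]))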
    Content
    Cf.: http://www.google.de/url?sa=t&rct=j&q=&esrc=s&source=web&cd=1&cad=rja&ved=0CEAQFjAA&url=http%3A%2F%2Fciteseerx.ist.psu.edu%2Fviewdoc%2Fdownload%3Fdoi%3D10.1.1.91.4940%26rep%3Drep1%26type%3Dpdf&ei=dOXrUMeIDYHDtQahsIGACg&usg=AFQjCNHFWVh6gNPvnOrOS9R3rkrXCNVD-A&sig2=5I2F5evRfMnsttSgFF9g7Q&bvm=bv.1357316858,d.Yms.
    Date
    8. 1.2013 10:22:32
    Type
    a
  8. Zeng, Q.; Yu, M.; Yu, W.; Xiong, J.; Shi, Y.; Jiang, M.: Faceted hierarchy : a new graph type to organize scientific concepts and a construction method (2019) 0.11
    0.10671722 = product of:
      0.45735952 = sum of:
        0.028466314 = product of:
          0.08539894 = sum of:
            0.08539894 = weight(_text_:3a in 400) [ClassicSimilarity], result of:
              0.08539894 = score(doc=400,freq=2.0), product of:
                0.15195054 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.017922899 = queryNorm
                0.56201804 = fieldWeight in 400, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=400)
          0.33333334 = coord(1/3)
        0.08539894 = weight(_text_:2f in 400) [ClassicSimilarity], result of:
          0.08539894 = score(doc=400,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 400, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=400)
        0.08539894 = weight(_text_:2f in 400) [ClassicSimilarity], result of:
          0.08539894 = score(doc=400,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 400, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=400)
        0.08539894 = weight(_text_:2f in 400) [ClassicSimilarity], result of:
          0.08539894 = score(doc=400,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 400, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=400)
        0.08539894 = weight(_text_:2f in 400) [ClassicSimilarity], result of:
          0.08539894 = score(doc=400,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 400, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=400)
        0.08539894 = weight(_text_:2f in 400) [ClassicSimilarity], result of:
          0.08539894 = score(doc=400,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 400, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=400)
        0.0018984926 = product of:
          0.0056954776 = sum of:
            0.0056954776 = weight(_text_:a in 400) [ClassicSimilarity], result of:
              0.0056954776 = score(doc=400,freq=26.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.27559727 = fieldWeight in 400, product of:
                  5.0990195 = tf(freq=26.0), with freq of:
                    26.0 = termFreq=26.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=400)
          0.33333334 = coord(1/3)
      0.23333333 = coord(7/30)
    
    Abstract
    On a scientific concept hierarchy, a parent concept may have a few attributes, each of which has multiple values forming a group of child concepts. We call these attributes facets: classification has a few facets such as application (e.g., face recognition), model (e.g., svm, knn), and metric (e.g., precision). In this work, we aim at building faceted concept hierarchies from scientific literature. Hierarchy construction methods rely heavily on hypernym detection; however, the faceted relations are parent-to-child links, whereas the hypernym relation is a multi-hop, i.e., ancestor-to-descendant, link with a specific facet "type-of". We use information extraction techniques to find synonyms, sibling concepts, and ancestor-descendant relations from a data science corpus, and we propose a hierarchy growth algorithm to infer the parent-child links from the three types of relationships. It resolves conflicts by maintaining the acyclic structure of the hierarchy.
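    The acyclicity constraint mentioned at the end of the abstract can be illustrated with a toy sketch: accept a candidate parent-child edge only if it does not close a cycle. The growth heuristics and relation extraction of the actual method are not modeled here, and the candidate edges are invented.

      def creates_cycle(children, parent, child):
          # True if 'parent' is already reachable from 'child' (so the new edge would close a cycle)
          stack, seen = [child], set()
          while stack:
              node = stack.pop()
              if node == parent:
                  return True
              if node not in seen:
                  seen.add(node)
                  stack.extend(children.get(node, []))
          return False

      def grow_hierarchy(candidate_edges):
          # Greedily accept parent-child links while keeping the graph acyclic
          children = {}
          for parent, child in candidate_edges:
              if not creates_cycle(children, parent, child):
                  children.setdefault(parent, []).append(child)
          return children

      # The cyclic candidate ("knn" -> "classification") is rejected
      print(grow_hierarchy([("classification", "svm"), ("classification", "knn"),
                            ("knn", "classification")]))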
    Content
    Cf.: https%3A%2F%2Faclanthology.org%2FD19-5317.pdf&usg=AOvVaw0ZZFyq5wWTtNTvNkrvjlGA.
    Type
    a
  9. Noever, D.; Ciolino, M.: ¬The Turing deception (2022) 0.11
    0.106548965 = product of:
      0.45663843 = sum of:
        0.028466314 = product of:
          0.08539894 = sum of:
            0.08539894 = weight(_text_:3a in 862) [ClassicSimilarity], result of:
              0.08539894 = score(doc=862,freq=2.0), product of:
                0.15195054 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.017922899 = queryNorm
                0.56201804 = fieldWeight in 862, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=862)
          0.33333334 = coord(1/3)
        0.08539894 = weight(_text_:2f in 862) [ClassicSimilarity], result of:
          0.08539894 = score(doc=862,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 862, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=862)
        0.08539894 = weight(_text_:2f in 862) [ClassicSimilarity], result of:
          0.08539894 = score(doc=862,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 862, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=862)
        0.08539894 = weight(_text_:2f in 862) [ClassicSimilarity], result of:
          0.08539894 = score(doc=862,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 862, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=862)
        0.08539894 = weight(_text_:2f in 862) [ClassicSimilarity], result of:
          0.08539894 = score(doc=862,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 862, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=862)
        0.08539894 = weight(_text_:2f in 862) [ClassicSimilarity], result of:
          0.08539894 = score(doc=862,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 862, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=862)
        0.0011773953 = product of:
          0.003532186 = sum of:
            0.003532186 = weight(_text_:a in 862) [ClassicSimilarity], result of:
              0.003532186 = score(doc=862,freq=10.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.1709182 = fieldWeight in 862, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=862)
          0.33333334 = coord(1/3)
      0.23333333 = coord(7/30)
    
    Abstract
    This research revisits the classic Turing test and compares recent large language models such as ChatGPT for their abilities to reproduce human-level comprehension and compelling text generation. Two task challenges - summary and question answering - prompt ChatGPT to produce original content (98-99%) from a single text entry and sequential questions initially posed by Turing in 1950. We score the original and generated content against the OpenAI GPT-2 Output Detector from 2019, and establish multiple cases where the generated content proves original and undetectable (98%). The question of a machine fooling a human judge recedes in this work relative to the question of "how would one prove it?" The original contribution of the work presents a metric and simple grammatical set for understanding the writing mechanics of chatbots in evaluating their readability and statistical clarity, engagement, delivery, overall quality, and plagiarism risks. While Turing's original prose scores at least 14% below the machine-generated output, whether an algorithm displays hints of Turing's true initial thoughts (the "Lovelace 2.0" test) remains unanswerable.
    Source
    https%3A%2F%2Farxiv.org%2Fabs%2F2212.06721&usg=AOvVaw3i_9pZm9y_dQWoHi6uv0EN
    Type
    a
  10. Xiong, C.: Knowledge based text representations for information retrieval (2016) 0.10
    0.09862116 = product of:
      0.4226621 = sum of:
        0.018977545 = product of:
          0.056932632 = sum of:
            0.056932632 = weight(_text_:3a in 5820) [ClassicSimilarity], result of:
              0.056932632 = score(doc=5820,freq=2.0), product of:
                0.15195054 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.017922899 = queryNorm
                0.3746787 = fieldWeight in 5820, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.03125 = fieldNorm(doc=5820)
          0.33333334 = coord(1/3)
        0.0805149 = weight(_text_:2f in 5820) [ClassicSimilarity], result of:
          0.0805149 = score(doc=5820,freq=4.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.5298757 = fieldWeight in 5820, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=5820)
        0.0805149 = weight(_text_:2f in 5820) [ClassicSimilarity], result of:
          0.0805149 = score(doc=5820,freq=4.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.5298757 = fieldWeight in 5820, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=5820)
        0.0805149 = weight(_text_:2f in 5820) [ClassicSimilarity], result of:
          0.0805149 = score(doc=5820,freq=4.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.5298757 = fieldWeight in 5820, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=5820)
        0.0805149 = weight(_text_:2f in 5820) [ClassicSimilarity], result of:
          0.0805149 = score(doc=5820,freq=4.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.5298757 = fieldWeight in 5820, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=5820)
        0.0805149 = weight(_text_:2f in 5820) [ClassicSimilarity], result of:
          0.0805149 = score(doc=5820,freq=4.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.5298757 = fieldWeight in 5820, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=5820)
        0.0011100589 = product of:
          0.0033301765 = sum of:
            0.0033301765 = weight(_text_:a in 5820) [ClassicSimilarity], result of:
              0.0033301765 = score(doc=5820,freq=20.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.16114321 = fieldWeight in 5820, product of:
                  4.472136 = tf(freq=20.0), with freq of:
                    20.0 = termFreq=20.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.03125 = fieldNorm(doc=5820)
          0.33333334 = coord(1/3)
      0.23333333 = coord(7/30)
    
    Abstract
    The successes of information retrieval (IR) in recent decades were built upon bag-of-words representations. Effective as it is, bag-of-words is only a shallow text understanding; there is a limited amount of information for document ranking in the word space. This dissertation goes beyond words and builds knowledge based text representations, which embed the external and carefully curated information from knowledge bases, and provide richer and structured evidence for more advanced information retrieval systems. This thesis research first builds query representations with entities associated with the query. Entities' descriptions are used by query expansion techniques that enrich the query with explanation terms. Then we present a general framework that represents a query with entities that appear in the query, are retrieved by the query, or frequently show up in the top retrieved documents. A latent space model is developed to jointly learn the connections from query to entities and the ranking of documents, modeling the external evidence from knowledge bases and internal ranking features cooperatively. To further improve the quality of relevant entities, a defining factor of our query representations, we introduce learning to rank to entity search and retrieve better entities from knowledge bases. In the document representation part, this thesis research also moves one step forward with a bag-of-entities model, in which documents are represented by their automatic entity annotations, and the ranking is performed in the entity space.
    This proposal includes plans to improve the quality of relevant entities with a co-learning framework that learns from both entity labels and document labels. We also plan to develop a hybrid ranking system that combines word based and entity based representations together with their uncertainties considered. At last, we plan to enrich the text representations with connections between entities. We propose several ways to infer entity graph representations for texts, and to rank documents using their structure representations. This dissertation overcomes the limitation of word based representations with external and carefully curated information from knowledge bases. We believe this thesis research is a solid start towards the new generation of intelligent, semantic, and structured information retrieval.
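    The bag-of-entities idea sketched in this abstract can be pictured with a minimal example (an illustration only, not Xiong's actual models; the entity IDs and the overlap scoring below are invented): a document becomes the multiset of its linked entity IDs, and ranking evidence is computed in that entity space rather than in the word space.

      from collections import Counter

      def bag_of_entities(entity_annotations):
          # A document as the multiset of its (automatically linked) entity IDs.
          return Counter(entity_annotations)

      def entity_overlap(query_entities, doc_bag):
          # Toy ranking signal: frequency-weighted overlap between the query's
          # entities and the document's entity bag.
          return sum(doc_bag[e] for e in query_entities)

      # Hypothetical knowledge-base identifiers.
      doc = bag_of_entities(["E:InformationRetrieval", "E:Ontology", "E:InformationRetrieval"])
      print(entity_overlap({"E:InformationRetrieval", "E:KnowledgeBase"}, doc))  # 2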
    Content
    Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Language and Information Technologies. Vgl.: https://www.cs.cmu.edu/~cx/papers/knowledge_based_text_representation.pdf.
  11. Stojanovic, N.: Ontology-based Information Retrieval : methods and tools for cooperative query answering (2005) 0.10
    0.09611896 = product of:
      0.36044607 = sum of:
        0.018977545 = product of:
          0.056932632 = sum of:
            0.056932632 = weight(_text_:3a in 701) [ClassicSimilarity], result of:
              0.056932632 = score(doc=701,freq=2.0), product of:
                0.15195054 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.017922899 = queryNorm
                0.3746787 = fieldWeight in 701, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.03125 = fieldNorm(doc=701)
          0.33333334 = coord(1/3)
        0.055316057 = weight(_text_:wirtschaftswissenschaften in 701) [ClassicSimilarity], result of:
          0.055316057 = score(doc=701,freq=6.0), product of:
            0.11380646 = queryWeight, product of:
              6.3497796 = idf(docFreq=209, maxDocs=44218)
              0.017922899 = queryNorm
            0.48605376 = fieldWeight in 701, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              6.3497796 = idf(docFreq=209, maxDocs=44218)
              0.03125 = fieldNorm(doc=701)
        0.056932632 = weight(_text_:2f in 701) [ClassicSimilarity], result of:
          0.056932632 = score(doc=701,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.3746787 = fieldWeight in 701, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=701)
        0.056932632 = weight(_text_:2f in 701) [ClassicSimilarity], result of:
          0.056932632 = score(doc=701,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.3746787 = fieldWeight in 701, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=701)
        0.056932632 = weight(_text_:2f in 701) [ClassicSimilarity], result of:
          0.056932632 = score(doc=701,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.3746787 = fieldWeight in 701, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=701)
        0.056932632 = weight(_text_:2f in 701) [ClassicSimilarity], result of:
          0.056932632 = score(doc=701,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.3746787 = fieldWeight in 701, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=701)
        0.056932632 = weight(_text_:2f in 701) [ClassicSimilarity], result of:
          0.056932632 = score(doc=701,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.3746787 = fieldWeight in 701, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=701)
        0.0014893002 = product of:
          0.0044679004 = sum of:
            0.0044679004 = weight(_text_:a in 701) [ClassicSimilarity], result of:
              0.0044679004 = score(doc=701,freq=36.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.2161963 = fieldWeight in 701, product of:
                  6.0 = tf(freq=36.0), with freq of:
                    36.0 = termFreq=36.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.03125 = fieldNorm(doc=701)
          0.33333334 = coord(1/3)
      0.26666668 = coord(8/30)
    
    Abstract
    With the explosion of possibilities for ubiquitous content production, the information overload problem reaches a level of complexity that can no longer be managed by traditional modelling approaches. Due to their purely syntactic nature, traditional information retrieval approaches did not succeed in treating content itself (i.e. its meaning, and not its representation). This leads to a very low usefulness of the results of a retrieval process for a user's task at hand. In the last ten years ontologies have emerged from an interesting conceptualisation paradigm to a very promising (semantic) modelling technology, especially in the context of the Semantic Web. From the information retrieval point of view, ontologies enable a machine-understandable form of content description, such that the retrieval process can be driven by the meaning of the content. However, the very ambiguous nature of the retrieval process, in which a user, due to unfamiliarity with the underlying repository and/or query syntax, just approximates his information need in a query, implies a necessity to include the user in the retrieval process more actively in order to close the gap between the meaning of the content and the meaning of a user's query (i.e. his information need). This thesis lays the foundation for such an ontology-based interactive retrieval process, in which the retrieval system interacts with a user in order to conceptually interpret the meaning of his query, whereas the underlying domain ontology drives the conceptualisation process. In that way the retrieval process evolves from a query evaluation process into a highly interactive cooperation between a user and the retrieval system, in which the system tries to anticipate the user's information need and to deliver the relevant content proactively. Moreover, the notion of content relevance for a user's query evolves from a content-dependent artefact to a multidimensional, context-dependent structure, strongly influenced by the user's preferences. This cooperation process is realized as the so-called Librarian Agent Query Refinement Process. In order to clarify the impact of an ontology on the retrieval process (regarding its complexity and quality), a set of methods and tools for different levels of content and query formalisation is developed, ranging from pure ontology-based inferencing to keyword-based querying in which semantics automatically emerges from the results. Our evaluation studies have shown that the possibilities to conceptualize a user's information need in the right manner and to interpret the retrieval results accordingly are key issues for realizing much more meaningful information retrieval systems.
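    The general idea of ontology-driven query interpretation can be illustrated with a minimal sketch (the taxonomy, terms and expansion strategy below are invented and are not the Librarian Agent Query Refinement Process itself): the user's query term is interpreted against a domain taxonomy and expanded with its subconcepts before retrieval.

      # Toy ontology-driven query expansion (illustration only).
      TAXONOMY = {
          "vehicle": ["car", "bicycle"],
          "car": ["electric car"],
      }

      def expand(term, taxonomy):
          # Return the term plus all of its (transitive) subconcepts.
          terms = [term]
          for child in taxonomy.get(term, []):
              terms.extend(expand(child, taxonomy))
          return terms

      print(expand("vehicle", TAXONOMY))  # ['vehicle', 'car', 'electric car', 'bicycle']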
    Content
    Vgl.: http://digbib.ubka.uni-karlsruhe.de/volltexte/documents/1627.
    Footnote
    Zur Erlangung des akademischen Grades eines Doktors der Wirtschaftswissenschaften (Dr. rer. pol.) von der Fakultaet fuer Wirtschaftswissenschaften der Universitaet Fridericiana zu Karlsruhe genehmigte Dissertation.
    Imprint
    Karlsruhe : Fakultaet fuer Wirtschaftswissenschaften der Universitaet Fridericiana zu Karlsruhe
  12. Farazi, M.: Faceted lightweight ontologies : a formalization and some experiments (2010) 0.09
    0.08891655 = product of:
      0.3810709 = sum of:
        0.023721932 = product of:
          0.07116579 = sum of:
            0.07116579 = weight(_text_:3a in 4997) [ClassicSimilarity], result of:
              0.07116579 = score(doc=4997,freq=2.0), product of:
                0.15195054 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.017922899 = queryNorm
                0.46834838 = fieldWeight in 4997, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4997)
          0.33333334 = coord(1/3)
        0.07116579 = weight(_text_:2f in 4997) [ClassicSimilarity], result of:
          0.07116579 = score(doc=4997,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.46834838 = fieldWeight in 4997, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4997)
        0.07116579 = weight(_text_:2f in 4997) [ClassicSimilarity], result of:
          0.07116579 = score(doc=4997,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.46834838 = fieldWeight in 4997, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4997)
        0.07116579 = weight(_text_:2f in 4997) [ClassicSimilarity], result of:
          0.07116579 = score(doc=4997,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.46834838 = fieldWeight in 4997, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4997)
        0.07116579 = weight(_text_:2f in 4997) [ClassicSimilarity], result of:
          0.07116579 = score(doc=4997,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.46834838 = fieldWeight in 4997, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4997)
        0.07116579 = weight(_text_:2f in 4997) [ClassicSimilarity], result of:
          0.07116579 = score(doc=4997,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.46834838 = fieldWeight in 4997, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4997)
        0.0015200109 = product of:
          0.0045600324 = sum of:
            0.0045600324 = weight(_text_:a in 4997) [ClassicSimilarity], result of:
              0.0045600324 = score(doc=4997,freq=24.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.22065444 = fieldWeight in 4997, product of:
                  4.8989797 = tf(freq=24.0), with freq of:
                    24.0 = termFreq=24.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4997)
          0.33333334 = coord(1/3)
      0.23333333 = coord(7/30)
    
    Abstract
    While classifications are heavily used to categorize web content, the evolution of the web foresees a more formal structure - ontology - which can serve this purpose. Ontologies are core artifacts of the Semantic Web which enable machines to use inference rules to conduct automated reasoning on data. Lightweight ontologies bridge the gap between classifications and ontologies. A lightweight ontology (LO) is an ontology representing a backbone taxonomy where the concept of the child node is more specific than the concept of the parent node. Formal lightweight ontologies can be generated from their informal ones. The key applications of formal lightweight ontologies are document classification, semantic search, and data integration. However, these applications suffer from the following problems: the disambiguation accuracy of the state of the art NLP tools used in generating formal lightweight ontologies from their informal ones; the lack of background knowledge needed for the formal lightweight ontologies; and the limitation of ontology reuse. In this dissertation, we propose a novel solution to these problems in formal lightweight ontologies; namely, faceted lightweight ontology (FLO). FLO is a lightweight ontology in which terms, present in each node label, and their concepts, are available in the background knowledge (BK), which is organized as a set of facets. A facet can be defined as a distinctive property of the groups of concepts that can help in differentiating one group from another. Background knowledge can be defined as a subset of a knowledge base, such as WordNet, and often represents a specific domain.
    Content
    PhD Dissertation at International Doctorate School in Information and Communication Technology. Vgl.: https://core.ac.uk/download/pdf/150083013.pdf.
  13. Malsburg, C. von der: ¬The correlation theory of brain function (1981) 0.09
    0.08886903 = product of:
      0.38086727 = sum of:
        0.023721932 = product of:
          0.07116579 = sum of:
            0.07116579 = weight(_text_:3a in 76) [ClassicSimilarity], result of:
              0.07116579 = score(doc=76,freq=2.0), product of:
                0.15195054 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.017922899 = queryNorm
                0.46834838 = fieldWeight in 76, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=76)
          0.33333334 = coord(1/3)
        0.07116579 = weight(_text_:2f in 76) [ClassicSimilarity], result of:
          0.07116579 = score(doc=76,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.46834838 = fieldWeight in 76, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=76)
        0.07116579 = weight(_text_:2f in 76) [ClassicSimilarity], result of:
          0.07116579 = score(doc=76,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.46834838 = fieldWeight in 76, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=76)
        0.07116579 = weight(_text_:2f in 76) [ClassicSimilarity], result of:
          0.07116579 = score(doc=76,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.46834838 = fieldWeight in 76, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=76)
        0.07116579 = weight(_text_:2f in 76) [ClassicSimilarity], result of:
          0.07116579 = score(doc=76,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.46834838 = fieldWeight in 76, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=76)
        0.07116579 = weight(_text_:2f in 76) [ClassicSimilarity], result of:
          0.07116579 = score(doc=76,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.46834838 = fieldWeight in 76, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=76)
        0.0013163679 = product of:
          0.0039491034 = sum of:
            0.0039491034 = weight(_text_:a in 76) [ClassicSimilarity], result of:
              0.0039491034 = score(doc=76,freq=18.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.19109234 = fieldWeight in 76, product of:
                  4.2426405 = tf(freq=18.0), with freq of:
                    18.0 = termFreq=18.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=76)
          0.33333334 = coord(1/3)
      0.23333333 = coord(7/30)
    
    Abstract
    A summary of brain theory is given so far as it is contained within the framework of Localization Theory. Difficulties of this "conventional theory" are traced back to a specific deficiency: there is no way to express relations between active cells (as for instance their representing parts of the same object). A new theory is proposed to cure this deficiency. It introduces a new kind of dynamical control, termed synaptic modulation, according to which synapses switch between a conducting and a non-conducting state. The dynamics of this variable is controlled on a fast time scale by correlations in the temporal fine structure of cellular signals. Furthermore, conventional synaptic plasticity is replaced by a refined version. Synaptic modulation and plasticity form the basis for short-term and long-term memory, respectively. Signal correlations, shaped by the variable network, express structure and relationships within objects. In particular, the figure-ground problem may be solved in this way. Synaptic modulation introduces flexibility into cerebral networks which is necessary to solve the invariance problem. Since momentarily useless connections are deactivated, interference between different memory traces can be reduced, and memory capacity increased, in comparison with conventional associative memory.
    Source
    http://cogprints.org/1380/1/vdM_correlation.pdf
    Type
    a
  14. Huo, W.: Automatic multi-word term extraction and its application to Web-page summarization (2012) 0.09
    0.08770639 = product of:
      0.43853194 = sum of:
        0.08539894 = weight(_text_:2f in 563) [ClassicSimilarity], result of:
          0.08539894 = score(doc=563,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 563, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=563)
        0.08539894 = weight(_text_:2f in 563) [ClassicSimilarity], result of:
          0.08539894 = score(doc=563,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 563, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=563)
        0.08539894 = weight(_text_:2f in 563) [ClassicSimilarity], result of:
          0.08539894 = score(doc=563,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 563, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=563)
        0.08539894 = weight(_text_:2f in 563) [ClassicSimilarity], result of:
          0.08539894 = score(doc=563,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 563, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=563)
        0.08539894 = weight(_text_:2f in 563) [ClassicSimilarity], result of:
          0.08539894 = score(doc=563,freq=2.0), product of:
            0.15195054 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.017922899 = queryNorm
            0.56201804 = fieldWeight in 563, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=563)
        0.011537234 = product of:
          0.017305851 = sum of:
            0.0027360192 = weight(_text_:a in 563) [ClassicSimilarity], result of:
              0.0027360192 = score(doc=563,freq=6.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.13239266 = fieldWeight in 563, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=563)
            0.014569832 = weight(_text_:22 in 563) [ClassicSimilarity], result of:
              0.014569832 = score(doc=563,freq=2.0), product of:
                0.06276294 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.017922899 = queryNorm
                0.23214069 = fieldWeight in 563, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=563)
          0.6666667 = coord(2/3)
      0.2 = coord(6/30)
    
    Abstract
    In this thesis we propose three new word association measures for multi-word term extraction. We combine these association measures with LocalMaxs algorithm in our extraction model and compare the results of different multi-word term extraction methods. Our approach is language and domain independent and requires no training data. It can be applied to such tasks as text summarization, information retrieval, and document classification. We further explore the potential of using multi-word terms as an effective representation for general web-page summarization. We extract multi-word terms from human written summaries in a large collection of web-pages, and generate the summaries by aligning document words with these multi-word terms. Our system applies machine translation technology to learn the aligning process from a training set and focuses on selecting high quality multi-word terms from human written summaries to generate suitable results for web-page summarization.
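    The kind of statistical association scoring that underlies multi-word term extraction can be sketched minimally as follows (pointwise mutual information is used here as a generic stand-in; the three association measures proposed in the thesis and the LocalMaxs selection step are not reproduced):

      import math
      from collections import Counter

      def pmi_bigrams(tokens):
          # Score each adjacent word pair by pointwise mutual information.
          unigrams = Counter(tokens)
          bigrams = Counter(zip(tokens, tokens[1:]))
          n = len(tokens)
          scores = {}
          for (w1, w2), f in bigrams.items():
              p_xy = f / (n - 1)
              p_x, p_y = unigrams[w1] / n, unigrams[w2] / n
              scores[(w1, w2)] = math.log2(p_xy / (p_x * p_y))
          return scores

      tokens = "information retrieval improves when information retrieval systems model terms".split()
      for bigram, s in sorted(pmi_bigrams(tokens).items(), key=lambda kv: -kv[1])[:3]:
          print(bigram, round(s, 2))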
    Content
    A Thesis presented to The University of Guelph In partial fulfilment of requirements for the degree of Master of Science in Computer Science. Vgl. Unter: http://www.inf.ufrgs.br/~ceramisch/download_files/publications/2009/p01.pdf.
    Date
    10. 1.2013 19:22:47
  15. Ackermann, E.: Piaget's constructivism, Papert's constructionism : what's the difference? (2001) 0.05
    0.04938139 = product of:
      0.24690695 = sum of:
        0.023721932 = product of:
          0.07116579 = sum of:
            0.07116579 = weight(_text_:3a in 692) [ClassicSimilarity], result of:
              0.07116579 = score(doc=692,freq=2.0), product of:
                0.15195054 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.017922899 = queryNorm
                0.46834838 = fieldWeight in 692, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=692)
          0.33333334 = coord(1/3)
        0.055606253 = product of:
          0.11121251 = sum of:
            0.11121251 = weight(_text_:2c in 692) [ClassicSimilarity], result of:
              0.11121251 = score(doc=692,freq=2.0), product of:
                0.1899518 = queryWeight, product of:
                  10.598275 = idf(docFreq=2, maxDocs=44218)
                  0.017922899 = queryNorm
                0.5854775 = fieldWeight in 692, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  10.598275 = idf(docFreq=2, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=692)
          0.5 = coord(1/2)
        0.055606253 = product of:
          0.11121251 = sum of:
            0.11121251 = weight(_text_:2c in 692) [ClassicSimilarity], result of:
              0.11121251 = score(doc=692,freq=2.0), product of:
                0.1899518 = queryWeight, product of:
                  10.598275 = idf(docFreq=2, maxDocs=44218)
                  0.017922899 = queryNorm
                0.5854775 = fieldWeight in 692, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  10.598275 = idf(docFreq=2, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=692)
          0.5 = coord(1/2)
        0.055606253 = product of:
          0.11121251 = sum of:
            0.11121251 = weight(_text_:2c in 692) [ClassicSimilarity], result of:
              0.11121251 = score(doc=692,freq=2.0), product of:
                0.1899518 = queryWeight, product of:
                  10.598275 = idf(docFreq=2, maxDocs=44218)
                  0.017922899 = queryNorm
                0.5854775 = fieldWeight in 692, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  10.598275 = idf(docFreq=2, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=692)
          0.5 = coord(1/2)
        0.055606253 = product of:
          0.11121251 = sum of:
            0.11121251 = weight(_text_:2c in 692) [ClassicSimilarity], result of:
              0.11121251 = score(doc=692,freq=2.0), product of:
                0.1899518 = queryWeight, product of:
                  10.598275 = idf(docFreq=2, maxDocs=44218)
                  0.017922899 = queryNorm
                0.5854775 = fieldWeight in 692, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  10.598275 = idf(docFreq=2, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=692)
          0.5 = coord(1/2)
        7.6000544E-4 = product of:
          0.0022800162 = sum of:
            0.0022800162 = weight(_text_:a in 692) [ClassicSimilarity], result of:
              0.0022800162 = score(doc=692,freq=6.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.11032722 = fieldWeight in 692, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=692)
          0.33333334 = coord(1/3)
      0.2 = coord(6/30)
    
    Abstract
    What is the difference between Piaget's constructivism and Papert's "constructionism"? Beyond the mere play on the words, I think the distinction holds, and that integrating both views can enrich our understanding of how people learn and grow. Piaget's constructivism offers a window into what children are interested in, and able to achieve, at different stages of their development. The theory describes how children's ways of doing and thinking evolve over time, and under which circumstances children are more likely to let go of - or hold onto - their currently held views. Piaget suggests that children have very good reasons not to abandon their worldviews just because someone else, be it an expert, tells them they're wrong. Papert's constructionism, in contrast, focuses more on the art of learning, or 'learning to learn', and on the significance of making things in learning. Papert is interested in how learners engage in a conversation with [their own or other people's] artifacts, and how these conversations boost self-directed learning, and ultimately facilitate the construction of new knowledge. He stresses the importance of tools, media, and context in human development. Integrating both perspectives illuminates the processes by which individuals come to make sense of their experience, gradually optimizing their interactions with the world.
    Content
    Vgl.: https://www.semanticscholar.org/paper/Piaget-%E2%80%99-s-Constructivism-%2C-Papert-%E2%80%99-s-%3A-What-%E2%80%99-s-Ackermann/89cbcc1e740a4591443ff4765a6ae8df0fdf5554. Darunter weitere Hinweise auf verwandte Beiträge. Auch unter: Learning Group Publication 5(2001) no.3, S.438.
    Type
    a
  16. Miller, U.; Teitelbaum, R.: Pre-coordination and post-coordination : past and future (2002) 0.02
    0.020729046 = product of:
      0.12437428 = sum of:
        0.008576217 = product of:
          0.017152434 = sum of:
            0.017152434 = weight(_text_:29 in 1395) [ClassicSimilarity], result of:
              0.017152434 = score(doc=1395,freq=2.0), product of:
                0.063047156 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.017922899 = queryNorm
                0.27205724 = fieldWeight in 1395, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1395)
          0.5 = coord(1/2)
        0.0809891 = weight(_text_:post in 1395) [ClassicSimilarity], result of:
          0.0809891 = score(doc=1395,freq=6.0), product of:
            0.10409636 = queryWeight, product of:
              5.808009 = idf(docFreq=360, maxDocs=44218)
              0.017922899 = queryNorm
            0.77802044 = fieldWeight in 1395, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              5.808009 = idf(docFreq=360, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1395)
        0.0063836705 = product of:
          0.012767341 = sum of:
            0.012767341 = weight(_text_:online in 1395) [ClassicSimilarity], result of:
              0.012767341 = score(doc=1395,freq=2.0), product of:
                0.05439423 = queryWeight, product of:
                  3.0349014 = idf(docFreq=5778, maxDocs=44218)
                  0.017922899 = queryNorm
                0.23471867 = fieldWeight in 1395, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0349014 = idf(docFreq=5778, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1395)
          0.5 = coord(1/2)
        0.014862318 = weight(_text_:u in 1395) [ClassicSimilarity], result of:
          0.014862318 = score(doc=1395,freq=2.0), product of:
            0.058687534 = queryWeight, product of:
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.017922899 = queryNorm
            0.25324488 = fieldWeight in 1395, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1395)
        0.013562972 = product of:
          0.020344457 = sum of:
            0.0031920224 = weight(_text_:a in 1395) [ClassicSimilarity], result of:
              0.0031920224 = score(doc=1395,freq=6.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.1544581 = fieldWeight in 1395, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1395)
            0.017152434 = weight(_text_:29 in 1395) [ClassicSimilarity], result of:
              0.017152434 = score(doc=1395,freq=2.0), product of:
                0.063047156 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.017922899 = queryNorm
                0.27205724 = fieldWeight in 1395, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1395)
          0.6666667 = coord(2/3)
      0.16666667 = coord(5/30)
    
    Abstract
    This article deals with the meaningful processing of information in relation to two systems of information processing: pre-coordination and post-coordination. The different approaches are discussed, with emphasis on the need for a controlled vocabulary in information retrieval. Assigned indexing, which employs a controlled vocabulary, is described in detail. Types of indexing language can be divided into two broad groups - those using pre-coordinated terms and those depending on post-coordination. They represent two different basic approaches in processing and information retrieval. The historical development of these two approaches is described, as well as the two tools that apply to these approaches: thesauri and subject headings.
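    The contrast between the two approaches can be made concrete with a small sketch (the headings, descriptors and document numbers are invented): a pre-coordinated index stores a compound heading built at indexing time, while a post-coordinated index assigns single descriptors that the searcher combines, for example with Boolean AND, at retrieval time.

      # Pre-coordination: the compound heading is constructed at indexing time.
      precoordinated_index = {
          "Libraries -- Automation -- History": {1, 4},
      }

      # Post-coordination: single descriptors are assigned; the searcher
      # intersects their posting lists at retrieval time.
      postcoordinated_index = {
          "Libraries": {1, 2, 4},
          "Automation": {1, 3, 4},
          "History": {1, 4, 5},
      }

      def post_coordinate(index, descriptors):
          # Boolean AND over the chosen descriptors' posting lists.
          return set.intersection(*(index[d] for d in descriptors))

      print(precoordinated_index["Libraries -- Automation -- History"])                      # {1, 4}
      print(post_coordinate(postcoordinated_index, ["Libraries", "Automation", "History"]))  # {1, 4}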
    Source
    Knowledge organization. 29(2002) no.2, S.87-93
    Theme
    Verbale Doksprachen im Online-Retrieval
    Type
    a
  17. Olson, N.B.: Cataloging of audiovisual materials : a manual based on AACR2 (1992) 0.02
    0.020551067 = product of:
      0.12330639 = sum of:
        0.0073510436 = product of:
          0.014702087 = sum of:
            0.014702087 = weight(_text_:29 in 1518) [ClassicSimilarity], result of:
              0.014702087 = score(doc=1518,freq=2.0), product of:
                0.063047156 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.017922899 = queryNorm
                0.23319192 = fieldWeight in 1518, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1518)
          0.5 = coord(1/2)
        0.027891325 = weight(_text_:neue in 1518) [ClassicSimilarity], result of:
          0.027891325 = score(doc=1518,freq=4.0), product of:
            0.07302189 = queryWeight, product of:
              4.074223 = idf(docFreq=2043, maxDocs=44218)
              0.017922899 = queryNorm
            0.38195843 = fieldWeight in 1518, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              4.074223 = idf(docFreq=2043, maxDocs=44218)
              0.046875 = fieldNorm(doc=1518)
        0.06447041 = weight(_text_:medien in 1518) [ClassicSimilarity], result of:
          0.06447041 = score(doc=1518,freq=12.0), product of:
            0.084356464 = queryWeight, product of:
              4.7066307 = idf(docFreq=1085, maxDocs=44218)
              0.017922899 = queryNorm
            0.7642616 = fieldWeight in 1518, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              4.7066307 = idf(docFreq=1085, maxDocs=44218)
              0.046875 = fieldNorm(doc=1518)
        0.012739129 = weight(_text_:u in 1518) [ClassicSimilarity], result of:
          0.012739129 = score(doc=1518,freq=2.0), product of:
            0.058687534 = queryWeight, product of:
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.017922899 = queryNorm
            0.21706703 = fieldWeight in 1518, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.046875 = fieldNorm(doc=1518)
        0.010854486 = product of:
          0.01628173 = sum of:
            0.0015796415 = weight(_text_:a in 1518) [ClassicSimilarity], result of:
              0.0015796415 = score(doc=1518,freq=2.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.07643694 = fieldWeight in 1518, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1518)
            0.014702087 = weight(_text_:29 in 1518) [ClassicSimilarity], result of:
              0.014702087 = score(doc=1518,freq=2.0), product of:
                0.063047156 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.017922899 = queryNorm
                0.23319192 = fieldWeight in 1518, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1518)
          0.6666667 = coord(2/3)
      0.16666667 = coord(5/30)
    
    BK
    05.38 / Neue elektronische Medien <Kommunikationswissenschaft>
    Classification
    05.38 / Neue elektronische Medien <Kommunikationswissenschaft>
    Date
    29. 2.2008 19:43:10
    Editor
    Intner, S.S. u. E. Swanson
    RSWK
    Audiovisuelle Medien / Katalogisierung / Einführung (BVB)
    Anglo-American cataloguing rules 2 / Audiovisuelle Medien / Richtlinie (BVB)
    Subject
    Audiovisuelle Medien / Katalogisierung / Einführung (BVB)
    Anglo-American cataloguing rules 2 / Audiovisuelle Medien / Richtlinie (BVB)
  18. Neue visuelle Welt für digitale Eingeborene : Claire Hart über die Zukunft Factivas und der Branche (2005) 0.02
    0.015902555 = product of:
      0.15902553 = sum of:
        0.11177859 = weight(_text_:wirtschaftswissenschaften in 12) [ClassicSimilarity], result of:
          0.11177859 = score(doc=12,freq=2.0), product of:
            0.11380646 = queryWeight, product of:
              6.3497796 = idf(docFreq=209, maxDocs=44218)
              0.017922899 = queryNorm
            0.98218143 = fieldWeight in 12, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              6.3497796 = idf(docFreq=209, maxDocs=44218)
              0.109375 = fieldNorm(doc=12)
        0.046018336 = weight(_text_:neue in 12) [ClassicSimilarity], result of:
          0.046018336 = score(doc=12,freq=2.0), product of:
            0.07302189 = queryWeight, product of:
              4.074223 = idf(docFreq=2043, maxDocs=44218)
              0.017922899 = queryNorm
            0.6301992 = fieldWeight in 12, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.074223 = idf(docFreq=2043, maxDocs=44218)
              0.109375 = fieldNorm(doc=12)
        0.00122861 = product of:
          0.00368583 = sum of:
            0.00368583 = weight(_text_:a in 12) [ClassicSimilarity], result of:
              0.00368583 = score(doc=12,freq=2.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.17835285 = fieldWeight in 12, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.109375 = fieldNorm(doc=12)
          0.33333334 = coord(1/3)
      0.1 = coord(3/30)
    
    Field
    Wirtschaftswissenschaften
    Type
    a
  19. Internet publishing and beyond : the economics of digital information and intellectual property ; a publication of the Harvard Information Infrastructure Project in collab. with the School of Information Management and Systems at the Univ. of California at Berkeley (2000) 0.02
    0.015038262 = product of:
      0.15038262 = sum of:
        0.13490601 = weight(_text_:760 in 526) [ClassicSimilarity], result of:
          0.13490601 = score(doc=526,freq=4.0), product of:
            0.1486828 = queryWeight, product of:
              8.29569 = idf(docFreq=29, maxDocs=44218)
              0.017922899 = queryNorm
            0.90734106 = fieldWeight in 526, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.29569 = idf(docFreq=29, maxDocs=44218)
              0.0546875 = fieldNorm(doc=526)
        0.014862318 = weight(_text_:u in 526) [ClassicSimilarity], result of:
          0.014862318 = score(doc=526,freq=2.0), product of:
            0.058687534 = queryWeight, product of:
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.017922899 = queryNorm
            0.25324488 = fieldWeight in 526, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.2744443 = idf(docFreq=4547, maxDocs=44218)
              0.0546875 = fieldNorm(doc=526)
        6.14305E-4 = product of:
          0.001842915 = sum of:
            0.001842915 = weight(_text_:a in 526) [ClassicSimilarity], result of:
              0.001842915 = score(doc=526,freq=2.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.089176424 = fieldWeight in 526, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=526)
          0.33333334 = coord(1/3)
      0.1 = coord(3/30)
    
    Classification
    QR 760
    Editor
    Kahin, B. u. H.R. Varian
    RVK
    QR 760
  20. Clyde, L.A.: Weblogs and libraries (2004) 0.01
    0.0144056175 = product of:
      0.072028086 = sum of:
        0.024009569 = weight(_text_:einzelne in 4496) [ClassicSimilarity], result of:
          0.024009569 = score(doc=4496,freq=2.0), product of:
            0.10548963 = queryWeight, product of:
              5.885746 = idf(docFreq=333, maxDocs=44218)
              0.017922899 = queryNorm
            0.22760123 = fieldWeight in 4496, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.885746 = idf(docFreq=333, maxDocs=44218)
              0.02734375 = fieldNorm(doc=4496)
        0.012516635 = product of:
          0.02503327 = sum of:
            0.02503327 = weight(_text_:dienstleistungen in 4496) [ClassicSimilarity], result of:
              0.02503327 = score(doc=4496,freq=2.0), product of:
                0.10771505 = queryWeight, product of:
                  6.009912 = idf(docFreq=294, maxDocs=44218)
                  0.017922899 = queryNorm
                0.23240271 = fieldWeight in 4496, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  6.009912 = idf(docFreq=294, maxDocs=44218)
                  0.02734375 = fieldNorm(doc=4496)
          0.5 = coord(1/2)
        0.016269939 = weight(_text_:neue in 4496) [ClassicSimilarity], result of:
          0.016269939 = score(doc=4496,freq=4.0), product of:
            0.07302189 = queryWeight, product of:
              4.074223 = idf(docFreq=2043, maxDocs=44218)
              0.017922899 = queryNorm
            0.22280908 = fieldWeight in 4496, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              4.074223 = idf(docFreq=2043, maxDocs=44218)
              0.02734375 = fieldNorm(doc=4496)
        0.015353293 = weight(_text_:medien in 4496) [ClassicSimilarity], result of:
          0.015353293 = score(doc=4496,freq=2.0), product of:
            0.084356464 = queryWeight, product of:
              4.7066307 = idf(docFreq=1085, maxDocs=44218)
              0.017922899 = queryNorm
            0.18200494 = fieldWeight in 4496, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.7066307 = idf(docFreq=1085, maxDocs=44218)
              0.02734375 = fieldNorm(doc=4496)
        0.0031918352 = product of:
          0.0063836705 = sum of:
            0.0063836705 = weight(_text_:online in 4496) [ClassicSimilarity], result of:
              0.0063836705 = score(doc=4496,freq=2.0), product of:
                0.05439423 = queryWeight, product of:
                  3.0349014 = idf(docFreq=5778, maxDocs=44218)
                  0.017922899 = queryNorm
                0.11735933 = fieldWeight in 4496, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.0349014 = idf(docFreq=5778, maxDocs=44218)
                  0.02734375 = fieldNorm(doc=4496)
          0.5 = coord(1/2)
        6.8681384E-4 = product of:
          0.0020604415 = sum of:
            0.0020604415 = weight(_text_:a in 4496) [ClassicSimilarity], result of:
              0.0020604415 = score(doc=4496,freq=10.0), product of:
                0.020665944 = queryWeight, product of:
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.017922899 = queryNorm
                0.09970228 = fieldWeight in 4496, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  1.153047 = idf(docFreq=37942, maxDocs=44218)
                  0.02734375 = fieldNorm(doc=4496)
          0.33333334 = coord(1/3)
      0.2 = coord(6/30)
    
    Abstract
    This book discusses the topic of 'weblogs and libraries' from two main perspectives: weblogs as sources of information for libraries and librarians; and weblogs as tools that libraries can use to promote their services and to provide a means of communication with their clients. It begins with an overview of the whole weblog and blogging phenomenon and traces its development over the last six years. The many different kinds of weblogs are outlined (including personal weblogs, community weblogs, multimedia weblogs). The problem of locating weblogs is addressed through a discussion of weblog directories, search engines and other finding tools. Chapters include using weblogs as sources of information in the library or information service, the options for creating a weblog, and managing the library's own weblog.
    Content
    Key Features - No other book currently available specifically addresses this highly topical subject - Weblogs are becoming more important as sources of up-to-date information on many different topics, and so librarians need to be aware of these resources, how they are created and by whom - Weblogs are already important as sources of news and current professional information in the field of library and information science; this book helps librarians to become familiar with the best weblogs in this field - While relatively few libraries have created their own weblogs, the use of weblogs has been recommended in the library/information press as a way of providing information for library patrons; this book helps library managers to make decisions about a weblog for their library
    Footnote
    Rez. in: B.I.T. online 8(2005) H.2, S.202 (J. Plieninger): "Weblogs, or blogs (in German: Netztagebücher), have been causing a stir for several years as a new form of communication on the World Wide Web. While at first it was individual people who used weblogs to convey information and opinions, weblogs are increasingly developing into media through which institutions, too, distribute marketing information to users and customers. It should be noted, of course, that a weblog is not necessarily a one-way medium: since users are often given the opportunity by the weblog's operator to post comments, a weblog takes on the character of a forum in which the information on offer is, or can be, discussed. If an institution is confident of its services and can handle its public image with assurance, a weblog is thus a good way to convey content and to receive feedback directly. If not, the comment function can simply be left out. Anyone considering introducing a weblog as an additional marketing instrument and as a way of raising the library's profile will find a comprehensive introduction in this work. The author is a professor at a library school in Iceland and gives an overview of weblogs in general and of their use in the library field in particular. After an overview of weblogs as a new phenomenon of the Internet, she offers an assessment of blogs as information sources and then describes how to search for weblogs and for weblog content. She then covers weblogs in library and information science and goes on to discuss weblogs created by libraries. After that comes the practical part: how to set up a weblog and - in my opinion the most important chapter - how to manage it. Finally, she points to sources of information about blogs. A subject index concludes the volume.
    Although this text refers exclusively to Anglo-American blogs and resources, it is nevertheless an excellent introduction that leaves little to be desired. Not only is the material presented concisely, didactically and in a well-structured way; there is also a well-balanced mix of instructional text, statistics, illustrated examples and bibliography. Anyone considering offering their library's announcements as a weblog on its homepage, and perhaps allowing readers to respond in the form of comments, has here the opportunity to inform themselves competently about this new field."

Authors

Languages

  • d 30
  • m 4
  • es 2
  • f 1
  • nl 1
  • ro 1

Types

Themes

Subjects

Classifications