Search (155 results, page 1 of 8)

  • Filter: type_ss:"x"
  1. Verwer, K.: Freiheit und Verantwortung bei Hans Jonas (2011) 0.44
    0.43744037 = product of:
      1.0936009 = sum of:
        0.10936009 = product of:
          0.32808027 = sum of:
            0.32808027 = weight(_text_:3a in 973) [ClassicSimilarity], result of:
              0.32808027 = score(doc=973,freq=2.0), product of:
                0.291877 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.03442753 = queryNorm
                1.1240361 = fieldWeight in 973, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.09375 = fieldNorm(doc=973)
          0.33333334 = coord(1/3)
        0.32808027 = weight(_text_:2f in 973) [ClassicSimilarity], result of:
          0.32808027 = score(doc=973,freq=2.0), product of:
            0.291877 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03442753 = queryNorm
            1.1240361 = fieldWeight in 973, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.09375 = fieldNorm(doc=973)
        0.32808027 = weight(_text_:2f in 973) [ClassicSimilarity], result of:
          0.32808027 = score(doc=973,freq=2.0), product of:
            0.291877 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03442753 = queryNorm
            1.1240361 = fieldWeight in 973, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.09375 = fieldNorm(doc=973)
        0.32808027 = weight(_text_:2f in 973) [ClassicSimilarity], result of:
          0.32808027 = score(doc=973,freq=2.0), product of:
            0.291877 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03442753 = queryNorm
            1.1240361 = fieldWeight in 973, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.09375 = fieldNorm(doc=973)
      0.4 = coord(4/10)
    
    Content
    Cf.: http://creativechoice.org/doc/HansJonas.pdf.
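The score breakdown shown for this first result follows Lucene's ClassicSimilarity (tf-idf). As a minimal sketch, the term weight in the explanation above can be reproduced from its components (tf = sqrt(freq), queryWeight = idf × queryNorm, fieldWeight = tf × idf × fieldNorm), using the numbers reported for doc 973:

```python
import math

def classic_similarity_term_score(freq, idf, query_norm, field_norm):
    # ClassicSimilarity per-term score, as in the explain tree above:
    # score = queryWeight * fieldWeight
    tf = math.sqrt(freq)                    # 1.4142135 for freq=2.0
    query_weight = idf * query_norm         # 0.291877
    field_weight = tf * idf * field_norm    # 1.1240361
    return query_weight * field_weight      # 0.32808027

# Values taken from the explanation for doc 973:
s = classic_similarity_term_score(2.0, 8.478011, 0.03442753, 0.09375)
```

Note that the top-level score additionally multiplies the summed term scores by the coord() factor shown at the end of each tree.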
  2. Huo, W.: Automatic multi-word term extraction and its application to Web-page summarization (2012) 0.34
    
    Abstract
    In this thesis we propose three new word association measures for multi-word term extraction. We combine these association measures with LocalMaxs algorithm in our extraction model and compare the results of different multi-word term extraction methods. Our approach is language and domain independent and requires no training data. It can be applied to such tasks as text summarization, information retrieval, and document classification. We further explore the potential of using multi-word terms as an effective representation for general web-page summarization. We extract multi-word terms from human written summaries in a large collection of web-pages, and generate the summaries by aligning document words with these multi-word terms. Our system applies machine translation technology to learn the aligning process from a training set and focuses on selecting high quality multi-word terms from human written summaries to generate suitable results for web-page summarization.
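The abstract above rests on word-association measures for multi-word term extraction. As a generic illustration only (not the thesis's own measures or its LocalMaxs model), a pointwise-mutual-information scorer for adjacent word pairs might look like:

```python
import math
from collections import Counter

def pmi_bigrams(tokens, min_count=2):
    # PMI(x, y) = log2( P(x, y) / (P(x) * P(y)) ) for adjacent pairs.
    # High-PMI pairs co-occur more often than chance, a common signal
    # for multi-word term candidates.
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n_uni = len(tokens)
    n_bi = max(len(tokens) - 1, 1)
    scores = {}
    for (x, y), c in bigrams.items():
        if c < min_count:
            continue
        p_xy = c / n_bi
        p_x = unigrams[x] / n_uni
        p_y = unigrams[y] / n_uni
        scores[(x, y)] = math.log2(p_xy / (p_x * p_y))
    return scores

scores = pmi_bigrams("information retrieval is fun information retrieval works".split())
```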
    Content
    A thesis presented to the University of Guelph in partial fulfilment of the requirements for the degree of Master of Science in Computer Science. Cf.: http://www.inf.ufrgs.br/~ceramisch/download_files/publications/2009/p01.pdf.
    Date
    10.01.2013 19:22:47
  3. Farazi, M.: Faceted lightweight ontologies : a formalization and some experiments (2010) 0.30
    
    Abstract
    While classifications are heavily used to categorize web content, the evolution of the web foresees a more formal structure - ontology - which can serve this purpose. Ontologies are core artifacts of the Semantic Web which enable machines to use inference rules to conduct automated reasoning on data. Lightweight ontologies bridge the gap between classifications and ontologies. A lightweight ontology (LO) is an ontology representing a backbone taxonomy where the concept of the child node is more specific than the concept of the parent node. Formal lightweight ontologies can be generated from their informal ones. The key applications of formal lightweight ontologies are document classification, semantic search, and data integration. However, these applications suffer from the following problems: the disambiguation accuracy of the state of the art NLP tools used in generating formal lightweight ontologies from their informal ones; the lack of background knowledge needed for the formal lightweight ontologies; and the limitation of ontology reuse. In this dissertation, we propose a novel solution to these problems in formal lightweight ontologies; namely, faceted lightweight ontology (FLO). FLO is a lightweight ontology in which terms, present in each node label, and their concepts, are available in the background knowledge (BK), which is organized as a set of facets. A facet can be defined as a distinctive property of the groups of concepts that can help in differentiating one group from another. Background knowledge can be defined as a subset of a knowledge base, such as WordNet, and often represents a specific domain.
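The defining property of a lightweight ontology described above — a backbone taxonomy in which each child node's concept is more specific than its parent's — can be caricatured with a toy parent map. The concepts here are hypothetical and not from the dissertation:

```python
# Hypothetical backbone taxonomy: child -> parent.
parent = {
    "sedan": "car",
    "car": "vehicle",
    "bicycle": "vehicle",
}

def more_specific(a, b):
    # True if concept a lies strictly below concept b in the taxonomy,
    # i.e. a's concept is more specific than b's.
    while a in parent:
        a = parent[a]
        if a == b:
            return True
    return False
```

In a formal lightweight ontology this subsumption relation is what document classification and semantic search operate over.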
    Content
    PhD dissertation at the International Doctorate School in Information and Communication Technology. Cf.: https://core.ac.uk/download/pdf/150083013.pdf.
  4. Shala, E.: ¬Die Autonomie des Menschen und der Maschine : gegenwärtige Definitionen von Autonomie zwischen philosophischem Hintergrund und technologischer Umsetzbarkeit (2014) 0.29
    
    Footnote
    Cf.: https://www.researchgate.net/publication/271200105_Die_Autonomie_des_Menschen_und_der_Maschine_-_gegenwartige_Definitionen_von_Autonomie_zwischen_philosophischem_Hintergrund_und_technologischer_Umsetzbarkeit_Redigierte_Version_der_Magisterarbeit_Karls.
  5. Gabler, S.: Vergabe von DDC-Sachgruppen mittels eines Schlagwort-Thesaurus (2021) 0.29
    
    Content
    Master thesis, Master of Science (Library and Information Studies) (MSc), Universität Wien. Advisor: Christoph Steiner. Cf.: https://www.researchgate.net/publication/371680244_Vergabe_von_DDC-Sachgruppen_mittels_eines_Schlagwort-Thesaurus. DOI: 10.25365/thesis.70030. Cf. also the presentation at: https://wiki.dnb.de/download/attachments/252121510/DA3%20Workshop-Gabler.pdf?version=1&modificationDate=1671093170000&api=v2.
  6. Stojanovic, N.: Ontology-based Information Retrieval : methods and tools for cooperative query answering (2005) 0.24
    
    Abstract
    By the explosion of possibilities for a ubiquitous content production, the information overload problem reaches the level of complexity which cannot be managed by traditional modelling approaches anymore. Due to their pure syntactical nature traditional information retrieval approaches did not succeed in treating content itself (i.e. its meaning, and not its representation). This leads to a very low usefulness of the results of a retrieval process for a user's task at hand. In the last ten years ontologies have been emerged from an interesting conceptualisation paradigm to a very promising (semantic) modelling technology, especially in the context of the Semantic Web. From the information retrieval point of view, ontologies enable a machine-understandable form of content description, such that the retrieval process can be driven by the meaning of the content. However, the very ambiguous nature of the retrieval process in which a user, due to the unfamiliarity with the underlying repository and/or query syntax, just approximates his information need in a query, implies a necessity to include the user in the retrieval process more actively in order to close the gap between the meaning of the content and the meaning of a user's query (i.e. his information need). This thesis lays foundation for such an ontology-based interactive retrieval process, in which the retrieval system interacts with a user in order to conceptually interpret the meaning of his query, whereas the underlying domain ontology drives the conceptualisation process. In that way the retrieval process evolves from a query evaluation process into a highly interactive cooperation between a user and the retrieval system, in which the system tries to anticipate the user's information need and to deliver the relevant content proactively. Moreover, the notion of content relevance for a user's query evolves from a content dependent artefact to the multidimensional context-dependent structure, strongly influenced by the user's preferences. This cooperation process is realized as the so-called Librarian Agent Query Refinement Process. In order to clarify the impact of an ontology on the retrieval process (regarding its complexity and quality), a set of methods and tools for different levels of content and query formalisation is developed, ranging from pure ontology-based inferencing to keyword-based querying in which semantics automatically emerges from the results. Our evaluation studies have shown that the possibilities to conceptualize a user's information need in the right manner and to interpret the retrieval results accordingly are key issues for realizing much more meaningful information retrieval systems.
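The ontology-driven refinement idea sketched in this abstract can be illustrated, very loosely, as query expansion over a domain taxonomy. The taxonomy and function below are hypothetical and stand in for, not reproduce, the thesis's Librarian Agent Query Refinement Process:

```python
# Hypothetical domain taxonomy: concept -> list of narrower concepts.
narrower = {
    "vehicle": ["car", "bicycle"],
    "car": ["sedan"],
}

def refine(query_terms):
    # Expand each query term with all of its (transitively) narrower
    # concepts, so retrieval can match more specific content.
    expanded = []
    for t in query_terms:
        expanded.append(t)
        stack = list(narrower.get(t, []))
        while stack:
            c = stack.pop()
            expanded.append(c)
            stack.extend(narrower.get(c, []))
    return expanded
```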
    Content
    Cf.: http://digbib.ubka.uni-karlsruhe.de/volltexte/documents/1627.
    Theme
    Semantic Web
  7. Xiong, C.: Knowledge based text representations for information retrieval (2016) 0.20
    
    Content
    Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Language and Information Technologies. Cf.: https://www.cs.cmu.edu/~cx/papers/knowledge_based_text_representation.pdf.
  8. Piros, A.: Az ETO-jelzetek automatikus interpretálásának és elemzésének kérdései (2018) 0.18
    
    Content
    See also: New automatic interpreter for complex UDC numbers. At: <https://udcc.org/files/AttilaPiros_EC_36-37_2014-2015.pdf>
  9. Toussi, M.: Information Retrieval am Beispiel der Wide Area Information Server (WAIS) und dem World Wide Web (WWW) (1996) 0.12
    
  10. Schiefer, J.: Aufbau eines internationalen CDS/ISIS Nutzerforums im World Wide Web (1996) 0.12
    
  11. Glockner, M.: Semantik Web : Die nächste Generation des World Wide Web (2004) 0.12
    
  12. Woitas, K.: Bibliografische Daten, Normdaten und Metadaten im Semantic Web : Konzepte der bibliografischen Kontrolle im Wandel (2010) 0.11
    
    Abstract
    Bibliographic data, authority data and metadata in the Semantic Web - concepts of bibliographic control in transition. The title of this thesis points to an essential field of library and information science: bibliographic control. The second central concept is the Semantic Web, a term of major importance in the further development of the World Wide Web (WWW). At first glance this looks like an unequal contest. On one side stands bibliographic control, which comprises the methods and means for the description of library objects and traditionally takes the form of formal and subject surrogates in catalogues. On the other side stands the buzzword Semantic Web, with its high-flown connotations of a web that is "meaning-bearing", if not "intelligent", through self-referentiality. How, then, did a research librarian and a member of the World Wide Web Consortium come, in 2007, to publish a joint paper claiming that the semantic web would be a more "library-like" web? To approach this question, the historical development of the two information spheres, the library and the WWW, is first briefly considered together. For as often - and entirely rightly - as the informational revolution brought about by the Internet is invoked, the analogy of a worldwide virtual library surfaces again and again. More precisely, the theoretical considerations that would later lead to the development of the Internet took their starting point (alongside cybernetics and emerging computer technology) from the concept of the library as an information store.
    Theme
    Semantic Web
  13. Timm, A.: Fachinformation in den Bereichen Gentechnologie und Molekularbiologie am Beispiel ausgewählter Datenbanken und Dienstleistungen im World Wide Web (1996) 0.09
    
  14. Müller, C.: Allegro im World Wide Web : Programierung eines Interfaces (1997) 0.08
    
  15. Körber, S.: Suchmuster erfahrener und unerfahrener Suchmaschinennutzer im deutschsprachigen World Wide Web (2000) 0.07
    
    Abstract
    In a laboratory experiment, a total of eighteen students were confronted with two open web research tasks. While they worked on these tasks with a search engine, they were covertly observed via proxy logfile recording. They provided demographic information and details of their web usage habits, rated task, performance and search engine characteristics in questionnaires, and took a multiple-choice test on their knowledge of search engines. The participants were specifically recruited and assigned to an experienced and an inexperienced subgroup of nine members each. The study is based on a comparison of the two groups, focusing on the bookmarks they saved as solutions, their assessments from the questionnaires, their search phrases, and the patterns of their search engine interaction and navigation within target pages. These sequential action patterns obtained from the logfiles were comparatively visualized, counted and interpreted. The World Wide Web is first described as a structurally and substantively complex information space. The author then examines the general tasks and types of meta-media applications as well as the components of index-based search engines. The perspective then shifts from the structural-medial side to aspects of use. The author describes the use of meta-media applications as co-selection between user and search engine on the basis of decisions, and develops a simple, dynamic phase model. The influence of different kinds of knowledge on the selection process is also considered. Building on this, general research questions and hypotheses for the experiment are formulated. The experiment's properties are the subsequent topic, centring on the observation instrument of logfile analysis, the choice of search service, the formulation of the tasks, the design of the questionnaires, and the procedure. The author then presents the results under three headings: first with regard to performance, which permits the testing of the hypotheses; second with regard to the participants' ratings, comments and search phrases; and third with regard to the visual and computational analysis of the search patterns. The latter provide insight into the participants' search behaviour. Summarizing interpretations and an outlook conclude the thesis.
  16. Hüsken, P.: Information Retrieval im Semantic Web (2006) 0.06
    
    Abstract
    The Semantic Web denotes an extended World Wide Web (WWW) that models the meaning of presented content in new standardized languages such as RDF Schema and OWL. This thesis deals with the information retrieval aspect, i.e. it investigates the extent to which methods of information search can be transferred to modelled knowledge. The characteristic features of IR systems, such as vague queries and support for uncertain knowledge, are treated in the context of the Semantic Web. The focus is on the search for facts within a knowledge domain that are either explicitly modelled or can be derived implicitly by applying inference. Building on the retrieval engine PIRE developed at the University of Duisburg-Essen, the application of uncertain inference with probabilistic predicate logic (pDatalog) is implemented.
    Theme
    Semantic Web
  17. Nix, M.: ¬Die praktische Einsetzbarkeit des CIDOC CRM in Informationssystemen im Bereich des Kulturerbes (2004) 0.06
    
    Abstract
    A practically unlimited amount of information is available to us via the World Wide Web. The problem that arises from this is how to cope with this volume and reach the information that is needed at a given moment. The overwhelming supply forces professional users and laypeople alike to search, regardless of their requirements for the desired information. To make this searching more efficient, one option is to develop more powerful search engines. Another is to structure data better in order to get at the information it contains. Highly structured data can be processed by machine, so that part of the search effort can be automated. The Semantic Web is the vision of a further developed World Wide Web in which data structured in this way is processed by so-called software agents. The progressive structuring of data by content is called semantization. The first part of the thesis sketches some important methods of structuring data by content, in order to clarify the position of ontologies within semantization. The third chapter presents the structure and purpose of the CIDOC Conceptual Reference Model (CRM), a domain ontology in the field of cultural heritage. In the subsequent practical part, various approaches to using the CRM are discussed and implemented. A proposal for implementing the model in XML is developed, which serves the purpose of data transport. In addition, the design of a class library in Java is presented, on which the processing and use of the model within an information system can build.
  18. Fischer, M.: Sacherschliessung - quo vadis? : Die Neuausrichtung der Sacherschliessung im deutschsprachigen Raum (2015) 0.06
    
    Abstract
    Today most people look for information primarily on the World Wide Web. Library catalogues, and with them the intellectual subject indexing maintained in academic libraries, thus compete with search engines that are simple and intuitive to use. The requirements for subject searching have changed fundamentally in recent years through the rapid development of information technology. In addition, libraries today face the problem that the growing flood of electronic publications can no longer be handled intellectually while resources shrink at the same time. Against this background, the Expert Group on Subject Indexing (Expertengruppe Sacherschliessung) - a working group within the Office for Standardization of the German National Library (DNB), in which representatives of the German-language library networks take part - began in 2013 to address the reorientation of verbal subject indexing. In the current revision of the Rules for the Subject Catalogue (RSWK), verbal and classificatory subject indexing, as well as intellectual and automatic indexing, are to be considered together. Alongside the new search engine technology and automatic indexing methods, the linking of library catalogues with other resources on the World Wide Web is gaining ever more importance. Starting from an analysis of the basic principles of the internationally established norms and standards (FRBR, FRSAD and RDA), my master's thesis examines the debate within the Expert Group on Subject Indexing over the current revision of the RSWK. Two questions arise in particular: What effects will the rapid development of information technology have on the future reorientation of intellectual subject indexing? And what role will search engines and discovery systems, automatic indexing methods, and the Semantic Web and Linked Open Data play in the subject indexing of bibliographic resources?
  19. Jackenkroll, M.: Nutzen von XML für die Herstellung verschiedener medialer Varianten von Informationsmitteln : dargestellt am Beispiel eines geografischen Lexikonartikels (2002)
    Content
    "The Extensible Markup Language (XML) is a meta markup language that the World Wide Web Consortium (W3C), an organization concerned with creating Web standards and new technologies for the Internet, adopted in 1998 as a new recommendation for Web applications. Since then much has been published about XML and the new possibilities for data exchange over the Internet that this language opens up. XML documents define the hierarchical structure and the content of documents, but say nothing about layout; that is defined in so-called stylesheets. With several stylesheets that all refer to one XML document, it is possible to generate different output products from a single data store, e.g. an online version and a printable edition of a document. This ability to produce different media variants of a product is also interesting for the production of reference works. In the production of reference works, above all lexica and encyclopaedias, it has been observed in recent years that, alongside the traditional printed edition, electronic variants enriched with multimedia elements are increasingly offered as well. These electronic reference works are published both offline, i.e. on CD-ROM or DVD, and online on the Internet. In contrast to the printed versions, the new products are updated almost annually. This new situation required changes to the production process: a procedure was needed that makes generating the various media variants of a product as simple and trouble-free as possible. XML and its predecessor, the Standard Generalized Markup Language (SGML), seemed the perfect solution to this problem.
Expectations of the benefits SGML and XML could bring were high: "This angle-bracket format alone, fed into a data pool, is supposed to enable the generation of the most diverse media products at the push of a button." The aim of this thesis is to show how the new XML standard can be used in publishing reference works to generate several output products from a data store captured only once, with as little effort as possible. It examines which output forms are suitable for XML documents in this area and with which procedures and tools the respective output formats can be created. In this context it also treats the aspects that may prove problematic when converting XML documents into other formats.
    This situation yields the structure of the thesis: the introduction presents the meta markup language XML together with selected specifications that were developed in connection with XML and that make meaningful use of the language possible in the first place (chapter 2). This chapter gives a brief theoretical overview of what XML and its companion specifications can do, which goals they each pursue, and by which methods they try to reach those goals. This first part thereby helps make the development of the later example DTD and the associated stylesheets comprehensible. It therefore covers only those specifications that are strictly required for, or at least useful in, producing reference works on an XML basis. Besides the document type definition (DTD), which determines the structure of the XML documents, the specifications on linking, transformation and formatting are treated. Techniques for designing retrieval in electronic editions of reference works certainly also play a role, but this area is excluded here so as not to exceed the scope of the thesis. The focus lies instead on transformation and formatting, since these are of central importance for creating stylesheets and thus for generating the later output products.
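The single-source principle the abstract describes can be shown in miniature: one XML record, several renderings. XSLT stylesheets would normally perform the transformation; plain Python stands in for them here, and the element names (article, headword, body) are invented for the example, not taken from the thesis's DTD.

```python
import xml.etree.ElementTree as ET

# One XML source captured once; layout lives in the transformations.
SOURCE = "<article><headword>Berlin</headword><body>Capital of Germany.</body></article>"

def to_html(xml_text: str) -> str:
    """Online variant: render the article as an HTML fragment."""
    root = ET.fromstring(xml_text)
    return "<h1>{}</h1><p>{}</p>".format(
        root.findtext("headword"), root.findtext("body"))

def to_plain(xml_text: str) -> str:
    """Print-oriented variant: render the article as plain text."""
    root = ET.fromstring(xml_text)
    return "{}: {}".format(root.findtext("headword"), root.findtext("body"))

print(to_html(SOURCE))
print(to_plain(SOURCE))
```

Adding a third output format means adding a third transformation; the source data never has to be touched, which is exactly the production advantage the thesis investigates.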
  20. Krüger, C.: Evaluation des WWW-Suchdienstes GERHARD unter besonderer Beachtung automatischer Indexierung (1999)
    Abstract
    This thesis contains a description and evaluation of the WWW search service GERHARD (German Harvest Automated Retrieval and Directory). GERHARD is a search and navigation system for the German World Wide Web which collects only scientifically relevant documents and classifies them automatically, using computational-linguistic and statistical methods, against a library classification system. The DFG project GERHARD was an attempt to develop an alternative to conventional methods of indexing the Internet by means of a World Wide Web service based on an automatic classification procedure. GERHARD is the only directory of Internet resources in the German-speaking world whose creation and updating is carried out fully automatically (i.e. by machine), and it restricts itself to documents on scientific WWW servers. The basic idea was to replace cost-intensive intellectual indexing and classification of Internet pages with computational-linguistic and statistical methods, thereby mapping the recorded Internet resources automatically onto the vocabulary of a library classification system. The WWW address (URL) of GERHARD is: http://www.gerhard.de. This diploma thesis describes the service, with particular emphasis on the underlying indexing and classification system, and then tests the effectiveness of GERHARD with a small retrieval test.
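The core idea - mapping a document onto a classification vocabulary by term statistics rather than intellectual judgement - can be caricatured in a few lines. This is a deliberately naive sketch: GERHARD's actual pipeline was far more sophisticated, and the class labels and keyword sets below are invented for illustration.

```python
# Hypothetical mini-vocabulary: class label -> characteristic terms.
CLASS_KEYWORDS = {
    "04 Informatik": {"rechner", "software", "algorithmus"},
    "30 Physik": {"quanten", "teilchen", "energie"},
}

def classify(text: str) -> str:
    """Assign the class whose keyword set overlaps the document most."""
    tokens = set(text.lower().split())
    return max(CLASS_KEYWORDS,
               key=lambda label: len(CLASS_KEYWORDS[label] & tokens))

label = classify("Ein neuer Algorithmus verbessert die Software")
print(label)
```

Even this toy version makes the trade-off visible: classification becomes cheap and fully automatic, but its quality now depends entirely on the linguistic preprocessing and the vocabulary - which is precisely what the thesis's retrieval test probes.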