Search (52 results, page 1 of 3)

  • type_ss:"x"
  1. Farazi, M.: Faceted lightweight ontologies : a formalization and some experiments (2010) 0.08
    0.083608165 = sum of:
      0.0690029 = product of:
        0.20700867 = sum of:
          0.20700867 = weight(_text_:3a in 4997) [ClassicSimilarity], result of:
            0.20700867 = score(doc=4997,freq=2.0), product of:
              0.4419972 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.05213454 = queryNorm
              0.46834838 = fieldWeight in 4997, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.0390625 = fieldNorm(doc=4997)
        0.33333334 = coord(1/3)
      0.014605265 = product of:
        0.02921053 = sum of:
          0.02921053 = weight(_text_:classification in 4997) [ClassicSimilarity], result of:
            0.02921053 = score(doc=4997,freq=2.0), product of:
              0.16603322 = queryWeight, product of:
                3.1847067 = idf(docFreq=4974, maxDocs=44218)
                0.05213454 = queryNorm
              0.17593184 = fieldWeight in 4997, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.1847067 = idf(docFreq=4974, maxDocs=44218)
                0.0390625 = fieldNorm(doc=4997)
        0.5 = coord(1/2)
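    The explain tree above is standard Lucene ClassicSimilarity (TF-IDF) output. As a rough sketch of how its numbers combine (assuming Lucene's classic formulas tf = sqrt(freq) and idf = 1 + ln(maxDocs/(docFreq+1)); the function name below is ours, not Lucene's), the top-level score can be reproduced in a few lines:

    ```python
    import math

    def term_score(freq, doc_freq, max_docs, field_norm, query_norm):
        """One weight(...) node of a Lucene ClassicSimilarity explain tree."""
        tf = math.sqrt(freq)                               # ~1.4142135 for freq=2.0
        idf = 1.0 + math.log(max_docs / (doc_freq + 1.0))  # ~8.478 for docFreq=24
        query_weight = idf * query_norm                    # ~0.4419972
        field_weight = tf * idf * field_norm               # ~0.46834838
        return query_weight * field_weight                 # ~0.20700867

    query_norm = 0.05213454
    part_3a  = term_score(2.0, 24,   44218, 0.0390625, query_norm) / 3  # coord(1/3)
    part_cls = term_score(2.0, 4974, 44218, 0.0390625, query_norm) / 2  # coord(1/2)
    total = part_3a + part_cls
    print(round(total, 4))  # ~0.0836, matching the 0.083608165 above within rounding
    ```

    The coord factors (1/3, 1/2) and queryNorm depend on the query as a whole; the per-term inputs (freq, docFreq, fieldNorm) are exactly those shown in the tree.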
    
    Abstract
    While classifications are heavily used to categorize web content, the evolution of the web foresees a more formal structure - the ontology - which can serve this purpose. Ontologies are core artifacts of the Semantic Web that enable machines to use inference rules to conduct automated reasoning on data. Lightweight ontologies bridge the gap between classifications and ontologies. A lightweight ontology (LO) is an ontology representing a backbone taxonomy in which the concept of a child node is more specific than the concept of its parent node. Formal lightweight ontologies can be generated from informal ones. Their key applications are document classification, semantic search, and data integration. However, these applications suffer from the following problems: the limited disambiguation accuracy of the state-of-the-art NLP tools used in generating formal lightweight ontologies from informal ones; the lack of background knowledge needed for the formal lightweight ontologies; and the limitations of ontology reuse. In this dissertation, we propose a novel solution to these problems: the faceted lightweight ontology (FLO). A FLO is a lightweight ontology in which the terms present in each node label, and their concepts, are available in the background knowledge (BK), which is organized as a set of facets. A facet can be defined as a distinctive property of a group of concepts that helps differentiate one group from another. Background knowledge can be defined as a subset of a knowledge base, such as WordNet, and often represents a specific domain.
    Content
    PhD dissertation at the International Doctorate School in Information and Communication Technology. Cf.: https://core.ac.uk/download/pdf/150083013.pdf.
  2. Verwer, K.: Freiheit und Verantwortung bei Hans Jonas (2011) 0.08
    
    Content
    Cf.: http://creativechoice.org/doc/HansJonas.pdf.
  3. Thielemann, A.: Sacherschließung für die Kunstgeschichte : Möglichkeiten und Grenzen von DDC 700: The Arts (2007) 0.05
    
    Abstract
    Following the publication of a German translation of Dewey Decimal Classification 22 in October 2005, and its use for subject indexing in the Deutsche Nationalbibliographie since January 2006, German art-history research libraries face the question of whether the DDC could be adopted and whether it is suitable in general for the subject indexing of art-historical publications. This question is discussed against the background of the existing library structures for art history and with regard to the subject matter, research methodology, and publishing traditions of the discipline.
  4. Köbler, J.; Niederklapfer, T.: Kreuzkonkordanzen zwischen RVK-BK-MSC-PACS der Fachbereiche Mathematik un Physik (2010) 0.05
    
    Abstract
    Our project aims to create a cross-concordance between the universal classifications "Regensburger Verbundklassifikation (RVK)" and "Basisklassifikation (BK)" and the subject classifications "Mathematics Subject Classification (MSC2010)" and "Physics and Astronomy Classification Scheme (PACS2010)" in the fields of mathematics and physics. Conclusion: "The classificatory agreement between the Regensburger Verbundklassifikation and the Physics and Astronomy Classification Scheme was quite good in some subject areas (e.g. nuclear physics), while others (e.g. polymer physics, mineralogy) showed very little overlap. In total we were able to establish 890 simple mappings; multiple mappings were not counted for technical reasons. The project as a whole was very extensive and could therefore not be treated exhaustively within the twenty project days. Further development nevertheless appears worthwhile, in particular towards collective access via a web form and towards automatic classification."
    Pages
    22 p.
  5. Huo, W.: Automatic multi-word term extraction and its application to Web-page summarization (2012) 0.04
    
    Abstract
    In this thesis we propose three new word association measures for multi-word term extraction. We combine these association measures with LocalMaxs algorithm in our extraction model and compare the results of different multi-word term extraction methods. Our approach is language and domain independent and requires no training data. It can be applied to such tasks as text summarization, information retrieval, and document classification. We further explore the potential of using multi-word terms as an effective representation for general web-page summarization. We extract multi-word terms from human written summaries in a large collection of web-pages, and generate the summaries by aligning document words with these multi-word terms. Our system applies machine translation technology to learn the aligning process from a training set and focuses on selecting high quality multi-word terms from human written summaries to generate suitable results for web-page summarization.
    Date
    10. 1.2013 19:22:47
  6. Shala, E.: ¬Die Autonomie des Menschen und der Maschine : gegenwärtige Definitionen von Autonomie zwischen philosophischem Hintergrund und technologischer Umsetzbarkeit (2014) 0.03
    
    Footnote
    Cf.: https://www.researchgate.net/publication/271200105_Die_Autonomie_des_Menschen_und_der_Maschine_-_gegenwartige_Definitionen_von_Autonomie_zwischen_philosophischem_Hintergrund_und_technologischer_Umsetzbarkeit_Redigierte_Version_der_Magisterarbeit_Karls.
  7. Piros, A.: Az ETO-jelzetek automatikus interpretálásának és elemzésének kérdései (2018) 0.03
    
    Content
    See also: New automatic interpreter for complex UDC numbers. At: https://udcc.org/files/AttilaPiros_EC_36-37_2014-2015.pdf
  8. Gabler, S.: Vergabe von DDC-Sachgruppen mittels eines Schlagwort-Thesaurus (2021) 0.03
    
    Content
    Master thesis, Master of Science (Library and Information Studies) (MSc), Universität Wien. Advisor: Christoph Steiner. Cf.: https://www.researchgate.net/publication/371680244_Vergabe_von_DDC-Sachgruppen_mittels_eines_Schlagwort-Thesaurus. DOI: 10.25365/thesis.70030. See also the presentation at: https://wiki.dnb.de/download/attachments/252121510/DA3%20Workshop-Gabler.pdf.
  9. Düring, M.: ¬Die Dewey Decimal Classification : Entstehung, Aufbau und Ausblick auf eine Nutzung in deutschen Bibliotheken (2003) 0.03
    
    Abstract
    The ever-growing volume of published information, appearing in ever new forms, demands increasingly precise solutions from information and documentation institutions for indexing this information and presenting it in a user-friendly way. In the current era of databases and online catalogues in particular, a combination of verbal and classificatory subject indexing is required, without losing the connection to the older card catalogues still in use (at least supplementarily) in many places. A multitude of different classifications is in use worldwide. Choosing the right classification for an institution depends on its thematic and informational orientation, the size and nature of its holdings, and not least on technical and staffing conditions. On the side of the classification to be chosen, the decisive factors are ease of handling for the librarian, comprehensibility for the user, the capacity of the classification to be extended as new fields of knowledge emerge, and its integration into information networks with other institutions. This thesis examines the Dewey Decimal Classification (DDC) with regard to these points. It is the most widely used classification in the world: some 200,000 libraries in 135 countries index their holdings with this system. It is currently available in its 22nd unabridged edition and has so far been translated into 30 languages; a complete German translation will appear in 2005. Despite at times heated standardization debates and plans to adopt American descriptive cataloguing rules, there is little agreement among German libraries with regard to subject indexing. The DDC is barely used in Germany and other European countries, apart from Great Britain and its use in bibliographies. This thesis therefore examines the historical reasons for this development and ventures a brief outlook on the future of the decimal classification.
  10. Adams, B.: Charles Ami Cutters 'Expansive classification' : eine kritsche Darstellung (1965) 0.03
    
    Object
    Cutter expansive classification
  11. Stünkel, M.: Neuere Methoden der inhaltlichen Erschließung schöner Literatur in öffentlichen Bibliotheken (1986) 0.03
    
    Date
    4. 8.2006 21:35:22
  12. Stojanovic, N.: Ontology-based Information Retrieval : methods and tools for cooperative query answering (2005) 0.03
    
    Content
    Cf.: http://digbib.ubka.uni-karlsruhe.de/volltexte/documents/1627.
  13. Xiong, C.: Knowledge based text representations for information retrieval (2016) 0.03
    
    Content
    Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in Language and Information Technologies. Cf.: https://www.cs.cmu.edu/~cx/papers/knowledge_based_text_representation.pdf.
  14. Menges, T.: Möglichkeiten und Grenzen der Übertragbarkeit eines Buches auf Hypertext am Beispiel einer französischen Grundgrammatik (Klein; Kleineidam) (1997) 0.02
    
    Date
    22. 7.1998 18:23:25
  15. Schneider, A.: ¬Die Verzeichnung und sachliche Erschließung der Belletristik in Kaysers Bücherlexikon und im Schlagwortkatalog Georg/Ost (1980) 0.02
    
    Date
    5. 8.2006 13:07:22
  16. Sperling, R.: Anlage von Literaturreferenzen für Onlineressourcen auf einer virtuellen Lernplattform (2004) 0.02
    
    Date
    26.11.2005 18:39:22
  17. Engbarth, M.: ¬Die Library of Congress Classification : Geschichte, Struktur, Verbreitung und Auswirkungen auf deutsche Bibliotheksklassifikationen (1980) 0.02
    
  18. Urban, A.: ¬Die Dewey Decimal Classification als Normklassifikation : Untersuchungen zur Entwicklung und Verbreitung der DDC unter besonderer Berücksichtigung der zentralen Sacherschließung (1977) 0.02
    
  19. Stanz, G.: Medienarchive: Analyse einer unterschätzten Ressource : Archivierung, Dokumentation, und Informationsvermittlung in Medien bei besonderer Berücksichtigung von Pressearchiven (1994) 0.02
    
    Date
    22. 2.1997 19:50:29
  20. Hartwieg, U.: ¬Die nationalbibliographische Situation im 18. Jahrhundert : Vorüberlegungen zur Verzeichnung der deutschen Drucke in einem VD18 (1999) 0.02
    
    Date
    18. 6.1999 9:22:36

Languages

  • d 38
  • e 12
  • f 1
  • hu 1