Search (9977 results, page 2 of 499)

  1. Farazi, M.: Faceted lightweight ontologies : a formalization and some experiments (2010) 0.15
    0.15368767 = product of:
      0.2561461 = sum of:
        0.059636947 = product of:
          0.17891084 = sum of:
            0.17891084 = weight(_text_:3a in 4997) [ClassicSimilarity], result of:
              0.17891084 = score(doc=4997,freq=2.0), product of:
                0.38200375 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.04505818 = queryNorm
                0.46834838 = fieldWeight in 4997, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4997)
          0.33333334 = coord(1/3)
        0.17891084 = weight(_text_:2f in 4997) [ClassicSimilarity], result of:
          0.17891084 = score(doc=4997,freq=2.0), product of:
            0.38200375 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04505818 = queryNorm
            0.46834838 = fieldWeight in 4997, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4997)
        0.017598324 = product of:
          0.035196647 = sum of:
            0.035196647 = weight(_text_:data in 4997) [ClassicSimilarity], result of:
              0.035196647 = score(doc=4997,freq=4.0), product of:
                0.14247625 = queryWeight, product of:
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.04505818 = queryNorm
                0.24703519 = fieldWeight in 4997, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4997)
          0.5 = coord(1/2)
      0.6 = coord(3/5)
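    The score breakdowns in this listing are Lucene "explain" output for ClassicSimilarity (tf-idf): tf = sqrt(freq), idf = 1 + ln(maxDocs/(docFreq+1)), fieldWeight = tf * idf * fieldNorm, and each term contributes queryWeight * fieldWeight with queryWeight = idf * queryNorm; coord(m/n) then scales by the fraction of query terms matched. A minimal sketch that reproduces the "_text_:3a" leaf of the tree above:

```python
import math

def idf(doc_freq: int, max_docs: int) -> float:
    # ClassicSimilarity inverse document frequency: 1 + ln(maxDocs / (docFreq + 1))
    return 1.0 + math.log(max_docs / (doc_freq + 1))

def term_score(freq, doc_freq, max_docs, query_norm, field_norm):
    # tf = sqrt(term frequency), as shown in the "tf(freq=...)" lines above
    tf = math.sqrt(freq)
    query_weight = idf(doc_freq, max_docs) * query_norm      # queryWeight = idf * queryNorm
    field_weight = tf * idf(doc_freq, max_docs) * field_norm # fieldWeight = tf * idf * fieldNorm
    return query_weight * field_weight

# Values taken from the explain tree for doc 4997:
s = term_score(freq=2.0, doc_freq=24, max_docs=44218,
               query_norm=0.04505818, field_norm=0.0390625)
print(s)  # ≈ 0.17891084, matching the weight(_text_:3a ...) line
```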
    
    Abstract
    While classifications are heavily used to categorize web content, the evolution of the web foresees a more formal structure - ontology - which can serve this purpose. Ontologies are core artifacts of the Semantic Web which enable machines to use inference rules to conduct automated reasoning on data. Lightweight ontologies bridge the gap between classifications and ontologies. A lightweight ontology (LO) is an ontology representing a backbone taxonomy, in which the concept of a child node is more specific than the concept of its parent node. Formal lightweight ontologies can be generated from their informal counterparts. The key applications of formal lightweight ontologies are document classification, semantic search, and data integration. However, these applications suffer from the following problems: the limited disambiguation accuracy of the state-of-the-art NLP tools used in generating formal lightweight ontologies from informal ones; the lack of background knowledge needed for the formal lightweight ontologies; and the limited possibilities for ontology reuse. In this dissertation, we propose a novel solution to these problems in formal lightweight ontologies, namely the faceted lightweight ontology (FLO). A FLO is a lightweight ontology in which the terms present in each node label, and their concepts, are available in the background knowledge (BK), which is organized as a set of facets. A facet can be defined as a distinctive property of a group of concepts that can help differentiate one group from another. Background knowledge can be defined as a subset of a knowledge base, such as WordNet, and often represents a specific domain.
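    The backbone-taxonomy constraint described in the abstract (each child node's concept is more specific than its parent's) can be sketched as a simple check; the concept names and the toy subsumption table below are illustrative assumptions, not taken from the dissertation:

```python
# Toy subsumption table: concept -> set of strictly more general concepts.
SUBSUMED_BY = {
    "sports car": {"car", "vehicle"},
    "car": {"vehicle"},
}

def more_specific(child: str, parent: str) -> bool:
    # True if the child concept is subsumed by the parent concept.
    return parent in SUBSUMED_BY.get(child, set())

def is_lightweight_ontology(edges) -> bool:
    # edges: iterable of (parent_concept, child_concept) pairs.
    # A valid lightweight ontology requires every child to be more
    # specific than its parent.
    return all(more_specific(child, parent) for parent, child in edges)

print(is_lightweight_ontology([("vehicle", "car"), ("car", "sports car")]))  # True
```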
    Content
    PhD Dissertation at International Doctorate School in Information and Communication Technology. See: https://core.ac.uk/download/pdf/150083013.pdf.
  2. Furrie, B.; Data Base Development Department of The Follett Software Company: Understanding MARC Bibliographic : Machine-readable cataloging (2000) 0.15
    0.15336421 = product of:
      0.255607 = sum of:
        0.15033525 = weight(_text_:readable in 6772) [ClassicSimilarity], result of:
          0.15033525 = score(doc=6772,freq=2.0), product of:
            0.2768342 = queryWeight, product of:
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.04505818 = queryNorm
            0.5430516 = fieldWeight in 6772, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.0625 = fieldNorm(doc=6772)
        0.08536155 = weight(_text_:bibliographic in 6772) [ClassicSimilarity], result of:
          0.08536155 = score(doc=6772,freq=4.0), product of:
            0.17541347 = queryWeight, product of:
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.04505818 = queryNorm
            0.4866305 = fieldWeight in 6772, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.0625 = fieldNorm(doc=6772)
        0.01991023 = product of:
          0.03982046 = sum of:
            0.03982046 = weight(_text_:data in 6772) [ClassicSimilarity], result of:
              0.03982046 = score(doc=6772,freq=2.0), product of:
                0.14247625 = queryWeight, product of:
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.04505818 = queryNorm
                0.2794884 = fieldWeight in 6772, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.0625 = fieldNorm(doc=6772)
          0.5 = coord(1/2)
      0.6 = coord(3/5)
    
    Footnote
    See also: http://lcweb.loc.gov/marc/umb/. - Understanding MARC: Bibliographic was a copyrighted work originally published by the Follett Software Co. in 1988 (second edition 1989; third edition 1990; fourth edition 1994; fifth edition 1998).
  3. Schrodt, R.: Tiefen und Untiefen im wissenschaftlichen Sprachgebrauch (2008) 0.15
    0.15267058 = product of:
      0.38167644 = sum of:
        0.09541911 = product of:
          0.28625733 = sum of:
            0.28625733 = weight(_text_:3a in 140) [ClassicSimilarity], result of:
              0.28625733 = score(doc=140,freq=2.0), product of:
                0.38200375 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.04505818 = queryNorm
                0.7493574 = fieldWeight in 140, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0625 = fieldNorm(doc=140)
          0.33333334 = coord(1/3)
        0.28625733 = weight(_text_:2f in 140) [ClassicSimilarity], result of:
          0.28625733 = score(doc=140,freq=2.0), product of:
            0.38200375 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04505818 = queryNorm
            0.7493574 = fieldWeight in 140, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0625 = fieldNorm(doc=140)
      0.4 = coord(2/5)
    
    Content
    See also: https://studylibde.com/doc/13053640/richard-schrodt. See also: http://www.univie.ac.at/Germanistik/schrodt/vorlesung/wissenschaftssprache.doc.
  4. Popper, K.R.: Three worlds : the Tanner lecture on human values. Delivered at the University of Michigan, April 7, 1978 (1978) 0.15
    0.15267058 = product of:
      0.38167644 = sum of:
        0.09541911 = product of:
          0.28625733 = sum of:
            0.28625733 = weight(_text_:3a in 230) [ClassicSimilarity], result of:
              0.28625733 = score(doc=230,freq=2.0), product of:
                0.38200375 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.04505818 = queryNorm
                0.7493574 = fieldWeight in 230, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0625 = fieldNorm(doc=230)
          0.33333334 = coord(1/3)
        0.28625733 = weight(_text_:2f in 230) [ClassicSimilarity], result of:
          0.28625733 = score(doc=230,freq=2.0), product of:
            0.38200375 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04505818 = queryNorm
            0.7493574 = fieldWeight in 230, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0625 = fieldNorm(doc=230)
      0.4 = coord(2/5)
    
    Source
    https://tannerlectures.utah.edu/_documents/a-to-z/p/popper80.pdf
  5. Gabler, S.: Vergabe von DDC-Sachgruppen mittels eines Schlagwort-Thesaurus (2021) 0.15
    0.15059501 = product of:
      0.25099167 = sum of:
        0.059636947 = product of:
          0.17891084 = sum of:
            0.17891084 = weight(_text_:3a in 1000) [ClassicSimilarity], result of:
              0.17891084 = score(doc=1000,freq=2.0), product of:
                0.38200375 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.04505818 = queryNorm
                0.46834838 = fieldWeight in 1000, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1000)
          0.33333334 = coord(1/3)
        0.17891084 = weight(_text_:2f in 1000) [ClassicSimilarity], result of:
          0.17891084 = score(doc=1000,freq=2.0), product of:
            0.38200375 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04505818 = queryNorm
            0.46834838 = fieldWeight in 1000, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=1000)
        0.012443894 = product of:
          0.024887787 = sum of:
            0.024887787 = weight(_text_:data in 1000) [ClassicSimilarity], result of:
              0.024887787 = score(doc=1000,freq=2.0), product of:
                0.14247625 = queryWeight, product of:
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.04505818 = queryNorm
                0.17468026 = fieldWeight in 1000, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1000)
          0.5 = coord(1/2)
      0.6 = coord(3/5)
    
    Abstract
    This thesis presents the construction of a thematically ordered thesaurus based on the subject headings of the Gemeinsame Normdatei (GND), using the DDC notations they contain. The top level of this thesaurus is formed by the DDC subject groups of the Deutsche Nationalbibliothek (DNB). The thesaurus is constructed rule-based, using Linked Data principles in a SPARQL processor. It serves the automated extraction of metadata from scientific publications by means of a computational-linguistic extractor, which processes digital full texts. The extractor identifies subject headings by comparing character strings with the labels in the thesaurus, orders the hits by their relevance in the text, and returns the assigned subject groups as a ranked list. The underlying assumption is that the sought subject group appears among the top ranks. The performance of the method is validated in a three-stage procedure. First, a gold standard of documents retrievable in the DNB's online catalogue is compiled on the basis of metadata and the findings of a brief inspection. The documents are distributed over 14 of the subject groups, with a batch size of 50 documents each. All documents are indexed with the extractor and the categorization results are documented. Finally, the resulting retrieval performance is assessed both for a hard (binary) categorization and for a ranked return of the subject groups.
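    The matching-and-ranking step the abstract describes can be sketched as follows; the thesaurus labels and DDC subject-group assignments are invented for illustration (the thesis builds the real thesaurus from GND subject headings via SPARQL):

```python
from collections import Counter

# Illustrative label -> DDC subject group table (not the real thesaurus).
THESAURUS = {
    "thesaurus": "020 Library and information sciences",
    "ontologie": "004 Data processing",
    "katalog": "020 Library and information sciences",
}

def rank_subject_groups(text: str) -> list[str]:
    # Find thesaurus labels in the full text by string comparison and
    # return the subject groups of the hits, best-ranked group first.
    text = text.lower()
    hits = Counter()
    for label, group in THESAURUS.items():
        n = text.count(label)
        if n:
            hits[group] += n
    return [group for group, _ in hits.most_common()]

print(rank_subject_groups("Ein Thesaurus ordnet Katalog und Thesaurus."))
```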
    Content
    Master thesis, Master of Science (Library and Information Studies) (MSc), Universität Wien. Advisor: Christoph Steiner. See: https://www.researchgate.net/publication/371680244_Vergabe_von_DDC-Sachgruppen_mittels_eines_Schlagwort-Thesaurus. DOI: 10.25365/thesis.70030. See also the presentation at: https://wiki.dnb.de/download/attachments/252121510/DA3%20Workshop-Gabler.pdf?version=1&modificationDate=1671093170000&api=v2.
  6. Li, L.; Shang, Y.; Zhang, W.: Improvement of HITS-based algorithms on Web documents 0.15
    0.15007444 = product of:
      0.3751861 = sum of:
        0.07156433 = product of:
          0.214693 = sum of:
            0.214693 = weight(_text_:3a in 2514) [ClassicSimilarity], result of:
              0.214693 = score(doc=2514,freq=2.0), product of:
                0.38200375 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.04505818 = queryNorm
                0.56201804 = fieldWeight in 2514, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2514)
          0.33333334 = coord(1/3)
        0.30362174 = weight(_text_:2f in 2514) [ClassicSimilarity], result of:
          0.30362174 = score(doc=2514,freq=4.0), product of:
            0.38200375 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04505818 = queryNorm
            0.7948135 = fieldWeight in 2514, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=2514)
      0.4 = coord(2/5)
    
    Content
    See: http://delab.csd.auth.gr/~dimitris/courses/ir_spring06/page_rank_computing/p527-li.pdf. See also: http://www2002.org/CDROM/refereed/643/.
  7. Suchenwirth, L.: Sacherschliessung in Zeiten von Corona : neue Herausforderungen und Chancen (2019) 0.15
    0.15007444 = product of:
      0.3751861 = sum of:
        0.07156433 = product of:
          0.214693 = sum of:
            0.214693 = weight(_text_:3a in 484) [ClassicSimilarity], result of:
              0.214693 = score(doc=484,freq=2.0), product of:
                0.38200375 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.04505818 = queryNorm
                0.56201804 = fieldWeight in 484, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=484)
          0.33333334 = coord(1/3)
        0.30362174 = weight(_text_:2f in 484) [ClassicSimilarity], result of:
          0.30362174 = score(doc=484,freq=4.0), product of:
            0.38200375 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04505818 = queryNorm
            0.7948135 = fieldWeight in 484, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=484)
      0.4 = coord(2/5)
    
    Footnote
    https://journals.univie.ac.at/index.php/voebm/article/download/5332/5271/
  8. Maxwell, R.L.: Bibliographic control (2009) 0.15
    0.14847405 = product of:
      0.24745674 = sum of:
        0.11275144 = weight(_text_:readable in 3750) [ClassicSimilarity], result of:
          0.11275144 = score(doc=3750,freq=2.0), product of:
            0.2768342 = queryWeight, product of:
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.04505818 = queryNorm
            0.4072887 = fieldWeight in 3750, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.046875 = fieldNorm(doc=3750)
        0.11977262 = weight(_text_:bibliographic in 3750) [ClassicSimilarity], result of:
          0.11977262 = score(doc=3750,freq=14.0), product of:
            0.17541347 = queryWeight, product of:
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.04505818 = queryNorm
            0.6828017 = fieldWeight in 3750, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.046875 = fieldNorm(doc=3750)
        0.014932672 = product of:
          0.029865343 = sum of:
            0.029865343 = weight(_text_:data in 3750) [ClassicSimilarity], result of:
              0.029865343 = score(doc=3750,freq=2.0), product of:
                0.14247625 = queryWeight, product of:
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.04505818 = queryNorm
                0.2096163 = fieldWeight in 3750, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3750)
          0.5 = coord(1/2)
      0.6 = coord(3/5)
    
    Abstract
    Bibliographic control is the process of creation, exchange, preservation, and use of data about information resources. Formal bibliographic control has been practiced for millennia, but modern techniques began to be developed and implemented in the nineteenth and twentieth centuries. A series of cataloging codes characterized this period. These codes governed the creation of library catalogs, first in book form, then on cards, and finally in electronic formats, including MAchine-Readable Cataloging (MARC). The period was also characterized by the rise of shared cataloging programs, allowing the development of resource-saving copy cataloging procedures. Such programs were assisted by the development of cataloging networks such as OCLC and RLG. The twentieth century saw progress in the theory of bibliographic control, including the 1961 Paris Principles, culminating with the early twenty-first century Statement of International Cataloguing Principles and IFLA's Functional Requirements for Bibliographic Records (FRBR). Toward the end of the period bibliographic control began to be applied to newly invented electronic media, as "metadata." Trends point toward continued development of collaborative and international approaches to bibliographic control.
  9. Austin, D.: ¬The exchange of subject information (1975) 0.15
    0.1474865 = product of:
      0.3687162 = sum of:
        0.26308668 = weight(_text_:readable in 3176) [ClassicSimilarity], result of:
          0.26308668 = score(doc=3176,freq=2.0), product of:
            0.2768342 = queryWeight, product of:
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.04505818 = queryNorm
            0.9503403 = fieldWeight in 3176, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.109375 = fieldNorm(doc=3176)
        0.105629526 = weight(_text_:bibliographic in 3176) [ClassicSimilarity], result of:
          0.105629526 = score(doc=3176,freq=2.0), product of:
            0.17541347 = queryWeight, product of:
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.04505818 = queryNorm
            0.6021745 = fieldWeight in 3176, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.109375 = fieldNorm(doc=3176)
      0.4 = coord(2/5)
    
    Source
    The interchange of bibliographic information in machine readable form. Ed.: R.E. Coward u. M. Yelland
  10. Guenther, R.S.: ¬The USMARC Format for Classification Data : development and implementation (1992) 0.15
    0.14710832 = product of:
      0.24518052 = sum of:
        0.15033525 = weight(_text_:readable in 2996) [ClassicSimilarity], result of:
          0.15033525 = score(doc=2996,freq=2.0), product of:
            0.2768342 = queryWeight, product of:
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.04505818 = queryNorm
            0.5430516 = fieldWeight in 2996, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.0625 = fieldNorm(doc=2996)
        0.060359728 = weight(_text_:bibliographic in 2996) [ClassicSimilarity], result of:
          0.060359728 = score(doc=2996,freq=2.0), product of:
            0.17541347 = queryWeight, product of:
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.04505818 = queryNorm
            0.34409973 = fieldWeight in 2996, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.0625 = fieldNorm(doc=2996)
        0.03448553 = product of:
          0.06897106 = sum of:
            0.06897106 = weight(_text_:data in 2996) [ClassicSimilarity], result of:
              0.06897106 = score(doc=2996,freq=6.0), product of:
                0.14247625 = queryWeight, product of:
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.04505818 = queryNorm
                0.48408815 = fieldWeight in 2996, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.0625 = fieldNorm(doc=2996)
          0.5 = coord(1/2)
      0.6 = coord(3/5)
    
    Abstract
    This paper discusses the newly developed USMARC Format for Classification Data. It reviews its potential uses within an online system and its development as one of the USMARC standards for representing bibliographic and related information in machine-readable form. It provides a summary of the fields in the format, and considers the prospects for its implementation.
    Object
    USMARC for classification data
  11. Keyser, P.d.: Conversie van bibliografische gegevens (1997) 0.15
    0.14710832 = product of:
      0.24518052 = sum of:
        0.15033525 = weight(_text_:readable in 96) [ClassicSimilarity], result of:
          0.15033525 = score(doc=96,freq=2.0), product of:
            0.2768342 = queryWeight, product of:
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.04505818 = queryNorm
            0.5430516 = fieldWeight in 96, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.0625 = fieldNorm(doc=96)
        0.060359728 = weight(_text_:bibliographic in 96) [ClassicSimilarity], result of:
          0.060359728 = score(doc=96,freq=2.0), product of:
            0.17541347 = queryWeight, product of:
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.04505818 = queryNorm
            0.34409973 = fieldWeight in 96, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.0625 = fieldNorm(doc=96)
        0.03448553 = product of:
          0.06897106 = sum of:
            0.06897106 = weight(_text_:data in 96) [ClassicSimilarity], result of:
              0.06897106 = score(doc=96,freq=6.0), product of:
                0.14247625 = queryWeight, product of:
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.04505818 = queryNorm
                0.48408815 = fieldWeight in 96, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.0625 = fieldNorm(doc=96)
          0.5 = coord(1/2)
      0.6 = coord(3/5)
    
    Abstract
    Programs for converting bibliographic data are of interest not only to libraries but also to researchers compiling bibliographies. However, few programs are currently available. In choosing a suitable program, care must be taken to ensure that it can identify all fields likely to be encountered and convert them to the required format. Optical scanning can provide a convenient solution for converting printed output to machine-readable format. Increasing acceptance of standardised formats will facilitate the exchange of data.
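    The field-identification-and-mapping step this abstract describes can be sketched with a simple mapping table; the two toy formats and the tag names below are assumptions for illustration, not a real conversion program:

```python
# Illustrative mapping: source field tag -> MARC-like target tag.
FIELD_MAP = {"AU": "100", "TI": "245", "PY": "260"}

def convert_record(record: dict) -> dict:
    # Convert one record; every source field must be identified,
    # otherwise the conversion fails loudly rather than losing data.
    out = {}
    for tag, value in record.items():
        if tag not in FIELD_MAP:
            raise ValueError(f"unmapped field: {tag}")
        out[FIELD_MAP[tag]] = value
    return out

print(convert_record({"AU": "Keyser, P. de",
                      "TI": "Conversie van bibliografische gegevens",
                      "PY": "1997"}))
```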
    Footnote
    Übers. des Titels: Conversion of bibliographic data
  12. Nasatir, M.: ¬The cataloging and classification of machine-readable data files : Pt.3: Subject description of machine-readable data files (1982) 0.14
    0.14445807 = product of:
      0.36114517 = sum of:
        0.3189092 = weight(_text_:readable in 3921) [ClassicSimilarity], result of:
          0.3189092 = score(doc=3921,freq=4.0), product of:
            0.2768342 = queryWeight, product of:
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.04505818 = queryNorm
            1.1519864 = fieldWeight in 3921, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.09375 = fieldNorm(doc=3921)
        0.042235978 = product of:
          0.084471956 = sum of:
            0.084471956 = weight(_text_:data in 3921) [ClassicSimilarity], result of:
              0.084471956 = score(doc=3921,freq=4.0), product of:
                0.14247625 = queryWeight, product of:
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.04505818 = queryNorm
                0.5928845 = fieldWeight in 3921, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.09375 = fieldNorm(doc=3921)
          0.5 = coord(1/2)
      0.4 = coord(2/5)
    
  13. Nasatir, M.: ¬The cataloging and classification of machine-readable data files : Part 1: a case for incorporating records of machine-readable data files into the public catalog (1981) 0.14
    0.14445807 = product of:
      0.36114517 = sum of:
        0.3189092 = weight(_text_:readable in 269) [ClassicSimilarity], result of:
          0.3189092 = score(doc=269,freq=4.0), product of:
            0.2768342 = queryWeight, product of:
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.04505818 = queryNorm
            1.1519864 = fieldWeight in 269, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.09375 = fieldNorm(doc=269)
        0.042235978 = product of:
          0.084471956 = sum of:
            0.084471956 = weight(_text_:data in 269) [ClassicSimilarity], result of:
              0.084471956 = score(doc=269,freq=4.0), product of:
                0.14247625 = queryWeight, product of:
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.04505818 = queryNorm
                0.5928845 = fieldWeight in 269, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.1620505 = idf(docFreq=5088, maxDocs=44218)
                  0.09375 = fieldNorm(doc=269)
          0.5 = coord(1/2)
      0.4 = coord(2/5)
    
  14. El-Sherbini, M.A.: Cataloging and classification : review of the literature 2005-06 (2008) 0.14
    0.14106841 = product of:
      0.23511402 = sum of:
        0.15033525 = weight(_text_:readable in 249) [ClassicSimilarity], result of:
          0.15033525 = score(doc=249,freq=2.0), product of:
            0.2768342 = queryWeight, product of:
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.04505818 = queryNorm
            0.5430516 = fieldWeight in 249, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              6.1439276 = idf(docFreq=257, maxDocs=44218)
              0.0625 = fieldNorm(doc=249)
        0.060359728 = weight(_text_:bibliographic in 249) [ClassicSimilarity], result of:
          0.060359728 = score(doc=249,freq=2.0), product of:
            0.17541347 = queryWeight, product of:
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.04505818 = queryNorm
            0.34409973 = fieldWeight in 249, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.893044 = idf(docFreq=2449, maxDocs=44218)
              0.0625 = fieldNorm(doc=249)
        0.024419045 = product of:
          0.04883809 = sum of:
            0.04883809 = weight(_text_:22 in 249) [ClassicSimilarity], result of:
              0.04883809 = score(doc=249,freq=2.0), product of:
                0.15778607 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04505818 = queryNorm
                0.30952093 = fieldWeight in 249, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=249)
          0.5 = coord(1/2)
      0.6 = coord(3/5)
    
    Abstract
    This paper reviews library literature on cataloging and classification published in 2005-06. It covers pertinent literature in the following areas: the future of cataloging; Functional Requirement for Bibliographic Records (FRBR); metadata and its applications and relation to Machine-Readable Cataloging (MARC); cataloging tools and standards; authority control; and recruitment, training, and the changing role of catalogers.
    Date
    10. 9.2000 17:38:22
  15. Svenonius, E.; Molto, M.: Automatic derivation of name access points in cataloging (1990) 0.14
    
    Abstract
    Reports the results of research designed to explore the feasibility of automatically deriving name access points from machine readable title pages of English language monographs. Results show that approximately 88% of the access points selected by the Library of Congress or the National Library of Medicine could be automatically derived from title page data. These results have implications for the design of bibliographic standards and on-line catalogues.
  16. Barry, R.K.: The role of character sets in library automation : the development of 8 bit sets and Unicode (1997) 0.14
    
    Abstract
    Offers a basic understanding of coded character sets in machine-readable data, with a particular focus on their role in library automation. Discusses character sets in general, assesses the many 8-bit character sets now in use and the impact of the development of Unicode, the universal 16-bit character set. Considers parallels with the development of alphabets.
    Source
    International cataloguing and bibliographic control. 26(1997) no.1, S.14-17
  17. Guenther, R.S.: Automating the Library of Congress Classification Scheme : implementation of the USMARC format for classification data (1996) 0.14
    
    Abstract
    Discusses potential uses for classification data in machine-readable form and the reasons for developing a standard, the USMARC Format for Classification Data, which allows classification data to interact with other USMARC bibliographic and authority data. The development, structure, content, and use of the standard are reviewed, with implementation decisions for the Library of Congress Classification scheme noted. The author examines the implementation of USMARC classification at LC, the conversion of the schedules, and the functionality of the software being used. Problems in the effort are explored, and enhancements desired for the online classification system are considered.
    Object
    USMARC for classification data
  18. Fayen, E.G.: Loading local machine readable data files : issues, problems, and answers (1989) 0.14
    
  19. Dodd, S.A.: Cataloging machine-readable data files : an interpretive manual (1982) 0.14
    
  20. Schipman, P.B.: Generation and uses of machine-readable data bases (1975) 0.14
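The relevance figure printed after each title (e.g. 0.14 for entries 17-20) is produced by Lucene's ClassicSimilarity, a TF-IDF model: each matching term contributes queryWeight × fieldWeight, where queryWeight = idf × queryNorm, fieldWeight = tf × idf × fieldNorm, tf = √freq and idf = 1 + ln(maxDocs / (docFreq + 1)); the sum is scaled by a coordination factor for the fraction of query terms matched. A minimal sketch, reproducing entry 18's score from the index statistics in the engine's explain output (the docFreq and norm values below are taken from that output, not invented):

```python
import math

def classic_term_score(freq, doc_freq, max_docs, query_norm, field_norm):
    """One term's contribution under Lucene's ClassicSimilarity (TF-IDF)."""
    tf = math.sqrt(freq)                            # tf = sqrt(term frequency)
    idf = 1.0 + math.log(max_docs / (doc_freq + 1)) # idf = 1 + ln(N / (df + 1))
    query_weight = idf * query_norm
    field_weight = tf * idf * field_norm
    return query_weight * field_weight

# Entry 18 ("Loading local machine readable data files", 1989) matched two of
# five query terms, "readable" and "data", so coord = 2/5; the "data" clause
# additionally carries coord(1/2) from its nested disjunction.
readable = classic_term_score(2.0, 257, 44218, 0.04505818, 0.125)
data = classic_term_score(2.0, 5088, 44218, 0.04505818, 0.125)
score = (readable + 0.5 * data) * (2 / 5)

print(round(readable, 7))  # ≈ 0.3006705, matching the "readable" clause
print(round(score, 7))     # ≈ 0.1361964, displayed as 0.14
```

The fieldNorm of 0.125 for this record reflects ClassicSimilarity's length normalization (shorter fields score higher); entries with longer titles in this listing carry smaller norms such as 0.0625.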
    
