Search (224 results, page 1 of 12)

  • theme_ss:"Metadaten"
  1. Philips, J.T.: Metadata - information about electronic records (1995) 0.05
    0.048837826 = product of:
      0.09767565 = sum of:
        0.08266887 = weight(_text_:management in 4556) [ClassicSimilarity], result of:
          0.08266887 = score(doc=4556,freq=8.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.5958457 = fieldWeight in 4556, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.0625 = fieldNorm(doc=4556)
        0.015006783 = product of:
          0.04502035 = sum of:
            0.04502035 = weight(_text_:29 in 4556) [ClassicSimilarity], result of:
              0.04502035 = score(doc=4556,freq=2.0), product of:
                0.14479601 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.041162275 = queryNorm
                0.31092256 = fieldWeight in 4556, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0625 = fieldNorm(doc=4556)
          0.33333334 = coord(1/3)
      0.5 = coord(2/4)
    
    Abstract
    Metadata is a term describing the information required to document the characteristics of information contained within databases. Describes the elements that make up metadata. A number of software tools exist to help apply document management principles to electronic records, but they have, so far, been inadequately applied. Describes 2 initiatives currently under way to develop software to automate many records management functions. Understanding document management principles as applied to electronic records is vital to records managers
    Source
    Records management quarterly. 29(1995) no.4, S.53-55
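    The scoring breakdown above follows Lucene's ClassicSimilarity explain format: each matching clause contributes queryWeight x fieldWeight, where tf = sqrt(freq), queryWeight = idf x queryNorm and fieldWeight = tf x idf x fieldNorm, and coord factors scale the sum by the fraction of query clauses that matched. The following minimal Python sketch reproduces the score of result 1 from the printed values; the variable names are illustrative and not part of the search system.

    import math

    # Values copied from the explain tree of result 1 (doc 4556).
    query_norm = 0.041162275

    def clause_weight(freq, idf, field_norm):
        # ClassicSimilarity clause contribution: queryWeight * fieldWeight,
        # with tf = sqrt(freq).
        query_weight = idf * query_norm                     # idf * queryNorm
        field_weight = math.sqrt(freq) * idf * field_norm   # tf * idf * fieldNorm
        return query_weight * field_weight

    w_management = clause_weight(freq=8.0, idf=3.3706124, field_norm=0.0625)  # ~0.08266887
    w_29 = clause_weight(freq=2.0, idf=3.5176873, field_norm=0.0625)          # ~0.04502035

    # The "_text_:29" clause sits in a nested group where 1 of 3 sub-clauses
    # matched, hence coord(1/3); the outer query matched 2 of 4 clauses, coord(2/4).
    inner = w_management + w_29 * (1 / 3)   # ~0.09767565
    score = inner * (2 / 4)                 # ~0.048837826
    print(round(score, 9))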
  2. Kurth, M.; Ruddy, D.; Rupp, N.: Repurposing MARC metadata : using digital project experience to develop a metadata management design (2004) 0.04
    0.040236898 = product of:
      0.080473796 = sum of:
        0.069319956 = weight(_text_:management in 4748) [ClassicSimilarity], result of:
          0.069319956 = score(doc=4748,freq=10.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.49963182 = fieldWeight in 4748, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.046875 = fieldNorm(doc=4748)
        0.0111538395 = product of:
          0.03346152 = sum of:
            0.03346152 = weight(_text_:22 in 4748) [ClassicSimilarity], result of:
              0.03346152 = score(doc=4748,freq=2.0), product of:
                0.14414327 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.041162275 = queryNorm
                0.23214069 = fieldWeight in 4748, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4748)
          0.33333334 = coord(1/3)
      0.5 = coord(2/4)
    
    Abstract
    Metadata and information technology staff in libraries that are building digital collections typically extract and manipulate MARC metadata sets to provide access to digital content via non-MARC schemes. Metadata processing in these libraries involves defining the relationships between metadata schemes, moving metadata between schemes, and coordinating the intellectual activity and physical resources required to create and manipulate metadata. Actively managing the non-MARC metadata resources used to build digital collections is something most of these libraries have only begun to do. This article proposes strategies for managing MARC metadata repurposing efforts as the first step in a coordinated approach to library metadata management. Guided by lessons learned from Cornell University library mapping and transformation activities, the authors apply the literature of data resource management to library metadata management and propose a model for managing MARC metadata repurposing processes through the implementation of a metadata management design.
    Source
    Library hi tech. 22(2004) no.2, S.144-152
  3. Stubley, P.: Cataloguing standards and metadata for e-commerce (1999) 0.03
    0.025574377 = product of:
      0.10229751 = sum of:
        0.10229751 = weight(_text_:management in 1915) [ClassicSimilarity], result of:
          0.10229751 = score(doc=1915,freq=4.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.73732144 = fieldWeight in 1915, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.109375 = fieldNorm(doc=1915)
      0.25 = coord(1/4)
    
    Source
    Information management report. 1999, Dec., S.16-18
    Theme
    Information Resources Management
  4. Lupovici, C.: ¬L'¬information secondaire du document primaire : format MARC ou SGML? (1997) 0.02
    0.024649281 = product of:
      0.049298562 = sum of:
        0.03616763 = weight(_text_:management in 892) [ClassicSimilarity], result of:
          0.03616763 = score(doc=892,freq=2.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.2606825 = fieldWeight in 892, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.0546875 = fieldNorm(doc=892)
        0.013130935 = product of:
          0.039392803 = sum of:
            0.039392803 = weight(_text_:29 in 892) [ClassicSimilarity], result of:
              0.039392803 = score(doc=892,freq=2.0), product of:
                0.14479601 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.041162275 = queryNorm
                0.27205724 = fieldWeight in 892, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=892)
          0.33333334 = coord(1/3)
      0.5 = coord(2/4)
    
    Abstract
    Secondary information, e.g. MARC-based bibliographic records, comprises structured data for identifying, tagging, retrieving and managing primary documents. SGML, the standard format for coding the content and structure of primary documents, was introduced in 1986 as a publishing tool but is now being applied to bibliographic records. SGML now comprises standard definitions (DTDs) for books, serials, articles and mathematical formulae. A simplified version (HTML) is used for Web pages. Pilot projects to develop SGML as a standard for bibliographic exchange include the Dublin Core, listing 13 descriptive elements for Internet documents; the French GRISELI programme using SGML for exchanging grey literature; and US experiments on reformatting USMARC for use with SGML-based records
    Date
    29. 1.1996 16:50:24
  5. Clark, C.: Audio-visual resource discovery on the Web (1998) 0.02
    0.024649281 = product of:
      0.049298562 = sum of:
        0.03616763 = weight(_text_:management in 3201) [ClassicSimilarity], result of:
          0.03616763 = score(doc=3201,freq=2.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.2606825 = fieldWeight in 3201, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.0546875 = fieldNorm(doc=3201)
        0.013130935 = product of:
          0.039392803 = sum of:
            0.039392803 = weight(_text_:29 in 3201) [ClassicSimilarity], result of:
              0.039392803 = score(doc=3201,freq=2.0), product of:
                0.14479601 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.041162275 = queryNorm
                0.27205724 = fieldWeight in 3201, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=3201)
          0.33333334 = coord(1/3)
      0.5 = coord(2/4)
    
    Abstract
    Outlines the metadata standard known as the Dublin Core, as well as the Instructional Management Systems (IMS) Project, an American Educom NLII initiative which is developing a specification and software for managing online learning resources. Gives the list of fields, with brief descriptions, from the IMS Metadata Dictionary, and describes the UK Performing Arts Data Service (PADS) workshops on moving image and sound resources, with particular reference to the use of the Dublin Core for cataloguing sound recordings. The slow rate of audiovisual progress is touched on; 5 other relevant initiatives connected with metadata are listed
    Source
    IASA journal. 1998, no.11, S.18-29
  6. Cole, T.: Qualified Dublin Core metadata for online journal articles (2002) 0.02
    0.024649281 = product of:
      0.049298562 = sum of:
        0.03616763 = weight(_text_:management in 962) [ClassicSimilarity], result of:
          0.03616763 = score(doc=962,freq=2.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.2606825 = fieldWeight in 962, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.0546875 = fieldNorm(doc=962)
        0.013130935 = product of:
          0.039392803 = sum of:
            0.039392803 = weight(_text_:29 in 962) [ClassicSimilarity], result of:
              0.039392803 = score(doc=962,freq=2.0), product of:
                0.14479601 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.041162275 = queryNorm
                0.27205724 = fieldWeight in 962, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=962)
          0.33333334 = coord(1/3)
      0.5 = coord(2/4)
    
    Abstract
    Timothy is the Mathematics Librarian and Associate Professor of Library Administration at the University of Illinois at Urbana-Champaign, IL. He describes his experience with encoding Qualified Dublin Core (DCQ) metadata in RDF/XML and concludes that, although there are still some issues to be resolved, expressing DCQ metadata in RDF/XML is generally worthwhile. However, in some cases the initial investment may not be warranted. The draft DCMI guidelines for expressing DCQ metadata in RDF/XML are adequate for generating DCQ/RDF metadata instances. The current need is to develop applications that will use these metadata instances to enhance resource management and discovery.
    Date
    28. 8.2002 19:38:29
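    To make the encoding discussed in this entry concrete, here is a minimal, hypothetical sketch of expressing a few Qualified Dublin Core statements about a journal article in RDF/XML with Python's rdflib. The article URI and values are invented for illustration, and the DCMI guidelines the author evaluates cover further conventions (element refinements, encoding schemes) not shown here.

    from rdflib import Graph, Literal, URIRef
    from rdflib.namespace import DCTERMS

    g = Graph()
    g.bind("dcterms", DCTERMS)

    # Hypothetical article URI and values, for illustration only.
    article = URIRef("http://example.org/articles/12345")
    g.add((article, DCTERMS.title, Literal("Qualified Dublin Core metadata for online journal articles")))
    g.add((article, DCTERMS.creator, Literal("Cole, T.")))
    g.add((article, DCTERMS.issued, Literal("2002")))
    g.add((article, DCTERMS.isPartOf, URIRef("http://example.org/journals/42")))

    # RDF/XML serialization (rdflib >= 6 returns a str).
    print(g.serialize(format="xml"))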
  7. Desconnets, J.-C.; Chahdi, H.; Mougenot, I.: Application profile for earth observation images (2014) 0.02
    0.024649281 = product of:
      0.049298562 = sum of:
        0.03616763 = weight(_text_:management in 1573) [ClassicSimilarity], result of:
          0.03616763 = score(doc=1573,freq=2.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.2606825 = fieldWeight in 1573, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1573)
        0.013130935 = product of:
          0.039392803 = sum of:
            0.039392803 = weight(_text_:29 in 1573) [ClassicSimilarity], result of:
              0.039392803 = score(doc=1573,freq=2.0), product of:
                0.14479601 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.041162275 = queryNorm
                0.27205724 = fieldWeight in 1573, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1573)
          0.33333334 = coord(1/3)
      0.5 = coord(2/4)
    
    Abstract
    Based on the concept of an application profile as proposed by the Dublin Core initiative, the work presented in this manuscript proposes an application profile for Earth Observation images. This approach aims to provide an open and extensible model facilitating the sharing and management of distributed images within decentralized architectures. It is intended eventually to cover the needs of discovery, localization, consultation, preservation and processing of data for decision support. We use the Singapore Framework recommendations to build the application profile. A particular focus on the formalization and representation of the Description Set Profile (DSP) in RDF is proposed.
    Source
    Metadata and semantics research: 8th Research Conference, MTSR 2014, Karlsruhe, Germany, November 27-29, 2014, Proceedings. Eds.: S. Closs et al
  8. Heng, G.; Cole, T.W.; Tian, T.(C.); Han, M.-J.: Rethinking authority reconciliation process (2022) 0.02
    0.024649281 = product of:
      0.049298562 = sum of:
        0.03616763 = weight(_text_:management in 727) [ClassicSimilarity], result of:
          0.03616763 = score(doc=727,freq=2.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.2606825 = fieldWeight in 727, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.0546875 = fieldNorm(doc=727)
        0.013130935 = product of:
          0.039392803 = sum of:
            0.039392803 = weight(_text_:29 in 727) [ClassicSimilarity], result of:
              0.039392803 = score(doc=727,freq=2.0), product of:
                0.14479601 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.041162275 = queryNorm
                0.27205724 = fieldWeight in 727, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=727)
          0.33333334 = coord(1/3)
      0.5 = coord(2/4)
    
    Abstract
    Entity identity management and name reconciliation are intrinsic to both Linked Open Data (LOD) and traditional library authority control. Does this mean that LOD sources can facilitate authority control? This Emblematica Online case study examines the utility of five LOD sources for name reconciliation, comparing design differences regarding ontologies, linking models, and entity properties. It explores the challenges of name reconciliation in the LOD environment and provides lessons learned during a semi-automated name reconciliation process. It also briefly discusses the potential values and benefits of LOD authorities to the authority reconciliation process itself and library services in general.
    Date
    29. 9.2022 17:15:12
  9. Metadata and semantics research : 8th Research Conference, MTSR 2014, Karlsruhe, Germany, November 27-29, 2014, Proceedings (2014) 0.02
    0.02295703 = product of:
      0.04591406 = sum of:
        0.036534823 = weight(_text_:management in 2192) [ClassicSimilarity], result of:
          0.036534823 = score(doc=2192,freq=4.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.2633291 = fieldWeight in 2192, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2192)
        0.00937924 = product of:
          0.02813772 = sum of:
            0.02813772 = weight(_text_:29 in 2192) [ClassicSimilarity], result of:
              0.02813772 = score(doc=2192,freq=2.0), product of:
                0.14479601 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.041162275 = queryNorm
                0.19432661 = fieldWeight in 2192, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2192)
          0.33333334 = coord(1/3)
      0.5 = coord(2/4)
    
    LCSH
    Database management
    Subject
    Database management
  10. Hooland, S. van; Bontemps, Y.; Kaufman, S.: Answering the call for more accountability : applying data profiling to museum metadata (2008) 0.02
    0.021077333 = product of:
      0.042154666 = sum of:
        0.031000827 = weight(_text_:management in 2644) [ClassicSimilarity], result of:
          0.031000827 = score(doc=2644,freq=2.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.22344214 = fieldWeight in 2644, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.046875 = fieldNorm(doc=2644)
        0.0111538395 = product of:
          0.03346152 = sum of:
            0.03346152 = weight(_text_:22 in 2644) [ClassicSimilarity], result of:
              0.03346152 = score(doc=2644,freq=2.0), product of:
                0.14414327 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.041162275 = queryNorm
                0.23214069 = fieldWeight in 2644, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=2644)
          0.33333334 = coord(1/3)
      0.5 = coord(2/4)
    
    Abstract
    Although the issue of metadata quality is recognized as an important topic within the metadata research community, the cultural heritage sector has been slow to develop methodologies, guidelines and tools for addressing this topic in practice. This paper concentrates on metadata quality specifically within the museum sector and describes the potential of data-profiling techniques for metadata quality evaluation. A case study illustrates the application of a general-purpose data-profiling tool to a large collection of metadata records from an ethnographic collection. After an analysis of the results of the case study, the paper reviews further steps in our research and presents the implementation of a metadata quality tool within open-source collection management software.
    Source
    Metadata for semantic and social applications : proceedings of the International Conference on Dublin Core and Metadata Applications, Berlin, 22 - 26 September 2008, DC 2008: Berlin, Germany / ed. by Jane Greenberg and Wolfgang Klas
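    The data-profiling idea described above can be pictured with a small, hypothetical Python sketch: for each metadata field it reports completeness and distinct-value counts across a set of records. The field names and records are invented; the paper itself uses a general-purpose profiling tool rather than a script like this.

    from collections import Counter

    # Invented example records standing in for exported museum metadata.
    records = [
        {"title": "Mask", "creator": "", "date": "1890", "material": "wood"},
        {"title": "Mask", "creator": "unknown", "date": "", "material": "wood"},
        {"title": "Drum", "creator": "unknown", "date": "ca. 1900", "material": ""},
    ]

    def profile(records):
        # Per-field completeness, distinct values and most common values.
        fields = {f for r in records for f in r}
        report = {}
        for f in sorted(fields):
            values = [r.get(f, "").strip() for r in records]
            filled = [v for v in values if v]
            report[f] = {
                "completeness": len(filled) / len(records),
                "distinct_values": len(set(filled)),
                "top_values": Counter(filled).most_common(3),
            }
        return report

    for field, stats in profile(records).items():
        print(field, stats)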
  11. Jun, W.: ¬A knowledge network constructed by integrating classification, thesaurus and metadata in a digital library (2003) 0.02
    0.018365625 = product of:
      0.03673125 = sum of:
        0.02922786 = weight(_text_:management in 1254) [ClassicSimilarity], result of:
          0.02922786 = score(doc=1254,freq=4.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.21066327 = fieldWeight in 1254, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.03125 = fieldNorm(doc=1254)
        0.0075033917 = product of:
          0.022510175 = sum of:
            0.022510175 = weight(_text_:29 in 1254) [ClassicSimilarity], result of:
              0.022510175 = score(doc=1254,freq=2.0), product of:
                0.14479601 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.041162275 = queryNorm
                0.15546128 = fieldWeight in 1254, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.03125 = fieldNorm(doc=1254)
          0.33333334 = coord(1/3)
      0.5 = coord(2/4)
    
    Abstract
    Knowledge management in digital libraries is a universal problem. Keyword-based searching is applied everywhere no matter whether the resources are indexed databases or full-text Web pages. In keyword matching, the valuable content description and indexing of the metadata, such as the subject descriptors and the classification notations, are merely treated as common keywords to be matched with the user query. Without the support of vocabulary control tools, such as classification systems and thesauri, the intelligent labor of content analysis, description and indexing in metadata production is seriously wasted. New retrieval paradigms are needed to exploit the potential of the metadata resources. Could classification and thesauri, which contain the condensed intelligence of generations of librarians, be used in a digital library to organize the networked information, especially metadata, to facilitate their usability and change the digital library into a knowledge management environment? To examine that question, we designed and implemented a new paradigm that incorporates a classification system, a thesaurus and metadata. The classification and the thesaurus are merged into a concept network, and the metadata are distributed into the nodes of the concept network according to their subjects. The abstract concept node instantiated with the related metadata records becomes a knowledge node. A coherent and consistent knowledge network is thus formed. It is not only a framework for resource organization but also a structure for knowledge navigation, retrieval and learning. We have built an experimental system based on the Chinese Classification and Thesaurus, which is the most comprehensive and authoritative in China, and we have incorporated more than 5000 bibliographic records in the computing domain from the Peking University Library. The result is encouraging. In this article, we review the tools, the architecture and the implementation of our experimental system, which is called Vision.
    Source
    Bulletin of the American Society for Information Science. 29(2003) no.2, S.24-28
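    A minimal sketch, under stated assumptions, of the architecture described in the abstract above: classification notations and thesaurus descriptors are merged into concept nodes, and metadata records are distributed into the nodes they are about. The class and attribute names are illustrative and are not taken from the Vision system.

    from dataclasses import dataclass, field

    @dataclass
    class ConceptNode:
        # A node merging a classification notation with its thesaurus descriptor.
        notation: str                                   # classification notation
        preferred_term: str                             # thesaurus descriptor
        broader: list = field(default_factory=list)     # links to broader concepts
        related: list = field(default_factory=list)     # associative links
        records: list = field(default_factory=list)     # metadata records filed here

        def attach(self, record: dict) -> None:
            # Distribute a metadata record into this concept node.
            self.records.append(record)

    # Illustrative fragment of a concept network.
    computing = ConceptNode("TP3", "Computing")
    retrieval = ConceptNode("TP391", "Information retrieval", broader=[computing])
    retrieval.attach({"title": "A knowledge network constructed by ...", "year": 2003})

    # Navigation: from a concept node, walk to broader concepts and their records.
    for parent in retrieval.broader:
        print(parent.preferred_term, len(parent.records), "records")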
  12. Chivers, A.; Feather, J.: ¬The management of digital data : a metadata approach (1998) 0.02
    0.018267412 = product of:
      0.07306965 = sum of:
        0.07306965 = weight(_text_:management in 2363) [ClassicSimilarity], result of:
          0.07306965 = score(doc=2363,freq=4.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.5266582 = fieldWeight in 2363, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.078125 = fieldNorm(doc=2363)
      0.25 = coord(1/4)
    
    Abstract
    Reports on a research study, conducted at the Department of Information and Library Studies, Loughborough University, to investigate the potential of metadata for universal data management and explore the attitudes of UK information professionals to these issues
  13. Godby, C.J.; Smith, D.; Childress, E.: Encoding application profiles in a computational model of the crosswalk (2008) 0.02
    0.017564444 = product of:
      0.035128888 = sum of:
        0.025834022 = weight(_text_:management in 2649) [ClassicSimilarity], result of:
          0.025834022 = score(doc=2649,freq=2.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.18620178 = fieldWeight in 2649, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.0390625 = fieldNorm(doc=2649)
        0.009294867 = product of:
          0.027884599 = sum of:
            0.027884599 = weight(_text_:22 in 2649) [ClassicSimilarity], result of:
              0.027884599 = score(doc=2649,freq=2.0), product of:
                0.14414327 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.041162275 = queryNorm
                0.19345059 = fieldWeight in 2649, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2649)
          0.33333334 = coord(1/3)
      0.5 = coord(2/4)
    
    Abstract
    OCLC's Crosswalk Web Service (Godby, Smith and Childress, 2008) formalizes the notion of crosswalk, as defined in Gill et al. (n.d.), by hiding technical details and permitting the semantic equivalences to emerge as the centerpiece. One outcome is that metadata experts, who are typically not programmers, can enter the translation logic into a spreadsheet that can be automatically converted into executable code. In this paper, we describe the implementation of the Dublin Core Terms application profile in the management of crosswalks involving MARC. A crosswalk that encodes an application profile extends the typical format with two columns: one that annotates the namespace to which an element belongs, and one that annotates a 'broader-narrower' relation between a pair of elements, such as Dublin Core coverage and Dublin Core Terms spatial. This information is sufficient to produce scripts written in OCLC's Semantic Equivalence Expression Language (or Seel), which are called from the Crosswalk Web Service to generate production-grade translations. With its focus on elements that can be mixed, matched, added, and redefined, the application profile (Heery and Patel, 2000) is a natural fit with the translation model of the Crosswalk Web Service, which attempts to achieve interoperability by mapping one pair of elements at a time.
    Source
    Metadata for semantic and social applications : proceedings of the International Conference on Dublin Core and Metadata Applications, Berlin, 22 - 26 September 2008, DC 2008: Berlin, Germany / ed. by Jane Greenberg and Wolfgang Klas
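    The two annotation columns described in the abstract above can be pictured with a small, hypothetical sketch: each crosswalk row names a source element, a target element, the namespace of the target, and an optional broader element. This only illustrates the idea; it does not reproduce OCLC's spreadsheet format or the Seel language.

    # Hypothetical crosswalk rows: MARC subfield -> Dublin Core (Terms) element,
    # annotated with the target namespace and an optional broader DC element.
    crosswalk = [
        {"source": "245$a", "target": "title",   "namespace": "dc",      "broader": None},
        {"source": "522$a", "target": "spatial", "namespace": "dcterms", "broader": "coverage"},
        {"source": "260$c", "target": "issued",  "namespace": "dcterms", "broader": "date"},
    ]

    def translate(marc_record):
        # Apply the crosswalk one element pair at a time.
        out = {}
        for row in crosswalk:
            value = marc_record.get(row["source"])
            if value is None:
                continue
            out[row["namespace"] + ":" + row["target"]] = value
            if row["broader"]:
                # Also expose the value under the broader Dublin Core element.
                out["dc:" + row["broader"]] = value
        return out

    print(translate({"245$a": "Repurposing MARC metadata", "260$c": "2004"}))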
  14. Büttner, G.: Integration audiovisueller Aufzeichnungen in das Records Management einer Organisation : ein konzeptionelles Metadatenmodell (2017) 0.02
    0.015661046 = product of:
      0.06264418 = sum of:
        0.06264418 = weight(_text_:management in 4202) [ClassicSimilarity], result of:
          0.06264418 = score(doc=4202,freq=6.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.45151538 = fieldWeight in 4202, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4202)
      0.25 = coord(1/4)
    
    Abstract
    This article presents a conceptual metadata model that is applicable to records of different media types. Organisations that routinely create both text-based and audiovisual records in the course of their work have to manage both media in the sense of records management. Metadata, including those of the central classification scheme for records, are a principal tool for this. Inspired by cross-media metadata models designed for shared access, a new model is proposed. It combines the hierarchical abstraction of the existing models with the principles of records management. The model can help organisations make decisions about metadata for their records.
  15. Chen, C.C.; Chen, H.H.; Chen, K.H.: ¬The design of the XML/Metadata management system (2000) 0.02
    0.015500413 = product of:
      0.062001653 = sum of:
        0.062001653 = weight(_text_:management in 4633) [ClassicSimilarity], result of:
          0.062001653 = score(doc=4633,freq=2.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.44688427 = fieldWeight in 4633, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.09375 = fieldNorm(doc=4633)
      0.25 = coord(1/4)
    
  16. White, H.: Examining scientific vocabulary : mapping controlled vocabularies with free text keywords (2013) 0.01
    0.014939285 = product of:
      0.05975714 = sum of:
        0.05975714 = product of:
          0.08963571 = sum of:
            0.04502035 = weight(_text_:29 in 1953) [ClassicSimilarity], result of:
              0.04502035 = score(doc=1953,freq=2.0), product of:
                0.14479601 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.041162275 = queryNorm
                0.31092256 = fieldWeight in 1953, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1953)
            0.044615358 = weight(_text_:22 in 1953) [ClassicSimilarity], result of:
              0.044615358 = score(doc=1953,freq=2.0), product of:
                0.14414327 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.041162275 = queryNorm
                0.30952093 = fieldWeight in 1953, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1953)
          0.6666667 = coord(2/3)
      0.25 = coord(1/4)
    
    Date
    29. 5.2015 19:09:22
  17. Wolfekuhler, M.R.; Punch, W.F.: Finding salient features for personal Web pages categories (1997) 0.01
    0.013071874 = product of:
      0.052287497 = sum of:
        0.052287497 = product of:
          0.07843124 = sum of:
            0.039392803 = weight(_text_:29 in 2673) [ClassicSimilarity], result of:
              0.039392803 = score(doc=2673,freq=2.0), product of:
                0.14479601 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.041162275 = queryNorm
                0.27205724 = fieldWeight in 2673, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2673)
            0.03903844 = weight(_text_:22 in 2673) [ClassicSimilarity], result of:
              0.03903844 = score(doc=2673,freq=2.0), product of:
                0.14414327 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.041162275 = queryNorm
                0.2708308 = fieldWeight in 2673, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=2673)
          0.6666667 = coord(2/3)
      0.25 = coord(1/4)
    
    Date
    1. 8.1996 22:08:06
    Source
    Computer networks and ISDN systems. 29(1997) no.8, S.1147-1156
  18. Integrating multiple overlapping metadata standards (1999) 0.01
    0.012917011 = product of:
      0.051668044 = sum of:
        0.051668044 = weight(_text_:management in 4052) [ClassicSimilarity], result of:
          0.051668044 = score(doc=4052,freq=2.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.37240356 = fieldWeight in 4052, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.078125 = fieldNorm(doc=4052)
      0.25 = coord(1/4)
    
    Abstract
    This special issue of JASIS addresses different applications of metadata standards in geospatial collections, education, historical costume collections, data management, and information retrieval, and explores future thinking on metadata standards for digital libraries
  19. Madsen, M.S.; Fogg, I.; Ruggles, C.: Metadata systems : integrative information technologies (1994) 0.01
    0.012787188 = product of:
      0.051148754 = sum of:
        0.051148754 = weight(_text_:management in 1055) [ClassicSimilarity], result of:
          0.051148754 = score(doc=1055,freq=4.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.36866072 = fieldWeight in 1055, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1055)
      0.25 = coord(1/4)
    
    Abstract
    Metadata systems are concerned with the management of data which describes other data (datasets, catalogues, or actual database management systems) and are presently the subject of intensive research. Metadata systems can be used to store richly detailed forms of information, to perform seamless, wide-ranging searches of information distributed across networks, and to integrate information stored in disparate repositories. Describes a model design and methods of implementation derived from the experience of the Leicester University Metadata Project. The approach utilizes the incorporation of semantic metadata in addition to resource metadata, resulting in a generally more powerful system than existing global directory services. A feature of this class of design is flexibility of implementation, with the ability to provide a coherent metadata system functioning above heterogeneous, autonomous, distributed databases
  20. Reed, B.: Metadata: core record or core business? (1997) 0.01
    0.012787188 = product of:
      0.051148754 = sum of:
        0.051148754 = weight(_text_:management in 1764) [ClassicSimilarity], result of:
          0.051148754 = score(doc=1764,freq=4.0), product of:
            0.13874207 = queryWeight, product of:
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.041162275 = queryNorm
            0.36866072 = fieldWeight in 1764, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.3706124 = idf(docFreq=4130, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1764)
      0.25 = coord(1/4)
    
    Abstract
    Raises critical questions about the way archivists should be managing the metadata associated with records management and recordkeeping processes in order to maintain records in their context through time in complex and rapidly changing environments. Explores some current models for specifying record metadata, drawing on the outcomes of research projects and standards activities. Speculates on the potential value of defining a core set of record metadata. The mapping of the overlap between the metadata specified in the Pittsburgh University and British Columbia University projects and the Australian Records Management Standards reveals a possible core set of record metadata; analysis of this set has shown that it would essentially enable the description of records as passive objects

Types

  • a 200
  • el 14
  • m 13
  • s 13
  • b 2
  • x 1