Search (5 results, page 1 of 1)

  • author_ss:"Becker, C."
  • type_ss:"a"
  1. Maemura, E.; Moles, N.; Becker, C.: Organizational assessment frameworks for digital preservation : a literature review and mapping (2017) 0.02
    0.024083475 = product of:
      0.0963339 = sum of:
        0.0963339 = weight(_text_:digital in 3743) [ClassicSimilarity], result of:
          0.0963339 = score(doc=3743,freq=10.0), product of:
            0.19770671 = queryWeight, product of:
              3.944552 = idf(docFreq=2326, maxDocs=44218)
              0.050121464 = queryNorm
            0.4872566 = fieldWeight in 3743, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              3.944552 = idf(docFreq=2326, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3743)
      0.25 = coord(1/4)
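    A note on the score breakdown: the tree above is Lucene's ClassicSimilarity (TF-IDF) explain output, where tf = sqrt(freq) and the query term's idf enters twice, once in queryWeight and once in fieldWeight. As a sanity check, the final score can be reproduced from the listed factors with a minimal Python sketch (the variable names mirror the tree):

      import math

      # Factors copied from the explain tree above (doc 3743, term "digital").
      freq, idf = 10.0, 3.944552
      query_norm, field_norm, coord = 0.050121464, 0.0390625, 0.25

      tf = math.sqrt(freq)                         # 3.1622777
      query_weight = idf * query_norm              # 0.19770671
      field_weight = tf * idf * field_norm         # 0.4872566
      print(coord * query_weight * field_weight)   # ~ 0.024083475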
    
    Abstract
As the field of digital preservation (DP) matures, there is an increasing need to systematically assess an organization's abilities to achieve its digital preservation goals, and a wide variety of assessment tools have been created for this purpose. This article aims to map the landscape of research in this area, evaluate the current maturity of knowledge on this central question in DP, and provide direction for future research. To do so, it reviews assessment frameworks in digital preservation through a systematic literature search and categorizes the literature by type of research. The analysis shows that publication output around assessment in digital preservation has increased markedly over time, but most existing work focuses on developing new models rather than on the rigorous evaluation and validation of existing frameworks. Significant gaps are present in the application of robust conceptual foundations and design methods, and in the level of empirical evidence available to enable the evaluation and validation of assessment models. The analysis and comparison with other fields suggest that the design of assessment models in DP should be studied rigorously in both theory and practice, and that the development of future models will benefit from applying existing methods, processes, and principles for model design.
  2. Becker, C.; Rauber, A.: Decision criteria in digital preservation : what to measure and how (2011) 0.02
    0.021540914 = product of:
      0.086163655 = sum of:
        0.086163655 = weight(_text_:digital in 4456) [ClassicSimilarity], result of:
          0.086163655 = score(doc=4456,freq=8.0), product of:
            0.19770671 = queryWeight, product of:
              3.944552 = idf(docFreq=2326, maxDocs=44218)
              0.050121464 = queryNorm
            0.4358155 = fieldWeight in 4456, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              3.944552 = idf(docFreq=2326, maxDocs=44218)
              0.0390625 = fieldNorm(doc=4456)
      0.25 = coord(1/4)
    
    Abstract
The enormous amount of valuable information that is produced today and needs to be made available over the long term has led to increased efforts in scalable, automated solutions for long-term digital preservation. The mission of preservation planning is to define the optimal actions to ensure future access to digital content and react to changes that require adjustments in repository operations. Considerable effort has been spent in the past on defining, implementing, and validating a framework and system for preservation planning. This article sheds light on the actual decision criteria and influence factors to be considered when choosing digital preservation actions. It is based on an extensive evaluation of case studies on preservation planning for a range of different types of objects with partners from different institutional backgrounds. We categorize decision criteria drawn from a number of real-world decision-making instances into a taxonomy. We show that a majority of the criteria can be evaluated by applying automated measurements under realistic conditions, and demonstrate that controlled experimentation and automated measurements can be used to substantially improve repeatability of decisions and reduce the effort needed to evaluate preservation components. The presented measurement framework enables scalable preservation and monitoring and supports trust in preservation decisions because extensive evidence is produced in a reproducible, automated way and documented as the basis of decision making in a standardized form.
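    The claim that most decision criteria can be evaluated by automated measurement can be pictured as binding each criterion to a measurement function applied to a preservation outcome. A minimal, hypothetical Python sketch; the Criterion structure and the example criterion below are illustrative, not the paper's actual taxonomy:

      from dataclasses import dataclass
      from typing import Callable

      @dataclass
      class Criterion:
          name: str
          measure: Callable[[dict], float]      # automated measurement on an outcome
          acceptable: Callable[[float], bool]   # threshold or exact-match check

      def evaluate(criteria, outcome):
          """Apply each automated measurement and record the evidence."""
          results = {}
          for c in criteria:
              value = c.measure(outcome)
              results[c.name] = (value, c.acceptable(value))
          return results

      criteria = [
          Criterion("image width preserved",
                    measure=lambda o: abs(o["width_out"] - o["width_in"]),
                    acceptable=lambda v: v == 0),
      ]
      print(evaluate(criteria, {"width_in": 800, "width_out": 800}))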
  3. Becker, C.; Maemura, E.; Moles, N.: The design and use of assessment frameworks in digital curation (2020) 0.02
    0.01865498 = product of:
      0.07461992 = sum of:
        0.07461992 = weight(_text_:digital in 5508) [ClassicSimilarity], result of:
          0.07461992 = score(doc=5508,freq=6.0), product of:
            0.19770671 = queryWeight, product of:
              3.944552 = idf(docFreq=2326, maxDocs=44218)
              0.050121464 = queryNorm
            0.37742734 = fieldWeight in 5508, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.944552 = idf(docFreq=2326, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5508)
      0.25 = coord(1/4)
    
    Abstract
    To understand and improve their current abilities and maturity, organizations use diagnostic instruments such as maturity models and other assessment frameworks. Increasing numbers of these are being developed in digital curation. Their central role in strategic decision making raises the need to evaluate their fitness for this purpose and develop guidelines for their design and evaluation. A comprehensive review of assessment frameworks, however, found little evidence that existing assessment frameworks have been evaluated systematically, and no methods for their evaluation. This article proposes a new methodology for evaluating the design and use of assessment frameworks. It builds on prior research on maturity models and combines analytic and empirical evaluation methods to explain how the design of assessment frameworks influences their application in practice, and how the design process can effectively take this into account. We present the evaluation methodology and its application to two frameworks. The evaluation results lead to guidelines for the design process of assessment frameworks in digital curation. The methodology provides insights to the designers of the evaluated frameworks that they can consider in future revisions; methodical guidance for researchers in the field; and practical insights and words of caution to organizations keen on diagnosing their abilities.
  4. Duretec, K.; Becker, C.: Format technology lifecycle analysis (2017) 0.01
    0.012924549 = product of:
      0.051698197 = sum of:
        0.051698197 = weight(_text_:digital in 3836) [ClassicSimilarity], result of:
          0.051698197 = score(doc=3836,freq=2.0), product of:
            0.19770671 = queryWeight, product of:
              3.944552 = idf(docFreq=2326, maxDocs=44218)
              0.050121464 = queryNorm
            0.26148933 = fieldWeight in 3836, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.944552 = idf(docFreq=2326, maxDocs=44218)
              0.046875 = fieldNorm(doc=3836)
      0.25 = coord(1/4)
    
    Abstract
The lifecycles of format technology have been a defining concern for digital stewardship research and practice. However, little evidence exists to provide robust methods for assessing the state of any given format technology and describing its evolution over time. This article introduces relevant models from diffusion theory and market research and presents a replicable analysis method to compute models of technology evolution. Data cleansing and the combination of multiple data sources enable the application of nonlinear regression to estimate the parameters of the Bass diffusion model on format technology market lifecycles. Through its application to a longitudinal data set from the UK Web Archive, we demonstrate that the method produces reliable results and show that the Bass model can be used to describe format lifecycles. Analyzing adoption patterns across market segments yields new insights into how the diffusion of formats and products such as applications occurs over time. The analysis provides a stepping stone to a more robust and evidence-based approach to modeling technology evolution.
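    As a rough illustration of the fitting step, the sketch below estimates the Bass diffusion model's parameters (market size m, innovation coefficient p, imitation coefficient q) by nonlinear regression on synthetic adoption counts; the paper's actual UK Web Archive data set and cleansing pipeline are not reproduced here:

      import numpy as np
      from scipy.optimize import curve_fit

      def bass_cumulative(t, m, p, q):
          """Cumulative Bass adoptions at time t."""
          e = np.exp(-(p + q) * t)
          return m * (1 - e) / (1 + (q / p) * e)

      t = np.arange(20.0)                        # e.g. years since a format's release
      rng = np.random.default_rng(42)
      observed = bass_cumulative(t, 1000, 0.03, 0.4) + rng.normal(0, 10, t.size)

      (m, p, q), _ = curve_fit(bass_cumulative, t, observed, p0=(900, 0.01, 0.1))
      print(f"m={m:.0f}, p={p:.3f}, q={q:.3f}")  # recovers roughly (1000, 0.03, 0.4)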
  5. Bizer, C.; Lehmann, J.; Kobilarov, G.; Auer, S.; Becker, C.; Cyganiak, R.; Hellmann, S.: DBpedia : a crystallization point for the Web of Data (2009) 0.01
    0.0061664553 = product of:
      0.024665821 = sum of:
        0.024665821 = product of:
          0.049331643 = sum of:
            0.049331643 = weight(_text_:project in 1643) [ClassicSimilarity], result of:
              0.049331643 = score(doc=1643,freq=2.0), product of:
                0.21156175 = queryWeight, product of:
                  4.220981 = idf(docFreq=1764, maxDocs=44218)
                  0.050121464 = queryNorm
                0.23317845 = fieldWeight in 1643, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.220981 = idf(docFreq=1764, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1643)
          0.5 = coord(1/2)
      0.25 = coord(1/4)
    
    Abstract
The DBpedia project is a community effort to extract structured information from Wikipedia and to make this information accessible on the Web. The resulting DBpedia knowledge base currently describes over 2.6 million entities. For each of these entities, DBpedia defines a globally unique identifier that can be dereferenced over the Web into a rich RDF description of the entity, including human-readable definitions in 30 languages, relationships to other resources, classifications in four concept hierarchies, and various facts, as well as data-level links to other Web data sources describing the entity. Over the last year, an increasing number of data publishers have begun to set data-level links to DBpedia resources, making DBpedia a central interlinking hub for the emerging Web of Data. Currently, the Web of interlinked data sources around DBpedia provides approximately 4.7 billion pieces of information and covers domains such as geographic information, people, companies, films, music, genes, drugs, books, and scientific publications. This article describes the extraction of the DBpedia knowledge base and the current status of interlinking DBpedia with other data sources on the Web, and gives an overview of applications that facilitate the Web of Data around DBpedia.
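    To make the dereferencing step concrete: a DBpedia identifier resolves via HTTP content negotiation to an RDF description of the entity. A minimal sketch, assuming the Python rdflib library; the resource URI is an illustrative example:

      from rdflib import Graph

      # Fetching the identifier content-negotiates to an RDF serialization,
      # which rdflib parses into a graph of (subject, predicate, object) triples.
      g = Graph()
      g.parse("http://dbpedia.org/resource/Berlin")

      for s, p, o in list(g)[:5]:
          print(s, p, o)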