Search (2 results, page 1 of 1)

  • author_ss:"Coughlin, D.M."
  • author_ss:"Jansen, B.J."
  1. Coughlin, D.M.; Campbell, M.C.; Jansen, B.J.: A web analytics approach for appraising electronic resources in academic libraries (2016) 0.01
    0.013273074 = product of:
      0.026546149 = sum of:
        0.026546149 = product of:
          0.053092297 = sum of:
            0.053092297 = weight(_text_:web in 2770) [ClassicSimilarity], result of:
              0.053092297 = score(doc=2770,freq=6.0), product of:
                0.17002425 = queryWeight, product of:
                  3.2635105 = idf(docFreq=4597, maxDocs=44218)
                  0.052098576 = queryNorm
                0.3122631 = fieldWeight in 2770, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  3.2635105 = idf(docFreq=4597, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2770)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
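The indented tree above is Lucene explain output for ClassicSimilarity (TF-IDF) scoring. As a sketch of how the final score arises from the printed factors, the arithmetic can be reproduced directly; the variable names below are illustrative, not a Lucene API:

```python
import math

# Reconstructing the score for result 1 (doc 2770) from the explain output
# above, assuming Lucene ClassicSimilarity. The constants are copied from
# the printed tree; nothing here calls Lucene itself.
freq = 6.0                # "web" occurs 6 times in the field (termFreq)
idf = 3.2635105           # idf(docFreq=4597, maxDocs=44218)
query_norm = 0.052098576  # queryNorm
field_norm = 0.0390625    # fieldNorm(doc=2770), length normalization

tf = math.sqrt(freq)                    # ClassicSimilarity: tf = sqrt(termFreq)
query_weight = idf * query_norm         # 0.17002425 in the tree
field_weight = tf * idf * field_norm    # 0.3122631 in the tree
raw = query_weight * field_weight       # 0.053092297 in the tree

# Two coord(1/2) factors: only 1 of 2 query clauses matched at each level.
score = raw * 0.5 * 0.5
print(score)  # within float noise of the 0.013273074 shown above
```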
    
    Abstract
    University libraries provide access to thousands of journals and spend millions of dollars annually on electronic resources. Because several commercial entities supply these electronic resources, the result can be siloed systems and processes for evaluating their cost and usage, making it difficult to produce meaningful analytics. In this research, we examine a subset of journals from a large research library using a web analytics approach, with the goal of developing a framework for the analysis of library subscriptions. This foundational approach is implemented by comparing impact to cost, titles, and usage for the subset of journals and by assessing the funding area. Overall, the results highlight the benefit of a web analytics evaluation framework for university libraries and the impact of classifying titles by funding area. Furthermore, they show a statistically significant difference in both use and cost among the various funding areas when journals are ranked by cost, once the outliers of heavily used and highly expensive journals are eliminated. Future work includes refining this model for larger-scale analysis that ties metrics to library organizational objectives, and creating an online application to automate the analysis.
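The cost-versus-usage comparison by funding area that this abstract describes can be sketched as a small aggregation; the journal records, field names, and figures below are hypothetical illustrations, not data from the study:

```python
from collections import defaultdict

# Hypothetical journal records; titles, funding areas, costs, and download
# counts are invented for illustration only.
journals = [
    {"title": "J. Hypothetical A", "funding_area": "Engineering", "cost": 4200.0, "downloads": 1800},
    {"title": "J. Hypothetical B", "funding_area": "Engineering", "cost": 900.0,  "downloads": 150},
    {"title": "J. Hypothetical C", "funding_area": "Humanities",  "cost": 600.0,  "downloads": 40},
]

# Aggregate cost and usage per funding area, then rank areas by total cost
# and report cost per download as a simple value metric.
totals = defaultdict(lambda: {"cost": 0.0, "downloads": 0})
for j in journals:
    area = totals[j["funding_area"]]
    area["cost"] += j["cost"]
    area["downloads"] += j["downloads"]

for name, t in sorted(totals.items(), key=lambda kv: kv[1]["cost"], reverse=True):
    cpu = t["cost"] / t["downloads"]
    print(f"{name}: ${t['cost']:.2f} total, {t['downloads']} downloads, ${cpu:.2f}/download")
```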
  2. Coughlin, D.M.; Jansen, B.J.: Modeling journal bibliometrics to predict downloads and inform purchase decisions at university research libraries (2016) 0.01
    0.007663213 = product of:
      0.015326426 = sum of:
        0.015326426 = product of:
          0.030652853 = sum of:
            0.030652853 = weight(_text_:web in 3094) [ClassicSimilarity], result of:
              0.030652853 = score(doc=3094,freq=2.0), product of:
                0.17002425 = queryWeight, product of:
                  3.2635105 = idf(docFreq=4597, maxDocs=44218)
                  0.052098576 = queryNorm
                0.18028519 = fieldWeight in 3094, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.2635105 = idf(docFreq=4597, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3094)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    University libraries provide access to thousands of online journals and other content, spending millions of dollars annually on these electronic resources. Providing access to these online resources is costly, and it is difficult both to analyze the value of this content to the institution and to discern which journals comparatively provide more value. In this research, we examine 1,510 journals from a large research university library, representing more than 40% of the university's annual subscription cost for electronic resources at the time of the study. We use a web analytics approach to create a linear regression model that predicts usage among these journals. We categorize metrics into two classes: global (journal focused) and local (institution dependent). Using 275 journals as our training set, our analysis shows that a combination of global and local metrics creates the strongest model for predicting full-text downloads. Our linear regression model has an accuracy of more than 80% in predicting downloads for the 1,235 journals in our test set. The implication of the findings is that university libraries using local metrics have better insight into the value of a journal and can therefore manage content costs more efficiently.
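As a rough illustration of the prediction idea in this abstract, here is ordinary least squares with a single predictor, fit on a toy training set and applied to a held-out value. The single-metric form and every number below are assumptions for illustration; the paper's actual model combines multiple global and local metrics:

```python
def fit_ols(xs, ys):
    """Closed-form simple linear regression: y = a + b * x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical training pairs: (journal-level "global" metric, observed
# full-text downloads). Values are invented, not from the study.
train_x = [1.0, 2.0, 3.0, 4.0, 5.0]
train_y = [120.0, 210.0, 290.0, 410.0, 480.0]

a, b = fit_ols(train_x, train_y)

# Predict downloads for a held-out journal with metric value 6.0.
print(round(a + b * 6.0))  # → 578
```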