Search (3 results, page 1 of 1)

  • author_ss:"Abramo, G."
  1. D'Angelo, C.A.; Giuffrida, C.; Abramo, G.: A heuristic approach to author name disambiguation in bibliometrics databases for large-scale research assessments (2011) 0.01
    
    Abstract
    National exercises for the evaluation of research activity by universities are becoming regular practice in ever more countries. These exercises have mainly been conducted through the application of peer-review methods. Bibliometrics has not been able to offer a valid large-scale alternative because of almost overwhelming difficulties in identifying the true author of each publication. We will address this problem by presenting a heuristic approach to author name disambiguation in bibliometric datasets for large-scale research assessments. The application proposed concerns the Italian university system, comprising 80 universities and a research staff of over 60,000 scientists. The key advantage of the proposed approach is the ease of implementation. The algorithms are of practical application and have considerably better scalability and expandability properties than state-of-the-art unsupervised approaches. Moreover, the performance in terms of precision and recall, which can be further improved, seems thoroughly adequate for the typical needs of large-scale bibliometric research assessments.
    Date
    22.1.2011 13:06:52
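
The disambiguation step this abstract describes lends itself to a small illustration. The sketch below is not the authors' algorithm; it is a minimal, hypothetical rule-based matcher that links an author string such as "Abramo G." to an institutional staff roster by surname and first initial, breaking ties by checking whether a candidate's department appears in the publication's affiliation string. All names, departments, and helper functions are invented for the example.

```python
# Minimal illustrative sketch of heuristic author name disambiguation
# against a staff roster (hypothetical data, not the authors' method).
from collections import defaultdict

STAFF = [  # hypothetical roster entries: (full name, department)
    ("Abramo, Giovanni", "Engineering"),
    ("Abramo, Giulia", "Medicine"),
    ("D'Angelo, Ciriaco Andrea", "Engineering"),
]

def key(surname: str, given: str) -> str:
    # match key: surname plus first initial of the given name
    return f"{surname.lower()}|{given[:1].lower()}"

def build_index(staff):
    idx = defaultdict(list)
    for full_name, dept in staff:
        surname, given = [p.strip() for p in full_name.split(",", 1)]
        idx[key(surname, given)].append((full_name, dept))
    return idx

def disambiguate(author_string, affiliation, idx):
    """Return the matching staff member, or None if ambiguous/unknown."""
    surname, initials = author_string.replace(".", "").split()
    candidates = idx.get(key(surname, initials), [])
    if len(candidates) == 1:
        return candidates[0][0]
    # heuristic tie-break: prefer the candidate whose department
    # appears in the publication's affiliation string
    matches = [c for c in candidates if c[1].lower() in affiliation.lower()]
    return matches[0][0] if len(matches) == 1 else None

idx = build_index(STAFF)
print(disambiguate("Abramo G.", "Dept. of Engineering, Univ. of Rome", idx))
```
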
  2. Abramo, G.; D'Angelo, C.A.: A decision support system for public research organizations participating in national research assessment exercises (2009) 0.00
    
    Abstract
    We are witnessing a rapid trend toward the adoption of exercises for the evaluation of national research systems, generally based on peer review. They respond to two main needs: stimulating higher efficiency in research activities by public laboratories, and achieving better allocative efficiency in government funding of such institutions. However, the peer-review approach suffers from several limitations that raise doubts about the achievement of these ultimate objectives. In particular, subjectivity of judgment, which occurs during the step of selecting the research outputs to be submitted for evaluation, risks heavily distorting both the final ratings of the organizations evaluated and the funding they ultimately receive. These distortions become all the more relevant when the evaluation is limited to small samples of the scientific production of the research institutions. The objective of the current study is to propose a quantitative methodology, based on bibliometric data, that provides reliable support for the process of selecting the best products of a laboratory and thus limits such distortions. The benefits are twofold: individual research institutions can maximize the probability of receiving a fair evaluation, coherent with the real quality of their research. At the same time, broader adoption of this approach could also provide strong advantages at the macroeconomic level, since it would base financial allocations on the real value of the institutions under evaluation. In this study the proposed methodology was applied to the hard-science sectors of the Italian university research system for the period 2004-2006.
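
As a rough illustration of the kind of support such a system could give (not the paper's actual decision support methodology), the sketch below greedily assigns each researcher's highest-scoring publications to the institutional submission, up to a per-researcher quota, while ensuring a co-authored paper is submitted only once. The indicator values, the quota, and the data are assumptions made for the example.

```python
# Illustrative sketch: greedy selection of an institution's best research
# products for a national assessment, under a per-researcher quota.
def select_products(pubs, per_researcher=2):
    """pubs: list of dicts with 'id', 'authors', 'score' (higher is better)."""
    selected, used_pubs, quota = [], set(), {}
    for pub in sorted(pubs, key=lambda p: p["score"], reverse=True):
        if pub["id"] in used_pubs:
            continue
        # credit the paper to a co-author who still has quota left
        for author in pub["authors"]:
            if quota.get(author, 0) < per_researcher:
                quota[author] = quota.get(author, 0) + 1
                used_pubs.add(pub["id"])
                selected.append((author, pub["id"], pub["score"]))
                break
    return selected

pubs = [  # made-up bibliometric scores
    {"id": "P1", "authors": ["Abramo", "D'Angelo"], "score": 3.2},
    {"id": "P2", "authors": ["Abramo"], "score": 2.1},
    {"id": "P3", "authors": ["D'Angelo"], "score": 1.4},
]
print(select_products(pubs))
```
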
  3. Abramo, G.; D'Angelo, C.A.; Di Costa, F.: Testing the trade-off between productivity and quality in research activities (2009) 0.00
    
    Abstract
    In recent years there has been an increasingly pressing need for the evaluation of results from public-sector research activity, particularly to permit the efficient allocation of ever scarcer resources. Many of the studies and evaluation exercises conducted at the national and international levels emphasize the quality dimension of research output while neglecting that of productivity. This work tests for a possible correlation between the quantity and quality of scientific production, to determine whether the most productive researchers are also those whose results are qualitatively better than those of their colleagues. The proposed analysis refers to the entire Italian university system and is based on the observation of production in the hard sciences by more than 26,000 researchers in the period 2001-2005. The results show that the output of more-productive researchers is superior in quality to that of less-productive researchers. The relation between productivity and quality is largely insensitive to the type of indicator and test method applied, and also seems to differ little among the disciplines examined.
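
One simple way to probe the productivity-quality relation this abstract describes is a rank correlation between per-researcher publication counts and a citation-based quality indicator. The sketch below is illustrative only, with invented data; it does not reproduce the paper's own indicators or test methods.

```python
# Illustrative sketch: Spearman rank correlation between productivity
# (publication counts) and quality (mean normalized citations), no ties.
from statistics import mean

def spearman(x, y):
    """Spearman rank correlation without external dependencies."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = float(rank)
        return r
    rx, ry = ranks(x), ranks(y)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

productivity = [12, 5, 30, 8, 21]    # made-up publications per researcher
quality = [1.4, 0.8, 1.9, 1.1, 1.6]  # made-up mean normalized citations
print(f"Spearman rho = {spearman(productivity, quality):.2f}")
```
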