Search (12 results, page 1 of 1)

  • year_i:[2010 TO 2020}
  • theme_ss:"Retrievalalgorithmen"
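  These two facets are Lucene/Solr filter queries: the mixed brackets in year_i:[2010 TO 2020} make the lower bound inclusive and the upper bound exclusive (publication years 2010 through 2019), and theme_ss restricts results to the theme "Retrievalalgorithmen". A minimal sketch of an equivalent request, assuming a standard Solr select handler; the host, core name, and free-text query are placeholders, since the original query string is not shown on this page:

    import requests

    # Hypothetical Solr endpoint; host and core name are placeholders.
    SOLR_SELECT = "http://localhost:8983/solr/literature/select"

    params = {
        "q": "ranking",                            # free-text query (placeholder)
        "fq": [
            "year_i:[2010 TO 2020}",               # inclusive lower, exclusive upper bound
            'theme_ss:"Retrievalalgorithmen"',     # exact match on the theme facet
        ],
        "rows": 20,
        "debugQuery": "true",                      # returns the per-document score explanations
        "wt": "json",
    }

    response = requests.get(SOLR_SELECT, params=params)
    for doc in response.json()["response"]["docs"]:
        print(doc.get("id"), doc.get("title"))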
  1. Dang, E.K.F.; Luk, R.W.P.; Allan, J.; Ho, K.S.; Chung, K.F.L.; Lee, D.L.: A new context-dependent term weight computed by boost and discount using relevance information (2010) 0.02
    0.01684578 = product of:
      0.03369156 = sum of:
        0.03369156 = product of:
          0.06738312 = sum of:
            0.06738312 = weight(_text_:t in 4120) [ClassicSimilarity], result of:
              0.06738312 = score(doc=4120,freq=6.0), product of:
                0.17876579 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04537884 = queryNorm
                0.37693518 = fieldWeight in 4120, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4120)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
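    The indented block above is Lucene's "explain" output for ClassicSimilarity, i.e. classic TF-IDF scoring; it is produced by the search engine itself and is unrelated to the methods of the cited paper. A minimal sketch that reproduces the figures for this result from the values shown (the constants are copied from the explain output; the formulas are Lucene's ClassicSimilarity definitions):

      import math

      # Values taken from the explain output for doc 4120.
      freq       = 6.0          # occurrences of the matched term "t" in the field
      doc_freq   = 2338         # documents containing the term
      max_docs   = 44218        # documents in the index
      query_norm = 0.04537884   # query normalization constant reported above
      field_norm = 0.0390625    # stored length norm for this field
      coord      = 0.5 * 0.5    # the two coord(1/2) factors in the tree

      tf  = math.sqrt(freq)                              # 2.4494898
      idf = math.log(max_docs / (doc_freq + 1)) + 1.0    # 3.9394085

      query_weight = idf * query_norm                    # 0.17876579 = queryWeight
      field_weight = tf * idf * field_norm               # 0.37693518 = fieldWeight

      print(query_weight * field_weight * coord)         # ~0.01684578, the score shown above

    The same structure repeats for every result below; only the matched term, freq, and fieldNorm change.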
    
    Abstract
    We studied the effectiveness of a new class of context-dependent term weights for information retrieval. Unlike the traditional term frequency-inverse document frequency (TF-IDF), the new weighting of a term t in a document d depends not only on the occurrence statistics of t alone but also on the terms found within a text window (or "document-context") centered on t. We introduce a Boost and Discount (B&D) procedure that utilizes partial relevance information to compute the context-dependent term weights of query terms according to a logistic regression model. We investigate the effectiveness of the new term weights, compared with the context-independent BM25 weights, in the setting of relevance feedback. We performed experiments with title queries of the TREC-6, -7, -8, and 2005 collections, comparing the residual Mean Average Precision (MAP) obtained using B&D term weights with that obtained by a baseline using BM25 weights. Given either 10 or 20 relevance judgments of the top retrieved documents, the new term weights yield an improvement over the baseline for all collections tested. The MAP obtained with the new weights shows a relative improvement of 3.3% to 15.2% over the baseline, with statistical significance at the 95% confidence level across all four collections.
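    The context-independent baseline referred to in the abstract is the standard BM25 term weight. For reference, a minimal sketch of that weight in a common k1/b parameterization; the parameter values below are the usual defaults, not necessarily those used in the paper:

      import math

      def bm25_weight(tf, df, n_docs, doc_len, avg_doc_len, k1=1.2, b=0.75):
          """Classic BM25 weight of one term in one document.

          tf: term frequency in the document, df: document frequency of the term,
          n_docs: number of documents, doc_len/avg_doc_len: length statistics.
          """
          idf = math.log((n_docs - df + 0.5) / (df + 0.5) + 1.0)
          norm_tf = (tf * (k1 + 1)) / (tf + k1 * (1 - b + b * doc_len / avg_doc_len))
          return idf * norm_tf

      # Example: a query term occurring 3 times in a 250-word document.
      print(bm25_weight(tf=3, df=120, n_docs=50000, doc_len=250, avg_doc_len=300))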
  2. Behnert, C.; Borst, T.: Neue Formen der Relevanz-Sortierung in bibliothekarischen Informationssystemen : das DFG-Projekt LibRank (2015) 0.02
    0.015561464 = product of:
      0.031122928 = sum of:
        0.031122928 = product of:
          0.062245857 = sum of:
            0.062245857 = weight(_text_:t in 5392) [ClassicSimilarity], result of:
              0.062245857 = score(doc=5392,freq=2.0), product of:
                0.17876579 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04537884 = queryNorm
                0.34819782 = fieldWeight in 5392, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0625 = fieldNorm(doc=5392)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  3. Behnert, C.; Plassmeier, K.; Borst, T.; Lewandowski, D.: Evaluierung von Rankingverfahren für bibliothekarische Informationssysteme (2019) 0.01
    0.013616281 = product of:
      0.027232561 = sum of:
        0.027232561 = product of:
          0.054465123 = sum of:
            0.054465123 = weight(_text_:t in 5023) [ClassicSimilarity], result of:
              0.054465123 = score(doc=5023,freq=2.0), product of:
                0.17876579 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04537884 = queryNorm
                0.30467308 = fieldWeight in 5023, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5023)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  4. Bornmann, L.; Mutz, R.: From P100 to P100' : a new citation-rank approach (2014) 0.01
    0.012296414 = product of:
      0.024592828 = sum of:
        0.024592828 = product of:
          0.049185656 = sum of:
            0.049185656 = weight(_text_:22 in 1431) [ClassicSimilarity], result of:
              0.049185656 = score(doc=1431,freq=2.0), product of:
                0.15890898 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04537884 = queryNorm
                0.30952093 = fieldWeight in 1431, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1431)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 8.2014 17:05:18
  5. Tober, M.; Hennig, L.; Furch, D.: SEO Ranking-Faktoren und Rang-Korrelationen 2014 : Google Deutschland (2014) 0.01
    0.012296414 = product of:
      0.024592828 = sum of:
        0.024592828 = product of:
          0.049185656 = sum of:
            0.049185656 = weight(_text_:22 in 1484) [ClassicSimilarity], result of:
              0.049185656 = score(doc=1484,freq=2.0), product of:
                0.15890898 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04537884 = queryNorm
                0.30952093 = fieldWeight in 1484, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1484)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    13. 9.2014 14:45:22
  6. Efron, M.: Linear time series models for term weighting in information retrieval (2010) 0.01
    0.011671098 = product of:
      0.023342196 = sum of:
        0.023342196 = product of:
          0.04668439 = sum of:
            0.04668439 = weight(_text_:t in 3688) [ClassicSimilarity], result of:
              0.04668439 = score(doc=3688,freq=2.0), product of:
                0.17876579 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04537884 = queryNorm
                0.26114836 = fieldWeight in 3688, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3688)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Common measures of term importance in information retrieval (IR) rely on counts of term frequency; rare terms receive higher weight in document ranking than common terms do. However, realistic scenarios yield additional information about the terms in a collection. Of interest in this article is the temporal behavior of terms as a collection changes over time. We propose capturing each term's collection frequency at discrete time intervals over the lifespan of a corpus and analyzing the resulting time series. We hypothesize that the collection frequency of a weakly discriminative term x at time t is predictable by a linear model of the term's prior observations. On the other hand, a linear time series model of a strongly discriminative term's collection frequency will yield a poor fit to the data. Operationalizing this hypothesis, we induce three time-based measures of term importance and test them against state-of-the-art term weighting models.
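    One way to operationalize the hypothesis above: fit a linear autoregressive model to a term's collection-frequency series and treat the fit error as a signal of discriminative power (a predictable series suggests a weak discriminator, a poorly modeled series a strong one). A minimal sketch assuming a simple AR(1) model fitted by least squares; this illustrates the idea described in the abstract, not the paper's exact estimator:

      import numpy as np

      def ar1_fit_error(series):
          """Fit x_t ~ a * x_{t-1} + c by least squares and return the mean
          squared residual: low error means the series is predictable."""
          x = np.asarray(series, dtype=float)
          X = np.column_stack([x[:-1], np.ones(len(x) - 1)])   # regressors [x_{t-1}, 1]
          y = x[1:]
          coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
          residuals = y - X @ coeffs
          return float(np.mean(residuals ** 2))

      # Hypothetical collection frequencies of two terms at successive time slices.
      stopword_like = [120, 118, 121, 119, 122, 120, 121]   # stable, predictable
      bursty_term   = [2, 1, 40, 3, 55, 2, 60]               # erratic, poorly modeled
      print(ar1_fit_error(stopword_like))   # small error -> weak discriminator
      print(ar1_fit_error(bursty_term))     # large error -> strong discriminator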
  7. Zhang, W.; Yoshida, T.; Tang, X.: A comparative study of TF*IDF, LSI and multi-words for text classification (2011) 0.01
    0.011671098 = product of:
      0.023342196 = sum of:
        0.023342196 = product of:
          0.04668439 = sum of:
            0.04668439 = weight(_text_:t in 1165) [ClassicSimilarity], result of:
              0.04668439 = score(doc=1165,freq=2.0), product of:
                0.17876579 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04537884 = queryNorm
                0.26114836 = fieldWeight in 1165, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1165)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  8. Ravana, S.D.; Rajagopal, P.; Balakrishnan, V.: Ranking retrieval systems using pseudo relevance judgments (2015) 0.01
    0.010868597 = product of:
      0.021737194 = sum of:
        0.021737194 = product of:
          0.043474387 = sum of:
            0.043474387 = weight(_text_:22 in 2591) [ClassicSimilarity], result of:
              0.043474387 = score(doc=2591,freq=4.0), product of:
                0.15890898 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04537884 = queryNorm
                0.27358043 = fieldWeight in 2591, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2591)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    20. 1.2015 18:30:22
    18. 9.2018 18:22:56
  9. Lee, J.-T.; Seo, J.; Jeon, J.; Rim, H.-C.: Sentence-based relevance flow analysis for high accuracy retrieval (2011) 0.01
    0.009725915 = product of:
      0.01945183 = sum of:
        0.01945183 = product of:
          0.03890366 = sum of:
            0.03890366 = weight(_text_:t in 4746) [ClassicSimilarity], result of:
              0.03890366 = score(doc=4746,freq=2.0), product of:
                0.17876579 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04537884 = queryNorm
                0.21762364 = fieldWeight in 4746, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4746)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  10. Jacucci, G.; Barral, O.; Daee, P.; Wenzel, M.; Serim, B.; Ruotsalo, T.; Pluchino, P.; Freeman, J.; Gamberini, L.; Kaski, S.; Blankertz, B.: Integrating neurophysiologic relevance feedback in intent modeling for information retrieval (2019) 0.01
    0.009725915 = product of:
      0.01945183 = sum of:
        0.01945183 = product of:
          0.03890366 = sum of:
            0.03890366 = weight(_text_:t in 5356) [ClassicSimilarity], result of:
              0.03890366 = score(doc=5356,freq=2.0), product of:
                0.17876579 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04537884 = queryNorm
                0.21762364 = fieldWeight in 5356, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5356)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  11. Baloh, P.; Desouza, K.C.; Hackney, R.: Contextualizing organizational interventions of knowledge management systems : a design science perspective (2012) 0.01
    0.0076852585 = product of:
      0.015370517 = sum of:
        0.015370517 = product of:
          0.030741034 = sum of:
            0.030741034 = weight(_text_:22 in 241) [ClassicSimilarity], result of:
              0.030741034 = score(doc=241,freq=2.0), product of:
                0.15890898 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04537884 = queryNorm
                0.19345059 = fieldWeight in 241, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=241)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    11. 6.2012 14:22:34
  12. Soulier, L.; Jabeur, L.B.; Tamine, L.; Bahsoun, W.: On ranking relevant entities in heterogeneous networks using a language-based model (2013) 0.01
    0.0076852585 = product of:
      0.015370517 = sum of:
        0.015370517 = product of:
          0.030741034 = sum of:
            0.030741034 = weight(_text_:22 in 664) [ClassicSimilarity], result of:
              0.030741034 = score(doc=664,freq=2.0), product of:
                0.15890898 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04537884 = queryNorm
                0.19345059 = fieldWeight in 664, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=664)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 3.2013 19:34:49