Search (22 results, page 1 of 2)

  • × language_ss:"e"
  • × theme_ss:"Computerlinguistik"
  • × year_i:[2000 TO 2010}
  1. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.07
    0.07269405 = sum of:
      0.05420002 = product of:
        0.21680008 = sum of:
          0.21680008 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
            0.21680008 = score(doc=562,freq=2.0), product of:
              0.3857529 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.045500398 = queryNorm
              0.56201804 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.25 = coord(1/4)
      0.018494027 = product of:
        0.036988053 = sum of:
          0.036988053 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
            0.036988053 = score(doc=562,freq=2.0), product of:
              0.15933464 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.045500398 = queryNorm
              0.23214069 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.5 = coord(1/2)
    
    Content
    Cf.: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.4940&rep=rep1&type=pdf.
    Date
    8. 1.2013 10:22:32
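    The score breakdowns in this list follow Lucene's ClassicSimilarity (TF-IDF) explain format: each matching term contributes queryWeight * fieldWeight, scaled by a coord factor, and the clause contributions are summed. As a minimal sketch of that composition, the Python snippet below recomputes the 0.07269405 total for result 1 from the values shown in its explain tree; the helper name and constants are illustrative, not part of any Lucene API.

        # Recompute the explain tree of result 1 (doc 562), assuming Lucene ClassicSimilarity:
        #   clause = queryWeight * fieldWeight, with queryWeight = idf * queryNorm,
        #   fieldWeight = tf * idf * fieldNorm, tf = sqrt(termFreq); each clause is
        #   scaled by its coord factor and the clauses are summed.
        from math import sqrt

        QUERY_NORM = 0.045500398  # queryNorm shared by every clause in this result list

        def clause_score(term_freq, idf, field_norm, coord):
            """Illustrative helper mirroring one weight(...) node of the explain output."""
            tf = sqrt(term_freq)                  # 1.4142135 for termFreq=2.0
            query_weight = idf * QUERY_NORM       # e.g. 8.478011 * 0.045500398 = 0.3857529
            field_weight = tf * idf * field_norm  # e.g. 1.4142135 * 8.478011 * 0.046875 = 0.56201804
            return query_weight * field_weight * coord

        score_3a = clause_score(2.0, 8.478011, 0.046875, coord=0.25)  # ~0.05420002
        score_22 = clause_score(2.0, 3.5018296, 0.046875, coord=0.5)  # ~0.018494027
        print(round(score_3a + score_22, 8))                          # ~0.07269405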
  2. Schneider, J.W.; Borlund, P.: ¬A bibliometric-based semiautomatic approach to identification of candidate thesaurus terms : parsing and filtering of noun phrases from citation contexts (2005) 0.04
    0.044322617 = product of:
      0.088645235 = sum of:
        0.088645235 = sum of:
          0.045492508 = weight(_text_:p in 156) [ClassicSimilarity], result of:
            0.045492508 = score(doc=156,freq=2.0), product of:
              0.16359726 = queryWeight, product of:
                3.5955126 = idf(docFreq=3298, maxDocs=44218)
                0.045500398 = queryNorm
              0.27807623 = fieldWeight in 156, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5955126 = idf(docFreq=3298, maxDocs=44218)
                0.0546875 = fieldNorm(doc=156)
          0.04315273 = weight(_text_:22 in 156) [ClassicSimilarity], result of:
            0.04315273 = score(doc=156,freq=2.0), product of:
              0.15933464 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.045500398 = queryNorm
              0.2708308 = fieldWeight in 156, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0546875 = fieldNorm(doc=156)
      0.5 = coord(1/2)
    
    Date
    8. 3.2007 19:55:22
  3. Drouin, P.: Term extraction using non-technical corpora as a point of leverage (2003) 0.03
    0.02599572 = product of:
      0.05199144 = sum of:
        0.05199144 = product of:
          0.10398288 = sum of:
            0.10398288 = weight(_text_:p in 8797) [ClassicSimilarity], result of:
              0.10398288 = score(doc=8797,freq=2.0), product of:
                0.16359726 = queryWeight, product of:
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.045500398 = queryNorm
                0.63560283 = fieldWeight in 8797, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.125 = fieldNorm(doc=8797)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  4. Ekmekcioglu, F.C.; Willett, P.: Effectiveness of stemming for Turkish text retrieval (2000) 0.02
    0.022746254 = product of:
      0.045492508 = sum of:
        0.045492508 = product of:
          0.090985015 = sum of:
            0.090985015 = weight(_text_:p in 5423) [ClassicSimilarity], result of:
              0.090985015 = score(doc=5423,freq=2.0), product of:
                0.16359726 = queryWeight, product of:
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.045500398 = queryNorm
                0.55615246 = fieldWeight in 5423, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.109375 = fieldNorm(doc=5423)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  5. Hull, D.; Aït-Mokhtar, S.; Chuat, M.; Eisele, A.; Gaussier, E.; Grefenstette, G.; Isabelle, P.; Samuelsson, C.; Segond, F.: Language technologies and patent search and classification (2001) 0.02
    0.01949679 = product of:
      0.03899358 = sum of:
        0.03899358 = product of:
          0.07798716 = sum of:
            0.07798716 = weight(_text_:p in 6318) [ClassicSimilarity], result of:
              0.07798716 = score(doc=6318,freq=2.0), product of:
                0.16359726 = queryWeight, product of:
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.045500398 = queryNorm
                0.47670212 = fieldWeight in 6318, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.09375 = fieldNorm(doc=6318)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  6. Boleda, G.; Evert, S.: Multiword expressions : a pain in the neck of lexical semantics (2009) 0.02
    0.018494027 = product of:
      0.036988053 = sum of:
        0.036988053 = product of:
          0.07397611 = sum of:
            0.07397611 = weight(_text_:22 in 4888) [ClassicSimilarity], result of:
              0.07397611 = score(doc=4888,freq=2.0), product of:
                0.15933464 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045500398 = queryNorm
                0.46428138 = fieldWeight in 4888, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=4888)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    1. 3.2013 14:56:22
  7. Perera, P.; Witte, R.: ¬A self-learning context-aware lemmatizer for German (2005) 0.01
    0.01299786 = product of:
      0.02599572 = sum of:
        0.02599572 = product of:
          0.05199144 = sum of:
            0.05199144 = weight(_text_:p in 4638) [ClassicSimilarity], result of:
              0.05199144 = score(doc=4638,freq=2.0), product of:
                0.16359726 = queryWeight, product of:
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.045500398 = queryNorm
                0.31780142 = fieldWeight in 4638, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.0625 = fieldNorm(doc=4638)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  8. Ramisch, C.; Schreiner, P.; Idiart, M.; Villavicencio, A.: ¬An evaluation of methods for the extraction of multiword expressions (20xx) 0.01
    0.01299786 = product of:
      0.02599572 = sum of:
        0.02599572 = product of:
          0.05199144 = sum of:
            0.05199144 = weight(_text_:p in 962) [ClassicSimilarity], result of:
              0.05199144 = score(doc=962,freq=2.0), product of:
                0.16359726 = queryWeight, product of:
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.045500398 = queryNorm
                0.31780142 = fieldWeight in 962, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.0625 = fieldNorm(doc=962)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  9. Ruiz, M.E.; Srinivasan, P.: Combining machine learning and hierarchical indexing structures for text categorization (2001) 0.01
    0.011373127 = product of:
      0.022746254 = sum of:
        0.022746254 = product of:
          0.045492508 = sum of:
            0.045492508 = weight(_text_:p in 1595) [ClassicSimilarity], result of:
              0.045492508 = score(doc=1595,freq=2.0), product of:
                0.16359726 = queryWeight, product of:
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.045500398 = queryNorm
                0.27807623 = fieldWeight in 1595, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1595)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  10. Doszkocs, T.E.; Zamora, A.: Dictionary services and spelling aids for Web searching (2004) 0.01
    0.01089771 = product of:
      0.02179542 = sum of:
        0.02179542 = product of:
          0.04359084 = sum of:
            0.04359084 = weight(_text_:22 in 2541) [ClassicSimilarity], result of:
              0.04359084 = score(doc=2541,freq=4.0), product of:
                0.15933464 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045500398 = queryNorm
                0.27358043 = fieldWeight in 2541, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=2541)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    14. 8.2004 17:22:56
    Source
    Online. 28(2004) no.3, S.22-29
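    Result 10 is the only hit in this list where a term matches with freq=4.0, so its tf is 2.0 rather than 1.4142135. The tf and idf figures throughout these trees are consistent with Lucene ClassicSimilarity, where tf = sqrt(freq) and idf = 1 + ln(maxDocs / (docFreq + 1)); the Python check below is a sketch under that assumption.

        # Check the tf and idf values shown in the explain trees, assuming
        # ClassicSimilarity: tf = sqrt(freq), idf = 1 + ln(maxDocs / (docFreq + 1)).
        from math import log, sqrt

        MAX_DOCS = 44218  # maxDocs reported in every idf(...) line of this result list

        def idf(doc_freq):
            return 1.0 + log(MAX_DOCS / (doc_freq + 1))

        print(sqrt(2.0), sqrt(4.0), sqrt(6.0))  # ~1.4142135, 2.0, ~2.4494898 (tf values above)
        print(idf(24))                          # ~8.478011  (_text_:3a)
        print(idf(3622))                        # ~3.5018296 (_text_:22)
        print(idf(3298))                        # ~3.5955126 (_text_:p)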
  11. Hammwöhner, R.: TransRouter revisited : Decision support in the routing of translation projects (2000) 0.01
    0.010788183 = product of:
      0.021576365 = sum of:
        0.021576365 = product of:
          0.04315273 = sum of:
            0.04315273 = weight(_text_:22 in 5483) [ClassicSimilarity], result of:
              0.04315273 = score(doc=5483,freq=2.0), product of:
                0.15933464 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045500398 = queryNorm
                0.2708308 = fieldWeight in 5483, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5483)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    10.12.2000 18:22:35
  12. Paolillo, J.C.: Linguistics and the information sciences (2009) 0.01
    0.010788183 = product of:
      0.021576365 = sum of:
        0.021576365 = product of:
          0.04315273 = sum of:
            0.04315273 = weight(_text_:22 in 3840) [ClassicSimilarity], result of:
              0.04315273 = score(doc=3840,freq=2.0), product of:
                0.15933464 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045500398 = queryNorm
                0.2708308 = fieldWeight in 3840, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=3840)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    27. 8.2011 14:22:33
  13. Working with conceptual structures : contributions to ICCS 2000. 8th International Conference on Conceptual Structures: Logical, Linguistic, and Computational Issues. Darmstadt, August 14-18, 2000 (2000) 0.01
    0.009849418 = product of:
      0.019698836 = sum of:
        0.019698836 = product of:
          0.03939767 = sum of:
            0.03939767 = weight(_text_:p in 5089) [ClassicSimilarity], result of:
              0.03939767 = score(doc=5089,freq=6.0), product of:
                0.16359726 = queryWeight, product of:
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.045500398 = queryNorm
                0.24082111 = fieldWeight in 5089, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.02734375 = fieldNorm(doc=5089)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Content
    Concepts & Language: Knowledge organization by procedures of natural language processing. A case study using the method GABEK (J. Zelger, J. Gadner) - Computer aided narrative analysis using conceptual graphs (H. Schärfe, P. Øhrstrøm) - Pragmatic representation of argumentative text: a challenge for the conceptual graph approach (H. Irandoust, B. Moulin) - Conceptual graphs as a knowledge representation core in a complex language learning environment (G. Angelova, A. Nenkova, S. Boycheva, T. Nikolov) - Conceptual Modeling and Ontologies: Relationships and actions in conceptual categories (Ch. Landauer, K.L. Bellman) - Concept approximations for formal concept analysis (J. Saquer, J.S. Deogun) - Faceted information representation (U. Priß) - Simple concept graphs with universal quantifiers (J. Tappe) - A framework for comparing methods for using or reusing multiple ontologies in an application (J. van Zyl, D. Corbett) - Designing task/method knowledge-based systems with conceptual graphs (M. Leclère, F. Trichet, Ch. Choquet) - A logical ontology (J. Farkas, J. Sarbo) - Algorithms and Tools: Fast concept analysis (Ch. Lindig) - A framework for conceptual graph unification (D. Corbett) - Visual CP representation of knowledge (H.D. Pfeiffer, R.T. Hartley) - Maximal isojoin for representing software textual specifications and detecting semantic anomalies (Th. Charnois) - Troika: using grids, lattices and graphs in knowledge acquisition (H.S. Delugach, B.E. Lampkin) - Open world theorem prover for conceptual graphs (J.E. Heaton, P. Kocura) - NetCare: a practical conceptual graphs software tool (S. Polovina, D. Strang) - CGWorld - a web based workbench for conceptual graphs management and applications (P. Dobrev, K. Toutanova) - Position papers: The edition project: Peirce's existential graphs (R. Müller) - Mining association rules using formal concept analysis (N. Pasquier) - Contextual logic summary (R. Wille) - Information channels and conceptual scaling (K.E. Wolff) - Spatial concepts - a rule exploration (S. Rudolph) - The TEXT-TO-ONTO learning environment (A. Mädche, St. Staab) - Controlling the semantics of metadata on audio-visual documents using ontologies (Th. Dechilly, B. Bachimont) - Building the ontological foundations of a terminology from natural language to conceptual graphs with Ribosome, a knowledge extraction system (Ch. Jacquelinet, A. Burgun) - CharGer: some lessons learned and new directions (H.S. Delugach) - Knowledge management using conceptual graphs (W.K. Pun)
  14. Nait-Baha, L.; Jackiewicz, A.; Djioua, B.; Laublet, P.: Query reformulation for information retrieval on the Web using the point of view methodology : preliminary results (2001) 0.01
    0.009748395 = product of:
      0.01949679 = sum of:
        0.01949679 = product of:
          0.03899358 = sum of:
            0.03899358 = weight(_text_:p in 249) [ClassicSimilarity], result of:
              0.03899358 = score(doc=249,freq=2.0), product of:
                0.16359726 = queryWeight, product of:
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.045500398 = queryNorm
                0.23835106 = fieldWeight in 249, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.046875 = fieldNorm(doc=249)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  15. Cimiano, P.; Völker, J.; Studer, R.: Ontologies on demand? : a description of the state-of-the-art, applications, challenges and trends for ontology learning from text (2006) 0.01
    0.009748395 = product of:
      0.01949679 = sum of:
        0.01949679 = product of:
          0.03899358 = sum of:
            0.03899358 = weight(_text_:p in 6014) [ClassicSimilarity], result of:
              0.03899358 = score(doc=6014,freq=2.0), product of:
                0.16359726 = queryWeight, product of:
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.045500398 = queryNorm
                0.23835106 = fieldWeight in 6014, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.046875 = fieldNorm(doc=6014)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  16. Argamon, S.; Whitelaw, C.; Chase, P.; Hota, S.R.; Garg, N.; Levitan, S.: Stylistic text classification using functional lexical features (2007) 0.01
    0.009748395 = product of:
      0.01949679 = sum of:
        0.01949679 = product of:
          0.03899358 = sum of:
            0.03899358 = weight(_text_:p in 280) [ClassicSimilarity], result of:
              0.03899358 = score(doc=280,freq=2.0), product of:
                0.16359726 = queryWeight, product of:
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.045500398 = queryNorm
                0.23835106 = fieldWeight in 280, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.046875 = fieldNorm(doc=280)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  17. Bian, G.-W.; Chen, H.-H.: Cross-language information access to multilingual collections on the Internet (2000) 0.01
    0.009247013 = product of:
      0.018494027 = sum of:
        0.018494027 = product of:
          0.036988053 = sum of:
            0.036988053 = weight(_text_:22 in 4436) [ClassicSimilarity], result of:
              0.036988053 = score(doc=4436,freq=2.0), product of:
                0.15933464 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045500398 = queryNorm
                0.23214069 = fieldWeight in 4436, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4436)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    16. 2.2000 14:22:39
  18. Kettunen, K.; Kunttu, T.; Järvelin, K.: To stem or lemmatize a highly inflectional language in a probabilistic IR environment? (2005) 0.01
    0.008123662 = product of:
      0.016247325 = sum of:
        0.016247325 = product of:
          0.03249465 = sum of:
            0.03249465 = weight(_text_:p in 4395) [ClassicSimilarity], result of:
              0.03249465 = score(doc=4395,freq=2.0), product of:
                0.16359726 = queryWeight, product of:
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.045500398 = queryNorm
                0.19862589 = fieldWeight in 4395, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4395)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Purpose - To show that stem generation compares well with lemmatization as a morphological tool for a highly inflectional language for IR purposes in a best-match retrieval system.
    Design/methodology/approach - Effects of three different morphological methods - lemmatization, stemming and stem production - for Finnish are compared in a probabilistic IR environment (INQUERY). Evaluation is done using a four-point relevance scale which is partitioned differently in different test settings.
    Findings - Results show that stem production, a lighter method than morphological lemmatization, compares well with lemmatization in a best-match IR environment. Differences in performance between stem production and lemmatization are small and they are not statistically significant in most of the tested settings. It is also shown that hitherto a rather neglected method of morphological processing for Finnish, stemming, performs reasonably well although the stemmer used - a Porter stemmer implementation - is far from optimal for a morphologically complex language like Finnish. In another series of tests, the effects of compound splitting and derivational expansion of queries are tested.
    Practical implications - Usefulness of morphological lemmatization and stem generation for IR purposes can be estimated with many factors. On the average P-R level they seem to behave very close to each other in a probabilistic IR system. Thus, the choice of the used method with highly inflectional languages needs to be estimated along other dimensions too.
    Originality/value - Results are achieved using Finnish as an example of a highly inflectional language. The results are of interest for anyone who is interested in processing of morphological variation of a highly inflected language for IR purposes.
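    The abstract above contrasts full lemmatization with lighter stem production and stemming for Finnish retrieval. As a rough illustration of what a rule-based stemmer does to inflected word forms, the sketch below runs NLTK's Snowball stemmer for Finnish over a few variants of "kirjasto" (library); this is an assumed, generally available tool, not the Porter-style stemmer or the INQUERY setup used in the study.

        # Illustrative only: collapse inflected Finnish forms toward a common stem
        # with NLTK's Finnish Snowball stemmer (an assumption; not the study's stemmer).
        from nltk.stem.snowball import SnowballStemmer

        stemmer = SnowballStemmer("finnish")
        forms = ["kirjasto", "kirjastossa", "kirjastoihin", "kirjastojen"]
        for form in forms:
            # Inflectional endings are stripped by rule; most forms should share a stem.
            print(form, "->", stemmer.stem(form))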
  19. Ahlgren, P.; Kekäläinen, J.: Indexing strategies for Swedish full text retrieval under different user scenarios (2007) 0.01
    0.008123662 = product of:
      0.016247325 = sum of:
        0.016247325 = product of:
          0.03249465 = sum of:
            0.03249465 = weight(_text_:p in 896) [ClassicSimilarity], result of:
              0.03249465 = score(doc=896,freq=2.0), product of:
                0.16359726 = queryWeight, product of:
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.045500398 = queryNorm
                0.19862589 = fieldWeight in 896, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5955126 = idf(docFreq=3298, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=896)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  20. Computational linguistics for the new millennium : divergence or synergy? Proceedings of the International Symposium held at the Ruprecht-Karls Universität Heidelberg, 21-22 July 2000. Festschrift in honour of Peter Hellwig on the occasion of his 60th birthday (2002) 0.01
    0.007705845 = product of:
      0.01541169 = sum of:
        0.01541169 = product of:
          0.03082338 = sum of:
            0.03082338 = weight(_text_:22 in 4900) [ClassicSimilarity], result of:
              0.03082338 = score(doc=4900,freq=2.0), product of:
                0.15933464 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.045500398 = queryNorm
                0.19345059 = fieldWeight in 4900, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=4900)
          0.5 = coord(1/2)
      0.5 = coord(1/2)