Search (5209 results, page 1 of 261)

  1. Seely, P.A.: Dewey Decimal Classification: relocations in edition 15 and criteria for edition 16 (1954) 0.14
    0.13950965 = product of:
      0.8370578 = sum of:
        0.8370578 = weight(_text_:seely in 1716) [ClassicSimilarity], result of:
          0.8370578 = score(doc=1716,freq=2.0), product of:
            0.46940652 = queryWeight, product of:
              10.087449 = idf(docFreq=4, maxDocs=44218)
              0.04653372 = queryNorm
            1.7832259 = fieldWeight in 1716, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              10.087449 = idf(docFreq=4, maxDocs=44218)
              0.125 = fieldNorm(doc=1716)
      0.16666667 = coord(1/6)
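The indented figures under each hit read like Lucene explain() trees for classic TF-IDF scoring. As a reading aid, here is a minimal sketch, assuming the ClassicSimilarity model that the "[ClassicSimilarity]" labels suggest, which re-derives the first entry's numbers from the values shown above; the idf and square-root tf formulas and the reading of coord(1/6) are inferences from that model, not stated on this page.

```python
import math

# Sketch only: recompute entry 1's breakdown, assuming Lucene ClassicSimilarity.
# All numeric inputs (freq, docFreq, maxDocs, queryNorm, fieldNorm) are copied
# from the explain tree above; the formulas are assumptions based on that model.

def idf(doc_freq: int, max_docs: int) -> float:
    """Classic idf: 1 + ln(maxDocs / (docFreq + 1))."""
    return 1.0 + math.log(max_docs / (doc_freq + 1))

def term_weight(freq, doc_freq, max_docs, query_norm, field_norm):
    """weight = queryWeight * fieldWeight, as shown per term in the trees."""
    query_weight = idf(doc_freq, max_docs) * query_norm                    # ~0.4694065
    field_weight = math.sqrt(freq) * idf(doc_freq, max_docs) * field_norm  # ~1.7832259
    return query_weight * field_weight                                     # ~0.8370578

# Entry 1 matched one of six query terms ("seely"), hence coord(1/6).
# fieldNorm (0.125 here) is taken as given; in ClassicSimilarity it encodes
# the length normalisation of the indexed field.
w = term_weight(freq=2.0, doc_freq=4, max_docs=44218,
                query_norm=0.04653372, field_norm=0.125)
score = (1.0 / 6.0) * w          # coord(1/6) * sum of matching term weights
print(f"{w:.7f}  {score:.8f}")   # ~0.8370578  ~0.13950965
```

Run as-is, the sketch reproduces the 0.8370578 term weight and the 0.13950965 final score up to floating-point rounding.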
    
  2. RSWK-Mitteilung Nr.10 : Entwurf (1997) 0.12
    0.12450405 = product of:
      0.7470243 = sum of:
        0.7470243 = weight(_text_:rswk_00 in 6540) [ClassicSimilarity], result of:
          0.7470243 = score(doc=6540,freq=2.0), product of:
            0.51204497 = queryWeight, product of:
              11.00374 = idf(docFreq=1, maxDocs=44218)
              0.04653372 = queryNorm
            1.4589037 = fieldWeight in 6540, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              11.00374 = idf(docFreq=1, maxDocs=44218)
              0.09375 = fieldNorm(doc=6540)
      0.16666667 = coord(1/6)
    
    Footnote
    Cf. the preprint of the rules: http://www.dbi-berlin.de/dbi_pub/einzelpu/regelw/rswk/rswk_00.htm
  3. Schrodt, R.: Tiefen und Untiefen im wissenschaftlichen Sprachgebrauch (2008) 0.12
    0.12324353 = product of:
      0.3697306 = sum of:
        0.29563153 = weight(_text_:3a in 140) [ClassicSimilarity], result of:
          0.29563153 = score(doc=140,freq=2.0), product of:
            0.3945134 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04653372 = queryNorm
            0.7493574 = fieldWeight in 140, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0625 = fieldNorm(doc=140)
        0.07409907 = weight(_text_:problem in 140) [ClassicSimilarity], result of:
          0.07409907 = score(doc=140,freq=2.0), product of:
            0.19751167 = queryWeight, product of:
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.04653372 = queryNorm
            0.375163 = fieldWeight in 140, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.0625 = fieldNorm(doc=140)
      0.33333334 = coord(2/6)
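Entries that match more than one of the query terms, such as this one with coord(2/6), follow the same pattern. Assuming the classic TF-IDF model the labels indicate, each breakdown above is an instance of

$$
\mathrm{score}(q,d) \;=\; \mathrm{coord}(q,d)\sum_{t\,\in\,q\cap d}
\underbrace{\mathrm{idf}(t)\cdot\mathrm{queryNorm}(q)}_{\text{queryWeight}}
\cdot
\underbrace{\sqrt{\mathrm{freq}(t,d)}\cdot\mathrm{idf}(t)\cdot\mathrm{fieldNorm}(t,d)}_{\text{fieldWeight}},
\qquad
\mathrm{idf}(t)=1+\ln\frac{\mathrm{maxDocs}}{\mathrm{docFreq}(t)+1}.
$$

For this entry, the two term weights 0.29563153 and 0.07409907 sum to 0.3697306, and scaling by coord(2/6) = 0.33333334 gives the reported 0.12324353.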
    
    Abstract
    "Wer überhaupt spricht oder schreibt, sollte sich verständlich ausdrücken. Das ist eine auf den ersten Blick einleuchtende Forderung. denn wozu äußert er sich, wenn er nicht verstanden werden will?" (Luhmann 2005, 193) So einfach scheint unser Problem zu sein - doch so einfach ist es nicht.
    Content
    Cf. also: https://studylibde.com/doc/13053640/richard-schrodt. Cf. also: http://www.univie.ac.at/Germanistik/schrodt/vorlesung/wissenschaftssprache.doc.
  4. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.09
    0.08651724 = product of:
      0.2595517 = sum of:
        0.22172365 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
          0.22172365 = score(doc=562,freq=2.0), product of:
            0.3945134 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04653372 = queryNorm
            0.56201804 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
        0.037828058 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
          0.037828058 = score(doc=562,freq=2.0), product of:
            0.16295315 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.04653372 = queryNorm
            0.23214069 = fieldWeight in 562, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.046875 = fieldNorm(doc=562)
      0.33333334 = coord(2/6)
    
    Content
    Cf.: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.4940&rep=rep1&type=pdf.
    Date
    8. 1.2013 10:22:32
  5. Malsburg, C. von der: The correlation theory of brain function (1981) 0.08
    0.08342156 = product of:
      0.25026467 = sum of:
        0.18476972 = weight(_text_:3a in 76) [ClassicSimilarity], result of:
          0.18476972 = score(doc=76,freq=2.0), product of:
            0.3945134 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04653372 = queryNorm
            0.46834838 = fieldWeight in 76, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=76)
        0.06549495 = weight(_text_:problem in 76) [ClassicSimilarity], result of:
          0.06549495 = score(doc=76,freq=4.0), product of:
            0.19751167 = queryWeight, product of:
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.04653372 = queryNorm
            0.33160037 = fieldWeight in 76, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.0390625 = fieldNorm(doc=76)
      0.33333334 = coord(2/6)
    
    Abstract
    A summary of brain theory is given so far as it is contained within the framework of Localization Theory. Difficulties of this "conventional theory" are traced back to a specific deficiency: there is no way to express relations between active cells (as for instance their representing parts of the same object). A new theory is proposed to cure this deficiency. It introduces a new kind of dynamical control, termed synaptic modulation, according to which synapses switch between a conducting and a non-conducting state. The dynamics of this variable is controlled on a fast time scale by correlations in the temporal fine structure of cellular signals. Furthermore, conventional synaptic plasticity is replaced by a refined version. Synaptic modulation and plasticity form the basis for short-term and long-term memory, respectively. Signal correlations, shaped by the variable network, express structure and relationships within objects. In particular, the figure-ground problem may be solved in this way. Synaptic modulation introduces flexibility into cerebral networks which is necessary to solve the invariance problem. Since momentarily useless connections are deactivated, interference between different memory traces can be reduced, and memory capacity increased, in comparison with conventional associative memory.
    Source
    http://cogprints.org/1380/1/vdM_correlation.pdf
  6. Hockett, C.F.: The problem of universals in language (1963) 0.08
    0.08302432 = product of:
      0.24907297 = sum of:
        0.14819814 = weight(_text_:problem in 5591) [ClassicSimilarity], result of:
          0.14819814 = score(doc=5591,freq=2.0), product of:
            0.19751167 = queryWeight, product of:
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.04653372 = queryNorm
            0.750326 = fieldWeight in 5591, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.125 = fieldNorm(doc=5591)
        0.10087483 = weight(_text_:22 in 5591) [ClassicSimilarity], result of:
          0.10087483 = score(doc=5591,freq=2.0), product of:
            0.16295315 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.04653372 = queryNorm
            0.61904186 = fieldWeight in 5591, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.125 = fieldNorm(doc=5591)
      0.33333334 = coord(2/6)
    
    Pages
    S. 1-22
  7. Donsbach, W.: Wahrheit in den Medien : über den Sinn eines methodischen Objektivitätsbegriffes (2001) 0.08
    0.07702722 = product of:
      0.23108163 = sum of:
        0.18476972 = weight(_text_:3a in 5895) [ClassicSimilarity], result of:
          0.18476972 = score(doc=5895,freq=2.0), product of:
            0.3945134 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04653372 = queryNorm
            0.46834838 = fieldWeight in 5895, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5895)
        0.04631192 = weight(_text_:problem in 5895) [ClassicSimilarity], result of:
          0.04631192 = score(doc=5895,freq=2.0), product of:
            0.19751167 = queryWeight, product of:
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.04653372 = queryNorm
            0.23447686 = fieldWeight in 5895, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5895)
      0.33333334 = coord(2/6)
    
    Abstract
    The problem of how the media perceive and present truth leads to four central questions: How much truth is there in the world that journalists have to report on? How does one determine or investigate this truth? How does one separate the wheat from the chaff? And how does one, as a journalist, deal with what one has recognized as truth, or believes to have recognized as such? Here there is an obvious parallel between journalists and scientists. Both need, first, hypotheses; second, suitable tests of those hypotheses; third, a good criterion of demarcation; and fourth, procedures for representing the facts they have established in a way suited to communicating them to others, that is, for presenting them. There are two major differences between journalists and scientists: journalists are usually after statements that are limited in space and time, scientists usually after laws that are not so limited. But these differences are fluid, because scientists need spatio-temporally limited statements to test their universal statements, and journalists increasingly venture into the field of general law-like statements, or at least offer causal interpretations of social phenomena. The second difference is that science is largely professionalized (at least this holds without qualification for the natural sciences and medicine), which has given it relatively clear criteria of demarcation and quality. These are largely lacking in journalism.
    Source
    Politische Meinung. 381(2001) Nr.1, S.65-74 [https://www.dgfe.de/fileadmin/OrdnerRedakteure/Sektionen/Sek02_AEW/KWF/Publikationen_Reihe_1989-2003/Band_17/Bd_17_1994_355-406_A.pdf]
  8. Verwer, K.: Freiheit und Verantwortung bei Hans Jonas (2011) 0.07
    0.07390788 = product of:
      0.4434473 = sum of:
        0.4434473 = weight(_text_:3a in 973) [ClassicSimilarity], result of:
          0.4434473 = score(doc=973,freq=2.0), product of:
            0.3945134 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04653372 = queryNorm
            1.1240361 = fieldWeight in 973, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.09375 = fieldNorm(doc=973)
      0.16666667 = coord(1/6)
    
    Content
    Cf.: http://creativechoice.org/doc/HansJonas.pdf.
  9. Levy, D.M.: Digital libraries and the problem of purpose (2000) 0.07
    0.07264628 = product of:
      0.21793884 = sum of:
        0.12967336 = weight(_text_:problem in 5002) [ClassicSimilarity], result of:
          0.12967336 = score(doc=5002,freq=2.0), product of:
            0.19751167 = queryWeight, product of:
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.04653372 = queryNorm
            0.6565352 = fieldWeight in 5002, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.109375 = fieldNorm(doc=5002)
        0.08826547 = weight(_text_:22 in 5002) [ClassicSimilarity], result of:
          0.08826547 = score(doc=5002,freq=2.0), product of:
            0.16295315 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.04653372 = queryNorm
            0.5416616 = fieldWeight in 5002, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.109375 = fieldNorm(doc=5002)
      0.33333334 = coord(2/6)
    
    Source
    Bulletin of the American Society for Information Science. 26(2000), no.6, Aug/Sept, S.22-25
  10. Fachsystematik Bremen nebst Schlüssel 1970 ff. (1970 ff) 0.07
    0.072097704 = product of:
      0.2162931 = sum of:
        0.18476972 = weight(_text_:3a in 3577) [ClassicSimilarity], result of:
          0.18476972 = score(doc=3577,freq=2.0), product of:
            0.3945134 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04653372 = queryNorm
            0.46834838 = fieldWeight in 3577, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3577)
        0.031523384 = weight(_text_:22 in 3577) [ClassicSimilarity], result of:
          0.031523384 = score(doc=3577,freq=2.0), product of:
            0.16295315 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.04653372 = queryNorm
            0.19345059 = fieldWeight in 3577, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3577)
      0.33333334 = coord(2/6)
    
    Content
    1. Agrarwissenschaften 1981. - 3. Allgemeine Geographie 2.1972. - 3a. Allgemeine Naturwissenschaften 1.1973. - 4. Allgemeine Sprachwissenschaft, Allgemeine Literaturwissenschaft 2.1971. - 6. Allgemeines. 5.1983. - 7. Anglistik 3.1976. - 8. Astronomie, Geodäsie 4.1977. - 12. bio Biologie, bcp Biochemie-Biophysik, bot Botanik, zoo Zoologie 1981. - 13. Bremensien 3.1983. - 13a. Buch- und Bibliothekswesen 3.1975. - 14. Chemie 4.1977. - 14a. Elektrotechnik 1974. - 15. Ethnologie 2.1976. - 16,1. Geowissenschaften. Sachteil 3.1977. - 16,2. Geowissenschaften. Regionaler Teil 3.1977. - 17. Germanistik 6.1984. - 17a,1. Geschichte. Teilsystematik hil. - 17a,2. Geschichte. Teilsystematik his Neuere Geschichte. - 17a,3. Geschichte. Teilsystematik hit Neueste Geschichte. - 18. Humanbiologie 2.1983. - 19. Ingenieurwissenschaften 1974. - 20. siehe 14a. - 21. Klassische Philologie 3.1977. - 22. Klinische Medizin 1975. - 23. Kunstgeschichte 2.1971. - 24. Kybernetik. 2.1975. - 25. Mathematik 3.1974. - 26. Medizin 1976. - 26a. Militärwissenschaft 1985. - 27. Musikwissenschaft 1978. - 27a. Noten 2.1974. - 28. Ozeanographie 3.1977. - 29. Pädagogik 8.1985. - 30. Philosophie 3.1974. - 31. Physik 3.1974. - 33. Politik, Politische Wissenschaft, Sozialwissenschaft. Soziologie. Länderschlüssel. Register 1981. - 34. Psychologie 2.1972. - 35. Publizistik und Kommunikationswissenschaft 1985. - 36. Rechtswissenschaften 1986. - 37. Regionale Geographie 3.1975. - 37a. Religionswissenschaft 1970. - 38. Romanistik 3.1976. - 39. Skandinavistik 4.1985. - 40. Slavistik 1977. - 40a. Sonstige Sprachen und Literaturen 1973. - 43. Sport 4.1983. - 44. Theaterwissenschaft 1985. - 45. Theologie 2.1976. - 45a. Ur- und Frühgeschichte, Archäologie 1970. - 47. Volkskunde 1976. - 47a. Wirtschaftswissenschaften 1971 // Schlüssel: 1. Länderschlüssel 1971. - 2. Formenschlüssel (Kurzform) 1974. - 3. Personenschlüssel Literatur 5. Fassung 1968
  11. Carlson, S.; Seely, A.: Using OpenRefine's reconciliation to validate local authority headings (2017) 0.07
    0.069754824 = product of:
      0.4185289 = sum of:
        0.4185289 = weight(_text_:seely in 5142) [ClassicSimilarity], result of:
          0.4185289 = score(doc=5142,freq=2.0), product of:
            0.46940652 = queryWeight, product of:
              10.087449 = idf(docFreq=4, maxDocs=44218)
              0.04653372 = queryNorm
            0.89161295 = fieldWeight in 5142, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              10.087449 = idf(docFreq=4, maxDocs=44218)
              0.0625 = fieldNorm(doc=5142)
      0.16666667 = coord(1/6)
    
  12. Elliott, P.: Reporting LIS research : a review article (1990) 0.06
    0.062268242 = product of:
      0.18680473 = sum of:
        0.11114861 = weight(_text_:problem in 5882) [ClassicSimilarity], result of:
          0.11114861 = score(doc=5882,freq=2.0), product of:
            0.19751167 = queryWeight, product of:
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.04653372 = queryNorm
            0.5627445 = fieldWeight in 5882, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.09375 = fieldNorm(doc=5882)
        0.075656116 = weight(_text_:22 in 5882) [ClassicSimilarity], result of:
          0.075656116 = score(doc=5882,freq=2.0), product of:
            0.16295315 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.04653372 = queryNorm
            0.46428138 = fieldWeight in 5882, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.09375 = fieldNorm(doc=5882)
      0.33333334 = coord(2/6)
    
    Abstract
    Discusses the problem of library and information science research publishing and reviews the monographs, abstracting services and periodicals that disseminate information about research.
    Source
    Librarianship. 22(1990), no.4, S.257-264
  13. Stojanovic, N.: Ontology-based Information Retrieval : methods and tools for cooperative query answering (2005) 0.06
    0.061621767 = product of:
      0.1848653 = sum of:
        0.14781576 = weight(_text_:3a in 701) [ClassicSimilarity], result of:
          0.14781576 = score(doc=701,freq=2.0), product of:
            0.3945134 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04653372 = queryNorm
            0.3746787 = fieldWeight in 701, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.03125 = fieldNorm(doc=701)
        0.037049536 = weight(_text_:problem in 701) [ClassicSimilarity], result of:
          0.037049536 = score(doc=701,freq=2.0), product of:
            0.19751167 = queryWeight, product of:
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.04653372 = queryNorm
            0.1875815 = fieldWeight in 701, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.03125 = fieldNorm(doc=701)
      0.33333334 = coord(2/6)
    
    Abstract
    With the explosion of possibilities for ubiquitous content production, the information overload problem has reached a level of complexity that can no longer be managed by traditional modelling approaches. Due to their purely syntactic nature, traditional information retrieval approaches have not succeeded in treating content itself (i.e. its meaning, not its representation). This makes the results of a retrieval process of very limited use for the user's task at hand. In the last ten years, ontologies have emerged from an interesting conceptualisation paradigm into a very promising (semantic) modelling technology, especially in the context of the Semantic Web. From the information retrieval point of view, ontologies enable a machine-understandable form of content description, such that the retrieval process can be driven by the meaning of the content. However, the ambiguous nature of the retrieval process, in which a user, unfamiliar with the underlying repository and/or query syntax, only approximates his information need in a query, makes it necessary to involve the user more actively in the retrieval process in order to close the gap between the meaning of the content and the meaning of the user's query (i.e. his information need). This thesis lays the foundation for such an ontology-based interactive retrieval process, in which the retrieval system interacts with the user in order to conceptually interpret the meaning of his query, while the underlying domain ontology drives the conceptualisation process. In this way the retrieval process evolves from a query evaluation process into a highly interactive cooperation between the user and the retrieval system, in which the system tries to anticipate the user's information need and to deliver the relevant content proactively. Moreover, the notion of content relevance for a user's query evolves from a content-dependent artefact into a multidimensional, context-dependent structure, strongly influenced by the user's preferences. This cooperation process is realized as the so-called Librarian Agent Query Refinement Process. In order to clarify the impact of an ontology on the retrieval process (regarding its complexity and quality), a set of methods and tools for different levels of content and query formalisation is developed, ranging from pure ontology-based inferencing to keyword-based querying in which semantics automatically emerges from the results. Our evaluation studies have shown that the ability to conceptualize a user's information need in the right manner and to interpret the retrieval results accordingly is a key issue for realizing much more meaningful information retrieval systems.
    Content
    Cf.: http://digbib.ubka.uni-karlsruhe.de/volltexte/documents/1627.
  14. Opdahl, A.L.; Sindre, G.: Facet modelling : an approach to flexible and integrated conceptual modelling (1997) 0.06
    0.061621286 = product of:
      0.18486385 = sum of:
        0.14703579 = weight(_text_:problem in 1701) [ClassicSimilarity], result of:
          0.14703579 = score(doc=1701,freq=14.0), product of:
            0.19751167 = queryWeight, product of:
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.04653372 = queryNorm
            0.74444103 = fieldWeight in 1701, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.046875 = fieldNorm(doc=1701)
        0.037828058 = weight(_text_:22 in 1701) [ClassicSimilarity], result of:
          0.037828058 = score(doc=1701,freq=2.0), product of:
            0.16295315 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.04653372 = queryNorm
            0.23214069 = fieldWeight in 1701, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.046875 = fieldNorm(doc=1701)
      0.33333334 = coord(2/6)
    
    Abstract
    Points to weaknesses of conceptual modelling languages that are oriented towards certain aspects of the problem domain of information systems development. Modelling languages are needed that allow modellers to: freely choose to represent a wide and extensible range of aspects of problem domain phenomena contingent on the problems at hand; simultaneously co-represent several aspects of the same problem domain phenomenon whenever needed; reflect semantic relations between these aspects in the problem domain models; and extend the set of kinds of aspects that can be represented and visualised throughout problem analysis as understanding of the problem domain and the problems at hand increases. Outlines an approach called facet modelling of real-world problem domains to deal with the complexity of contemporary analysis problems. Defines and visualises facet models, discusses facet modelling in relation to other recent ideas and techniques in the information systems development field. Case studies are currently in progress to evaluate various implications of the facet modelling approach empirically.
    Source
    Information systems. 22(1997) no.5, S.291-323
  15. Kleineberg, M.: Context analysis and context indexing : formal pragmatics in knowledge organization (2014) 0.06
    0.061589908 = product of:
      0.36953944 = sum of:
        0.36953944 = weight(_text_:3a in 1826) [ClassicSimilarity], result of:
          0.36953944 = score(doc=1826,freq=2.0), product of:
            0.3945134 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.04653372 = queryNorm
            0.93669677 = fieldWeight in 1826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.078125 = fieldNorm(doc=1826)
      0.16666667 = coord(1/6)
    
    Source
    http://digbib.ubka.uni-karlsruhe.de/volltexte/documents/3131107
  16. DeRaedt, L.: Logical settings for concept-learning (1997) 0.06
    0.060595147 = product of:
      0.18178543 = sum of:
        0.09262384 = weight(_text_:problem in 3780) [ClassicSimilarity], result of:
          0.09262384 = score(doc=3780,freq=2.0), product of:
            0.19751167 = queryWeight, product of:
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.04653372 = queryNorm
            0.46895373 = fieldWeight in 3780, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.078125 = fieldNorm(doc=3780)
        0.08916159 = weight(_text_:22 in 3780) [ClassicSimilarity], result of:
          0.08916159 = score(doc=3780,freq=4.0), product of:
            0.16295315 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.04653372 = queryNorm
            0.54716086 = fieldWeight in 3780, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.078125 = fieldNorm(doc=3780)
      0.33333334 = coord(2/6)
    
    Abstract
    Analyzes 3 different formalisations of concept-learning in logic. Learning from interpretations reduces to learning from entailment, which in turn reduces to learning from satisfiability. Discusses the implications for inductive logic programming and computational learning theory and formulates guidelines for choosing a problem-setting method
    Date
    6. 3.1997 16:22:15
    22. 1.1999 18:56:45
  17. Brown, J.S.; Duguid, P.: The social life of information (2000) 0.05
    0.052316114 = product of:
      0.3138967 = sum of:
        0.3138967 = weight(_text_:seely in 307) [ClassicSimilarity], result of:
          0.3138967 = score(doc=307,freq=2.0), product of:
            0.46940652 = queryWeight, product of:
              10.087449 = idf(docFreq=4, maxDocs=44218)
              0.04653372 = queryNorm
            0.6687097 = fieldWeight in 307, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              10.087449 = idf(docFreq=4, maxDocs=44218)
              0.046875 = fieldNorm(doc=307)
      0.16666667 = coord(1/6)
    
    Footnote
    Rev. in: JASIST 53(2002) no.4, S.320-321 (A.-C.H. Dianu): "Are you a cyberutopian or a technophobe? Do you believe that cyberspace is the ultimate home for all humans, or you completely deny the advancement of information technology? Is there a middle ground between the two extremes? Where is it and how to find it? It is exactly the middle ground that the authors, John Seely Brown and Paul Duguid, try to discover and propose in this book. From management, research to education, the book demonstrates that information technology is deeply embedded in its social context, as suggested in the book title. By uprooting it from its social context and detaching all human elements from it, information technology will no longer be viable. On the one hand, the book serves as a warning to information designers by emphasizing the importance of social and human elements in information technology development. On the other hand, it reveals to information users the importance of realizing the embedding of information technology in our lives."
  18. Seely, E.: Cataloguing non-English materials at Cleveland Public Library : a one hundred twenty four year history (1993) 0.05
    0.052316114 = product of:
      0.3138967 = sum of:
        0.3138967 = weight(_text_:seely in 577) [ClassicSimilarity], result of:
          0.3138967 = score(doc=577,freq=2.0), product of:
            0.46940652 = queryWeight, product of:
              10.087449 = idf(docFreq=4, maxDocs=44218)
              0.04653372 = queryNorm
            0.6687097 = fieldWeight in 577, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              10.087449 = idf(docFreq=4, maxDocs=44218)
              0.046875 = fieldNorm(doc=577)
      0.16666667 = coord(1/6)
    
  19. Bonhomme, S.; Roisin, C.: Interactively restructuring HTML documents (1996) 0.05
    0.051890206 = product of:
      0.15567061 = sum of:
        0.09262384 = weight(_text_:problem in 5862) [ClassicSimilarity], result of:
          0.09262384 = score(doc=5862,freq=2.0), product of:
            0.19751167 = queryWeight, product of:
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.04653372 = queryNorm
            0.46895373 = fieldWeight in 5862, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.078125 = fieldNorm(doc=5862)
        0.06304677 = weight(_text_:22 in 5862) [ClassicSimilarity], result of:
          0.06304677 = score(doc=5862,freq=2.0), product of:
            0.16295315 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.04653372 = queryNorm
            0.38690117 = fieldWeight in 5862, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.078125 = fieldNorm(doc=5862)
      0.33333334 = coord(2/6)
    
    Abstract
    Presents a solution to the problem of transforming the document structure in an HTML editor. Describes a tool based on a transformation language. Techniques that have been designed for general structured documents have been adapted to take into account the specific structure of the HTML DTD.
    Date
    1. 8.1996 22:08:06
  20. Eisenberg, M.: Big 6 tips : number two. Information seeking strategies (1997) 0.05
    0.051890206 = product of:
      0.15567061 = sum of:
        0.09262384 = weight(_text_:problem in 1584) [ClassicSimilarity], result of:
          0.09262384 = score(doc=1584,freq=2.0), product of:
            0.19751167 = queryWeight, product of:
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.04653372 = queryNorm
            0.46895373 = fieldWeight in 1584, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              4.244485 = idf(docFreq=1723, maxDocs=44218)
              0.078125 = fieldNorm(doc=1584)
        0.06304677 = weight(_text_:22 in 1584) [ClassicSimilarity], result of:
          0.06304677 = score(doc=1584,freq=2.0), product of:
            0.16295315 = queryWeight, product of:
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.04653372 = queryNorm
            0.38690117 = fieldWeight in 1584, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5018296 = idf(docFreq=3622, maxDocs=44218)
              0.078125 = fieldNorm(doc=1584)
      0.33333334 = coord(2/6)
    
    Abstract
    Discusses stage 2 in the process of teaching information problem solving, information seeking strategies, which has 2 components: determining the range of possible sources, and evaluating them to determine priorities. Describes 'brainstorming and narrow', the essential process for information seeking strategies
    Source
    Emergency librarian. 25(1997) no.2, S.22

Types

  • a 4357
  • m 497
  • el 261
  • s 197
  • x 67
  • b 39
  • r 26
  • i 25
  • ? 9
  • d 4
  • p 4
  • n 3
  • u 2
  • z 2
  • au 1
  • h 1