Search (4456 results, page 1 of 223)

  • Active filter: type_ss:"a"
  1. Morse, P.M.: Browsing and search theory (1973) 0.15
    0.14663067 = product of:
      0.29326135 = sum of:
        0.29326135 = sum of:
          0.1953193 = weight(_text_:theory in 3339) [ClassicSimilarity], result of:
            0.1953193 = score(doc=3339,freq=4.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.90964836 = fieldWeight in 3339, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.109375 = fieldNorm(doc=3339)
          0.097942054 = weight(_text_:22 in 3339) [ClassicSimilarity], result of:
            0.097942054 = score(doc=3339,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.5416616 = fieldWeight in 3339, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.109375 = fieldNorm(doc=3339)
      0.5 = coord(1/2)
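The explain tree above is Lucene ClassicSimilarity (TF-IDF) output, and its arithmetic can be reproduced directly from the numbers shown: each term contributes queryWeight × fieldWeight, where queryWeight = idf × queryNorm, fieldWeight = sqrt(freq) × idf × fieldNorm, and coord(1/2) scales the sum because one of two top-level clauses matched. A minimal sketch, using only values taken from the tree:

```python
import math

def term_score(freq, idf, query_norm, field_norm):
    """One term's ClassicSimilarity weight: queryWeight * fieldWeight."""
    tf = math.sqrt(freq)                   # tf(freq=4.0) = 2.0
    query_weight = idf * query_norm        # 4.1583924 * 0.05163523 ~= 0.21471956
    field_weight = tf * idf * field_norm   # 2.0 * 4.1583924 * 0.109375 ~= 0.90964836
    return query_weight * field_weight

QUERY_NORM = 0.05163523                    # shared queryNorm for all terms
theory = term_score(4.0, 4.1583924, QUERY_NORM, 0.109375)  # ~= 0.1953193
t22    = term_score(2.0, 3.5018296, QUERY_NORM, 0.109375)  # ~= 0.097942054
score  = (theory + t22) * 0.5              # coord(1/2): 1 of 2 clauses matched
print(score)                               # ~= 0.14663067, the hit's score
```

The same three-factor pattern repeats in every tree below; only freq, idf, and fieldNorm vary per term and per document.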
    
    Date
    22. 5.2005 19:52:29
    Source
    Toward a theory of librarianship. Papers in honor of J.H. Shera. Ed. by H. Rawski
  2. Morse, P.M.: Search theory and browsing (1970) 0.13
    0.1348878 = product of:
      0.2697756 = sum of:
        0.2697756 = sum of:
          0.15784183 = weight(_text_:theory in 1448) [ClassicSimilarity], result of:
            0.15784183 = score(doc=1448,freq=2.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.7351069 = fieldWeight in 1448, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.125 = fieldNorm(doc=1448)
          0.111933775 = weight(_text_:22 in 1448) [ClassicSimilarity], result of:
            0.111933775 = score(doc=1448,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.61904186 = fieldWeight in 1448, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.125 = fieldNorm(doc=1448)
      0.5 = coord(1/2)
    
    Date
    22. 5.2005 19:53:09
  3. Malsburg, C. von der: ¬The correlation theory of brain function (1981) 0.12
    0.12348971 = sum of:
      0.06834204 = product of:
        0.2050261 = sum of:
          0.2050261 = weight(_text_:3a in 76) [ClassicSimilarity], result of:
            0.2050261 = score(doc=76,freq=2.0), product of:
              0.43776408 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.05163523 = queryNorm
              0.46834838 = fieldWeight in 76, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.0390625 = fieldNorm(doc=76)
        0.33333334 = coord(1/3)
      0.05514767 = product of:
        0.11029534 = sum of:
          0.11029534 = weight(_text_:theory in 76) [ClassicSimilarity], result of:
            0.11029534 = score(doc=76,freq=10.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.5136716 = fieldWeight in 76, product of:
                3.1622777 = tf(freq=10.0), with freq of:
                  10.0 = termFreq=10.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.0390625 = fieldNorm(doc=76)
        0.5 = coord(1/2)
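Hit 3's tree has a different shape from hit 1's: its two matching clauses come from nested boolean queries of different sizes, so each sub-score carries its own coord factor before the top-level sum. The same arithmetic, again using only the numbers from the tree:

```python
# Each sub-score is scaled by its own coord factor, then the results are summed.
sub_3a     = 0.2050261  * (1 / 3)   # '3a' clause: coord(1/3), 1 of 3 clauses matched
sub_theory = 0.11029534 * (1 / 2)   # 'theory' clause: coord(1/2), 1 of 2 clauses matched
total = sub_3a + sub_theory
print(total)                        # ~= 0.12348971, the hit's score
```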
    
    Abstract
    A summary of brain theory is given so far as it is contained within the framework of Localization Theory. Difficulties of this "conventional theory" are traced back to a specific deficiency: there is no way to express relations between active cells (as for instance their representing parts of the same object). A new theory is proposed to cure this deficiency. It introduces a new kind of dynamical control, termed synaptic modulation, according to which synapses switch between a conducting and a non-conducting state. The dynamics of this variable is controlled on a fast time scale by correlations in the temporal fine structure of cellular signals. Furthermore, conventional synaptic plasticity is replaced by a refined version. Synaptic modulation and plasticity form the basis for short-term and long-term memory, respectively. Signal correlations, shaped by the variable network, express structure and relationships within objects. In particular, the figure-ground problem may be solved in this way. Synaptic modulation introduces flexibility into cerebral networks which is necessary to solve the invariance problem. Since momentarily useless connections are deactivated, interference between different memory traces can be reduced, and memory capacity increased, in comparison with conventional associative memory.
    Source
    http://cogprints.org/1380/1/vdM_correlation.pdf
  4. Warner, A.J.: Quantitative and qualitative assessments of the impact of linguistic theory on information science (1991) 0.12
    0.11802683 = product of:
      0.23605366 = sum of:
        0.23605366 = sum of:
          0.1381116 = weight(_text_:theory in 29) [ClassicSimilarity], result of:
            0.1381116 = score(doc=29,freq=2.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.6432185 = fieldWeight in 29, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.109375 = fieldNorm(doc=29)
          0.097942054 = weight(_text_:22 in 29) [ClassicSimilarity], result of:
            0.097942054 = score(doc=29,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.5416616 = fieldWeight in 29, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.109375 = fieldNorm(doc=29)
      0.5 = coord(1/2)
    
    Date
    6. 1.1999 10:22:45
  5. Mas, S.; Marleau, Y.: Proposition of a faceted classification model to support corporate information organization and digital records management (2009) 0.11
    0.11160578 = sum of:
      0.08201044 = product of:
        0.24603131 = sum of:
          0.24603131 = weight(_text_:3a in 2918) [ClassicSimilarity], result of:
            0.24603131 = score(doc=2918,freq=2.0), product of:
              0.43776408 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.05163523 = queryNorm
              0.56201804 = fieldWeight in 2918, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.046875 = fieldNorm(doc=2918)
        0.33333334 = coord(1/3)
      0.029595342 = product of:
        0.059190683 = sum of:
          0.059190683 = weight(_text_:theory in 2918) [ClassicSimilarity], result of:
            0.059190683 = score(doc=2918,freq=2.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.27566507 = fieldWeight in 2918, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.046875 = fieldNorm(doc=2918)
        0.5 = coord(1/2)
    
    Abstract
    The employees of an organization often use a personal hierarchical classification scheme to organize digital documents that are stored on their own workstations. As this may make it hard for other employees to retrieve these documents, there is a risk that the organization will lose track of needed documentation. Furthermore, the inherent boundaries of such a hierarchical structure require making arbitrary decisions about which specific criteria the classification will be based on (for instance, the administrative activity or the document type, although a document can have several attributes and require classification in several classes). A faceted classification model to support corporate information organization is proposed. Partially based on Ranganathan's facets theory, this model aims not only to standardize the organization of digital documents, but also to simplify the management of a document throughout its life cycle for both individuals and organizations, while ensuring compliance with regulatory and policy requirements.
    Footnote
    Cf.: http://ieeexplore.ieee.org/iel5/4755313/4755314/04755480.pdf?arnumber=4755480.
  6. Hotho, A.; Bloehdorn, S.: Data Mining 2004 : Text classification by boosting weak learners based on terms and concepts (2004) 0.10
    0.102998026 = sum of:
      0.08201044 = product of:
        0.24603131 = sum of:
          0.24603131 = weight(_text_:3a in 562) [ClassicSimilarity], result of:
            0.24603131 = score(doc=562,freq=2.0), product of:
              0.43776408 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.05163523 = queryNorm
              0.56201804 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.33333334 = coord(1/3)
      0.020987583 = product of:
        0.041975167 = sum of:
          0.041975167 = weight(_text_:22 in 562) [ClassicSimilarity], result of:
            0.041975167 = score(doc=562,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.23214069 = fieldWeight in 562, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046875 = fieldNorm(doc=562)
        0.5 = coord(1/2)
    
    Content
    Cf.: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.4940&rep=rep1&type=pdf.
    Date
    8. 1.2013 10:22:32
  7. Foskett, D.J.: Systems theory and its relevance to documentary classification (2017) 0.10
    0.101165846 = product of:
      0.20233169 = sum of:
        0.20233169 = sum of:
          0.118381366 = weight(_text_:theory in 3176) [ClassicSimilarity], result of:
            0.118381366 = score(doc=3176,freq=2.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.55133015 = fieldWeight in 3176, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.09375 = fieldNorm(doc=3176)
          0.08395033 = weight(_text_:22 in 3176) [ClassicSimilarity], result of:
            0.08395033 = score(doc=3176,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.46428138 = fieldWeight in 3176, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.09375 = fieldNorm(doc=3176)
      0.5 = coord(1/2)
    
    Date
    6. 5.2017 18:46:22
  8. DeRaedt, L.: Logical settings for concept-learning (1997) 0.10
    0.09879378 = product of:
      0.19758756 = sum of:
        0.19758756 = sum of:
          0.09865115 = weight(_text_:theory in 3780) [ClassicSimilarity], result of:
            0.09865115 = score(doc=3780,freq=2.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.4594418 = fieldWeight in 3780, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.078125 = fieldNorm(doc=3780)
          0.098936416 = weight(_text_:22 in 3780) [ClassicSimilarity], result of:
            0.098936416 = score(doc=3780,freq=4.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.54716086 = fieldWeight in 3780, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.078125 = fieldNorm(doc=3780)
      0.5 = coord(1/2)
    
    Abstract
    Analyzes 3 different formalisations of concept-learning in logic. Learning from interpretations reduces to learning from entailment, which in turn reduces to learning from satisfiability. Discusses the implications for inductive logic programming and computational learning theory and formulates guidelines for choosing a problem-setting method
    Date
    6. 3.1997 16:22:15
    22. 1.1999 18:56:45
  9. Van der Veer Martens, B.; Goodrum, G.: ¬The diffusion of theories : a functional approach (2006) 0.09
    0.09354132 = product of:
      0.18708263 = sum of:
        0.18708263 = sum of:
          0.1381116 = weight(_text_:theory in 5269) [ClassicSimilarity], result of:
            0.1381116 = score(doc=5269,freq=8.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.6432185 = fieldWeight in 5269, product of:
                2.828427 = tf(freq=8.0), with freq of:
                  8.0 = termFreq=8.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.0546875 = fieldNorm(doc=5269)
          0.048971027 = weight(_text_:22 in 5269) [ClassicSimilarity], result of:
            0.048971027 = score(doc=5269,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.2708308 = fieldWeight in 5269, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0546875 = fieldNorm(doc=5269)
      0.5 = coord(1/2)
    
    Abstract
    This comparative case study of the diffusion and nondiffusion over time of eight theories in the social sciences uses citation analysis, citation context analysis, content analysis, surveys of editorial review boards, and personal interviews with theorists to develop a model of the theory functions that facilitate theory diffusion throughout specific intellectual communities. Unlike previous work on the diffusion of theories as innovations, this theory functions model differs in several important respects from the findings of previous studies that employed Everett Rogers's classic typology of innovation characteristics that promote diffusion. The model is also presented as a contribution to a more integrated theory of citation.
    Date
    22. 7.2006 15:20:01
  10. Ackermann, E.: Piaget's constructivism, Papert's constructionism : what's the difference? (2001) 0.09
    0.09300482 = sum of:
      0.06834204 = product of:
        0.2050261 = sum of:
          0.2050261 = weight(_text_:3a in 692) [ClassicSimilarity], result of:
            0.2050261 = score(doc=692,freq=2.0), product of:
              0.43776408 = queryWeight, product of:
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.05163523 = queryNorm
              0.46834838 = fieldWeight in 692, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                8.478011 = idf(docFreq=24, maxDocs=44218)
                0.0390625 = fieldNorm(doc=692)
        0.33333334 = coord(1/3)
      0.024662787 = product of:
        0.049325574 = sum of:
          0.049325574 = weight(_text_:theory in 692) [ClassicSimilarity], result of:
            0.049325574 = score(doc=692,freq=2.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.2297209 = fieldWeight in 692, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.0390625 = fieldNorm(doc=692)
        0.5 = coord(1/2)
    
    Abstract
    What is the difference between Piaget's constructivism and Papert's "constructionism"? Beyond the mere play on the words, I think the distinction holds, and that integrating both views can enrich our understanding of how people learn and grow. Piaget's constructivism offers a window into what children are interested in, and able to achieve, at different stages of their development. The theory describes how children's ways of doing and thinking evolve over time, and under which circumstances children are more likely to let go of, or hold onto, their currently held views. Piaget suggests that children have very good reasons not to abandon their worldviews just because someone else, be it an expert, tells them they're wrong. Papert's constructionism, in contrast, focuses more on the art of learning, or 'learning to learn', and on the significance of making things in learning. Papert is interested in how learners engage in a conversation with [their own or other people's] artifacts, and how these conversations boost self-directed learning, and ultimately facilitate the construction of new knowledge. He stresses the importance of tools, media, and context in human development. Integrating both perspectives illuminates the processes by which individuals come to make sense of their experience, gradually optimizing their interactions with the world.
    Content
    Cf.: https://www.semanticscholar.org/paper/Piaget-%E2%80%99-s-Constructivism-%2C-Papert-%E2%80%99-s-%3A-What-%E2%80%99-s-Ackermann/89cbcc1e740a4591443ff4765a6ae8df0fdf5554. Includes further pointers to related papers. Also in: Learning Group Publication 5(2001) no.3, p.438.
  11. Mooers, C.: ¬The indexing language of an information retrieval system (1963) 0.08
    0.084304884 = product of:
      0.16860977 = sum of:
        0.16860977 = sum of:
          0.09865115 = weight(_text_:theory in 1641) [ClassicSimilarity], result of:
            0.09865115 = score(doc=1641,freq=2.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.4594418 = fieldWeight in 1641, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.078125 = fieldNorm(doc=1641)
          0.06995861 = weight(_text_:22 in 1641) [ClassicSimilarity], result of:
            0.06995861 = score(doc=1641,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.38690117 = fieldWeight in 1641, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.078125 = fieldNorm(doc=1641)
      0.5 = coord(1/2)
    
    Footnote
    Reprinted in: Theory of subject analysis: a sourcebook. Ed. by L.M. Chan et al. Littleton, CO: Libraries Unlimited 1985, S.247-261
    Source
    Information retrieval today: papers presented at an Institute conducted by the Library School and the Centre for Continuation Study, University of Minnesota, Sept. 19-22, 1962. Ed. by Wesley Simonton
  12. Dickson, N.: Understanding the information economy : putting theory back into practice (1997) 0.08
    0.084304884 = product of:
      0.16860977 = sum of:
        0.16860977 = sum of:
          0.09865115 = weight(_text_:theory in 3028) [ClassicSimilarity], result of:
            0.09865115 = score(doc=3028,freq=2.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.4594418 = fieldWeight in 3028, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.078125 = fieldNorm(doc=3028)
          0.06995861 = weight(_text_:22 in 3028) [ClassicSimilarity], result of:
            0.06995861 = score(doc=3028,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.38690117 = fieldWeight in 3028, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.078125 = fieldNorm(doc=3028)
      0.5 = coord(1/2)
    
    Date
    22. 2.1999 16:01:46
  13. Pejtersen, A.M.: Design of a classification scheme for fiction based on an analysis of actual user-librarian communication, and use of the scheme for control of librarians' search strategies (1980) 0.08
    0.084304884 = product of:
      0.16860977 = sum of:
        0.16860977 = sum of:
          0.09865115 = weight(_text_:theory in 5835) [ClassicSimilarity], result of:
            0.09865115 = score(doc=5835,freq=2.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.4594418 = fieldWeight in 5835, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.078125 = fieldNorm(doc=5835)
          0.06995861 = weight(_text_:22 in 5835) [ClassicSimilarity], result of:
            0.06995861 = score(doc=5835,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.38690117 = fieldWeight in 5835, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.078125 = fieldNorm(doc=5835)
      0.5 = coord(1/2)
    
    Date
    5. 8.2006 13:22:44
    Source
    Theory and application of information research. Proc. of the 2nd Int. Research Forum on Information Science, 3.-6.8.1977, Copenhagen. Ed.: O. Harbo u, L. Kajberg
  14. Haverty, M.: Information architexture without internal theory : an inductive design process (2002) 0.08
    0.08428959 = product of:
      0.16857918 = sum of:
        0.16857918 = sum of:
          0.11960815 = weight(_text_:theory in 975) [ClassicSimilarity], result of:
            0.11960815 = score(doc=975,freq=6.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.55704355 = fieldWeight in 975, product of:
                2.4494898 = tf(freq=6.0), with freq of:
                  6.0 = termFreq=6.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.0546875 = fieldNorm(doc=975)
          0.048971027 = weight(_text_:22 in 975) [ClassicSimilarity], result of:
            0.048971027 = score(doc=975,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.2708308 = fieldWeight in 975, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0546875 = fieldNorm(doc=975)
      0.5 = coord(1/2)
    
    Abstract
    This article suggests that Information Architecture (IA) design is primarily an inductive process. Although top-level goals, user attributes and available content are periodically considered, the process involves bottom-up design activities. IA is inductive partly because it lacks internal theory, and partly because it is an activity that supports emergent phenomena (user experiences) from basic design components. The nature of IA design is well described by Constructive Induction (CI), a design process that involves locating the best representational framework for the design problem, identifying a solution within that framework and translating it back to the design problem at hand. The future of IA, if it remains inductive or develops a body of theory (or both), is considered.
    Date
    3.10.2002 17:22:41
  15. Bensman, S.J.: Eugene Garfield, Francis Narin, and PageRank : the theoretical bases of the Google search engine (2013) 0.08
    0.08378895 = product of:
      0.1675779 = sum of:
        0.1675779 = sum of:
          0.11161102 = weight(_text_:theory in 1149) [ClassicSimilarity], result of:
            0.11161102 = score(doc=1149,freq=4.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.51979905 = fieldWeight in 1149, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.0625 = fieldNorm(doc=1149)
          0.055966888 = weight(_text_:22 in 1149) [ClassicSimilarity], result of:
            0.055966888 = score(doc=1149,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.30952093 = fieldWeight in 1149, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0625 = fieldNorm(doc=1149)
      0.5 = coord(1/2)
    
    Abstract
    This paper presents a test of the validity of using Google Scholar to evaluate the publications of researchers by comparing the premises on which its search engine, PageRank, is based, to those of Garfield's theory of citation indexing. It finds that the premises are identical and that PageRank and Garfield's theory of citation indexing validate each other.
    Date
    17.12.2013 11:02:22
  16. Besler, G.; Szulc, J.: Gottlob Frege's theory of definition as useful tool for knowledge organization : definition of 'context' - case study (2014) 0.08
    0.082741246 = product of:
      0.16548249 = sum of:
        0.16548249 = sum of:
          0.13050319 = weight(_text_:theory in 1440) [ClassicSimilarity], result of:
            0.13050319 = score(doc=1440,freq=14.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.6077844 = fieldWeight in 1440, product of:
                3.7416575 = tf(freq=14.0), with freq of:
                  14.0 = termFreq=14.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.0390625 = fieldNorm(doc=1440)
          0.034979306 = weight(_text_:22 in 1440) [ClassicSimilarity], result of:
            0.034979306 = score(doc=1440,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.19345059 = fieldWeight in 1440, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0390625 = fieldNorm(doc=1440)
      0.5 = coord(1/2)
    
    Abstract
    The aim of this paper is to analyze Gottlob Frege's (1848-1925) theory of definition as a tool for knowledge organization. The objective was achieved by discussing the theory of definition, including: the aims of definition, kinds of definition, the conditions of correct definition, and what is undefinable. Frege indicated the following aims of defining: (1) to introduce a new word, which has had no precise meaning until then; (2) to explain the meaning of a word; (3) to catch a thought. We would like to present three kinds of definitions used by Frege: a contextual definition, a stipulative definition and a piecemeal definition. In the history of the theory of definition, Frege was the first to formulate the conditions of a correct definition. According to Frege, not everything can be defined; what is logically simple cannot have a proper definition. The usability of Frege's theory of definition is illustrated in the case study. Definitions that serve as an example are definitions of 'context'. The term 'context' is used in different situations and with different meanings in the field of knowledge organization. The paper is rounded off by a discussion of how Frege's theory of definition can be useful for knowledge organization. To present Frege's theory of definition in view of the needs of knowledge organization, we shall start with the different ranges of knowledge organization.
    Source
    Knowledge organization in the 21st century: between historical patterns and future prospects. Proceedings of the Thirteenth International ISKO Conference 19-22 May 2014, Kraków, Poland. Ed.: Wieslaw Babik
  17. Huang, M.; Barbour, J.; Su, C.; Contractor, N.: Why do group members provide information to digital knowledge repositories? : a multilevel application of transactive memory theory (2013) 0.08
    0.08017827 = product of:
      0.16035654 = sum of:
        0.16035654 = sum of:
          0.118381366 = weight(_text_:theory in 666) [ClassicSimilarity], result of:
            0.118381366 = score(doc=666,freq=8.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.55133015 = fieldWeight in 666, product of:
                2.828427 = tf(freq=8.0), with freq of:
                  8.0 = termFreq=8.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.046875 = fieldNorm(doc=666)
          0.041975167 = weight(_text_:22 in 666) [ClassicSimilarity], result of:
            0.041975167 = score(doc=666,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.23214069 = fieldWeight in 666, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.046875 = fieldNorm(doc=666)
      0.5 = coord(1/2)
    
    Abstract
    The proliferation of digital knowledge repositories (DKRs) used for distributed and collocated work raises important questions about how to manage these technologies. This study investigates why individuals contribute information to DKRs by applying and extending transactive memory theory. Data from knowledge workers (N = 208) nested in work groups (J = 17) located in Europe and the United States revealed, consistent with transactive memory theory, that perceptions of experts' retrieval of information were positively related to the likelihood of information provision to DKRs. The relationship between experts' perceptions of retrieval and information provision varied from group to group, and cross-level interactions indicated that trust in how the information would be used and the interdependence of tasks within groups could explain that variation. Furthermore, information provision to DKRs was related to communication networks in ways consistent with theorizing regarding the formation of transactive memory systems. Implications for theory and practice are discussed, emphasizing the utility of multilevel approaches for conceptualizing and modeling why individuals provide information to DKRs.
    Date
    22. 3.2013 19:39:00
  18. Mikacic, M.: Statistical system for subject designation (SSSD) for libraries in Croatia (1996) 0.08
    0.07903503 = product of:
      0.15807006 = sum of:
        0.15807006 = sum of:
          0.078920916 = weight(_text_:theory in 2943) [ClassicSimilarity], result of:
            0.078920916 = score(doc=2943,freq=2.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.36755344 = fieldWeight in 2943, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.0625 = fieldNorm(doc=2943)
          0.079149134 = weight(_text_:22 in 2943) [ClassicSimilarity], result of:
            0.079149134 = score(doc=2943,freq=4.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.4377287 = fieldWeight in 2943, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0625 = fieldNorm(doc=2943)
      0.5 = coord(1/2)
    
    Abstract
    Describes the development of the Statistical System for Subject Designation (SSSD): a syntactical system for subject designation for libraries in Croatia, based on the construction of subject headings in agreement with the theory of the sentence nature of subject headings. The discussion is preceded by a brief summary of the theories underlying the basic principles and fundamental rules of the alphabetical subject catalogue.
    Date
    31. 7.2006 14:22:21
    Source
    Cataloging and classification quarterly. 22(1996) no.1, S.77-93
  19. Huth, M.: Symbolic and sub-symbolic knowledge organization in the Computational Theory of Mind (1995) 0.08
    0.07790089 = product of:
      0.15580177 = sum of:
        0.15580177 = sum of:
          0.120822474 = weight(_text_:theory in 1086) [ClassicSimilarity], result of:
            0.120822474 = score(doc=1086,freq=12.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.56269896 = fieldWeight in 1086, product of:
                3.4641016 = tf(freq=12.0), with freq of:
                  12.0 = termFreq=12.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.0390625 = fieldNorm(doc=1086)
          0.034979306 = weight(_text_:22 in 1086) [ClassicSimilarity], result of:
            0.034979306 = score(doc=1086,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.19345059 = fieldWeight in 1086, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0390625 = fieldNorm(doc=1086)
      0.5 = coord(1/2)
    
    Abstract
    We sketch the historic transformation of culturally grown techniques of symbol manipulation, such as basic arithmetic in the decimal number system, to the full-fledged version of the Computational Theory of Mind. Symbol manipulation systems had been considered by Leibniz as a methodology for inferring knowledge in a secure and purely mechanical fashion. Such 'inference calculi' were considered as mere artefacts which could not possibly encompass all human knowledge acquisition. In Alan Turing's work one notices a crucial shift of perspective. The abstract mathematical states of a Turing machine (a kind of 'calculus universalis' that Leibniz was looking for) are claimed to correspond to equivalent psychological states. Artefacts are turned into faithful models of human cognition. A further step toward the Computational Theory of Mind was the physical symbol system hypothesis, contending to have found a necessary and sufficient criterion for the presence of 'intelligence' in operative mediums. This, together with Chomsky's foundational work on linguistics, led naturally to the Computational Theory of Mind as set out by Jerry Fodor and Zenon Pylyshyn. We discuss problematic aspects of this theory. Then we deal with another paradigm of the Computational Theory of Mind based on network automata. This sub-symbolic paradigm seems to avoid problems occurring in symbolic computations, like the 'frame problem' and 'graceful degradation'.
    Source
    Knowledge organization. 22(1995) no.1, S.10-17
  20. Dack, D.: Australian attends conference on Dewey (1989) 0.07
    0.07331534 = product of:
      0.14663067 = sum of:
        0.14663067 = sum of:
          0.09765965 = weight(_text_:theory in 2509) [ClassicSimilarity], result of:
            0.09765965 = score(doc=2509,freq=4.0), product of:
              0.21471956 = queryWeight, product of:
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.05163523 = queryNorm
              0.45482418 = fieldWeight in 2509, product of:
                2.0 = tf(freq=4.0), with freq of:
                  4.0 = termFreq=4.0
                4.1583924 = idf(docFreq=1878, maxDocs=44218)
                0.0546875 = fieldNorm(doc=2509)
          0.048971027 = weight(_text_:22 in 2509) [ClassicSimilarity], result of:
            0.048971027 = score(doc=2509,freq=2.0), product of:
              0.18081778 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.05163523 = queryNorm
              0.2708308 = fieldWeight in 2509, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0546875 = fieldNorm(doc=2509)
      0.5 = coord(1/2)
    
    Abstract
    Edited version of a report to the Australian Library and Information Association on the Conference on classification theory in the computer age, Albany, New York, 18-19 Nov 88, and on the meeting of the Dewey Editorial Policy Committee which preceded it. The focus of the Editorial Policy Committee meeting lay in the following areas: browsing; potential for improved subject access; system design; potential conflict between shelf location and information retrieval; and users. At the Conference on classification theory in the computer age the following papers were presented: Applications of artificial intelligence to bibliographic classification, by Irene Travis; Automation and classification, by Elaine Svenonius; Subject classification and language processing for retrieval in large data bases, by Diana Scott; Implications for information processing, by Carol Mandel; and Implications for information science education, by Richard Halsey.
    Date
    8.11.1995 11:52:22
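The score explanations above all instantiate Lucene's ClassicSimilarity (TF-IDF). A minimal sketch that reproduces the "theory" term score from entry 17 (doc 666), assuming the standard ClassicSimilarity formulas tf = sqrt(freq) and idf = 1 + ln(maxDocs / (docFreq + 1)); the function and parameter names are ours, taken from the labels printed in the explanation:

```python
import math

def classic_similarity(freq, doc_freq, max_docs, field_norm, query_norm):
    """Single-term TF-IDF score, following Lucene ClassicSimilarity labels."""
    tf = math.sqrt(freq)                              # 2.828427 for freq=8
    idf = 1.0 + math.log(max_docs / (doc_freq + 1))   # 4.1583924
    query_weight = idf * query_norm                   # 0.21471956
    field_weight = tf * idf * field_norm              # 0.55133015
    return query_weight * field_weight

# weight(_text_:theory in 666) from entry 17 above
score = classic_similarity(freq=8.0, doc_freq=1878, max_docs=44218,
                           field_norm=0.046875, query_norm=0.05163523)
print(score)  # matches 0.118381366 from the explanation tree
```

The remaining steps in each tree are simple combinations: the per-term weights are summed, and the final score multiplies in coord(1/2) = 0.5 when only one of two query clauses matched.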
