Search (221 results, page 11 of 12)

  • theme_ss:"Klassifikationstheorie: Elemente / Struktur"
  • type_ss:"a"
  1. Jacob, E.K.: Augmenting human capabilities : classification as cognitive scaffolding (2003) 0.00
    0.0013485396 = product of:
      0.0067426977 = sum of:
        0.0067426977 = weight(_text_:a in 2672) [ClassicSimilarity], result of:
          0.0067426977 = score(doc=2672,freq=4.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.12611452 = fieldWeight in 2672, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=2672)
      0.2 = coord(1/5)
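
    The breakdown above is Lucene "explain" output for the ClassicSimilarity (TF-IDF) model. As a minimal sketch, assuming the standard ClassicSimilarity formulas (tf = sqrt(freq), idf = 1 + ln(maxDocs/(docFreq+1)), score = coord x queryWeight x fieldWeight), the figures can be reproduced as follows; the function and parameter names are illustrative, not part of the retrieval system itself.

      import math

      # Minimal sketch of the ClassicSimilarity computation shown above; the standard
      # TF-IDF formulas are assumed and the names are illustrative.
      def classic_similarity(freq, doc_freq, max_docs, field_norm, query_norm, coord):
          tf = math.sqrt(freq)                              # 2.0 for freq=4.0
          idf = 1.0 + math.log(max_docs / (doc_freq + 1))   # ~1.153047 for docFreq=37942, maxDocs=44218
          query_weight = idf * query_norm                   # ~0.053464882
          field_weight = tf * idf * field_norm              # ~0.12611452
          return coord * query_weight * field_weight        # ~0.0013485396

      # Reproduces the score of result 1:
      print(classic_similarity(4.0, 37942, 44218, 0.0546875, 0.046368346, 0.2))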
    
    Abstract
    The argument presented here seeks to extend the notion of the classification scheme as a culturally transmitted tool by emphasizing the cognitive value of the scheme's internal patterns of relationship. It elaborates on the use of classification as cognitive scaffolding (Jacob, 2001) and amplifies this idea through application of three constructs - constraints, selections and expectations - derived from Luhmann's (1995) theory of social systems.
    Type
    a
  2. Ranganathan, S.R.: Facet analysis: fundamental categories (1985) 0.00
    0.0013485396 = product of:
      0.0067426977 = sum of:
        0.0067426977 = weight(_text_:a in 3631) [ClassicSimilarity], result of:
          0.0067426977 = score(doc=3631,freq=16.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.12611452 = fieldWeight in 3631, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.02734375 = fieldNorm(doc=3631)
      0.2 = coord(1/5)
    
    Abstract
    Among the theorists in the field of subject analysis in the twentieth century, none has been more influential than S. R. Ranganathan (1892-1972) of India, a mathematician by training who turned to librarianship and made some of the most far-reaching contributions to the theory of librarianship in general and subject analysis in particular. Dissatisfied with both the Dewey Decimal Classification and the Universal Decimal Classification, Ranganathan set out to develop his own system. His Colon Classification was first published in 1933 and went through six editions; the seventh edition was in progress when Ranganathan died in 1972. In the course of developing the Colon Classification, Ranganathan formulated a body of classification theory which was published in numerous writings, of which the best known are Elements of Library Classification (1945; 3rd ed., 1962) and Prolegomena to Library Classification (1967). Among the principles Ranganathan established, the most powerful and influential are those relating to facet analysis. Ranganathan demonstrated that facet analysis (breaking down subjects into their component parts) and synthesis (recombining these parts to fit the documents) provide the most viable approach to representing the contents of documents. Although the idea and use of facets, though not always called by that name, have been present for a long time (for instance, in the Dewey Decimal Classification and Charles A. Cutter's Expansive Classification), Ranganathan was the person who systematized the ideas and established principles for them. For his Colon Classification, Ranganathan identified five fundamental categories: Personality (P), Material (M), Energy (E), Space (S) and Time (T), and the citation order PMEST, based on the idea of decreasing concreteness.
    The Colon Classification has not been widely adopted; however, the theory of facet analysis and synthesis Ranganathan developed has proved to be most influential. Although many theorists of subject analysis do not totally agree with his fundamental categories or citation order, Ranganathan's concept of facet analysis and synthesis has provided a viable method and a framework for approaching subject analysis and has become the foundation of subject analysis in the twentieth century. In this sense, his theory laid the groundwork for later investigations and inquiries into the nature of subject and classificatory categories and citation order. His influence is felt in all modern classification schemes and indexing systems. This is attested to by the citations to his ideas and works in numerous papers included in this collection and by the fact that other modern classification systems such as the Dewey Decimal Classification and the Bliss Bibliographic Classification have become increasingly faceted in recent editions. The following chapter from Elements of Library Classification represents one of Ranganathan's many expositions of facet analysis and fundamental categories. It is chosen because of its clarity of expression and comprehensibility (many readers find the majority of his writings difficult to understand).
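
    As a rough illustration of the facet analysis and synthesis described above, the following minimal Python sketch recombines analysed facets into a single class number in the PMEST citation order. The connecting symbols follow common Colon Classification convention; the main class and facet values are invented placeholders, not actual CC notation.

      CITATION_ORDER = ["P", "M", "E", "S", "T"]               # Personality, Material, Energy, Space, Time
      CONNECTORS = {"P": ",", "M": ";", "E": ":", "S": ".", "T": "'"}

      def synthesize(main_class, facets):
          """Recombine analysed facets into one class number, following the PMEST citation order."""
          number = main_class
          for category in CITATION_ORDER:
              if category in facets:
                  number += CONNECTORS[category] + facets[category]
          return number

      # Placeholder example: a subject analysed into Personality, Energy, Space and Time facets.
      print(synthesize("L", {"P": "45", "E": "7", "S": "44", "T": "N5"}))   # -> L,45:7.44'N5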
    Source
    Theory of subject analysis: a sourcebook. Ed.: L.M. Chan, et al.
    Type
    a
  3. Bosch, M.: Ontologies, different reasoning strategies, different logics, different kinds of knowledge representation : working together (2006) 0.00
    0.0013485396 = product of:
      0.0067426977 = sum of:
        0.0067426977 = weight(_text_:a in 166) [ClassicSimilarity], result of:
          0.0067426977 = score(doc=166,freq=4.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.12611452 = fieldWeight in 166, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=166)
      0.2 = coord(1/5)
    
    Abstract
    Recent experience in the building, maintenance and reuse of ontologies has shown that the most efficient approach is a collaborative one. However, communication between collaborators such as IT professionals, librarians, web designers and subject matter experts is difficult and time-consuming. This is because different reasoning strategies, different logics and different kinds of knowledge representation are at work in Semantic Web applications. This article is intended as a reference scheme. It uses concise and simple explanations that can be used in common by specialists of different backgrounds working together on a Semantic Web application.
    Type
    a
  4. Green, R.; Panzer, M.: The ontological character of classes in the Dewey Decimal Classification 0.00
    0.0013485396 = product of:
      0.0067426977 = sum of:
        0.0067426977 = weight(_text_:a in 3530) [ClassicSimilarity], result of:
          0.0067426977 = score(doc=3530,freq=4.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.12611452 = fieldWeight in 3530, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=3530)
      0.2 = coord(1/5)
    
    Abstract
    Classes in the Dewey Decimal Classification (DDC) system function as neighborhoods around focal topics in captions and notes. Topical neighborhoods are generated through specialization and instantiation, complex topic synthesis, index terms and mapped headings, hierarchical force, rules for choosing between numbers, development of the DDC over time, and use of the system in classifying resources. Implications of representation using a formal knowledge representation language are explored.
    Type
    a
  5. Olson, H.; Nielsen, J.; Dippie, S.R.: Encyclopaedist rivalry, classificatory commonality, illusory universality (2003) 0.00
    0.001334708 = product of:
      0.0066735395 = sum of:
        0.0066735395 = weight(_text_:a in 2761) [ClassicSimilarity], result of:
          0.0066735395 = score(doc=2761,freq=12.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.12482099 = fieldWeight in 2761, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.03125 = fieldNorm(doc=2761)
      0.2 = coord(1/5)
    
    Abstract
    This paper describes the cultural construction of classification as exemplified by the French Encyclopaedists, Jean d'Alembert and Denis Diderot, and the encyclopaedism of Samuel Taylor Coleridge, analysing original texts digitized and encoded using XML and an adaptation of TEI.
    1. Introduction
    This paper, focusing on encyclopaedism, is part of a larger study exploring the cultural construction of classification. The larger study explores possible foundations for bias in the structure of classifications with a view to more equitable practice. Bias in classification has been documented relative to race, ethnicity, gender, religion, sexuality and other factors. Analyses and proposed solutions have addressed only acute biases in particular systems, not the systems themselves. The project tentatively identifies that the systemic roots of bias are culturally specific and reflected in the structure of conventional classificatory practices. A wide range of western cultural texts from classic Greek philosophy to twentieth-century ethnography is being analysed. The consistency with which certain presumptions are revealed, no matter how different the philosophical and social views of the authors, indicates their ubiquity in western thought, a ubiquity not mirrored in many other cultures. We hope that an understanding of these fundamental cultural presumptions will make space for development of alternative approaches to knowledge organization that can work alongside conventional methods. This paper describes an example of the first phase of the project, which is a deconstruction developed from relevant texts. In the context of encyclopaedism, the key texts used in this paper are Jean d'Alembert's Preliminary Discourse to the Encyclopédie, selections from Denis Diderot's contributions to the Encyclopédie, and Samuel Taylor Coleridge's Treatise on Method and Prospectus of the Encyclopedia Metropolitana. We are analysing these texts in digital form using Extensible Markup Language (XML) implemented via a document type definition (DTD) created for the purpose, including elements of the Text Encoding Initiative (TEI). We will first explain the encoding methodology; then define the differences between the French Encyclopaedists and the English Coleridge; deconstruct these differences by allowing the commonalities between the texts to emerge; and, finally, examine their cultural specificity.
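
    To make the encoding approach concrete, a minimal sketch follows; it builds a TEI-like fragment in Python with hypothetical elements and attribute values (the project's actual DTD is not reproduced here), marking a paraphrased classificatory claim from d'Alembert's Preliminary Discourse.

      import xml.etree.ElementTree as ET

      # Hypothetical TEI-style markup of a classificatory passage; element choice and
      # attribute values are illustrative assumptions, not the authors' DTD.
      tei = ET.Element("TEI")
      body = ET.SubElement(ET.SubElement(tei, "text"), "body")
      div = ET.SubElement(body, "div", {"type": "chapter", "n": "1"})
      p = ET.SubElement(div, "p")
      p.text = "All human knowledge may be divided into "
      seg = ET.SubElement(p, "seg", {"type": "classification", "ana": "#faculties"})
      seg.text = "memory, reason, and imagination"
      seg.tail = ", following the faculties of the mind."

      print(ET.tostring(tei, encoding="unicode"))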
    Type
    a
  6. Beghtol, C.: General classification systems : structural principles for multidisciplinary specification (1998) 0.00
    0.001155891 = product of:
      0.005779455 = sum of:
        0.005779455 = weight(_text_:a in 44) [ClassicSimilarity], result of:
          0.005779455 = score(doc=44,freq=4.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.10809815 = fieldWeight in 44, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046875 = fieldNorm(doc=44)
      0.2 = coord(1/5)
    
    Abstract
    In this century, knowledge creation, production, dissemination and use have changed profoundly. Intellectual and physical barriers have been substantially reduced by the rise of multidisciplinarity and by the influence of computerization, particularly by the spread of the World Wide Web (WWW). Bibliographic classification systems need to respond to this situation. Three possible strategic responses are described: 1) adopting an existing system; 2) adapting an existing system; and 3) finding new structural principles for classification systems. Examples of these three responses are given. An extended example of the third option uses the knowledge outline in the Spectrum of Britannica Online to suggest a theory of "viewpoint warrant" that could be used to incorporate differing perspectives into general classification systems
    Type
    a
  7. Cordeiro, M.I.; Slavic, A.: Data models for knowledge organization tools : evolution and perspectives (2003) 0.00
    0.001155891 = product of:
      0.005779455 = sum of:
        0.005779455 = weight(_text_:a in 2632) [ClassicSimilarity], result of:
          0.005779455 = score(doc=2632,freq=4.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.10809815 = fieldWeight in 2632, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046875 = fieldNorm(doc=2632)
      0.2 = coord(1/5)
    
    Type
    a
  8. Negrini, G.; Zozi, P.: Ontological analysis of the literary work of art (2003) 0.00
    0.001155891 = product of:
      0.005779455 = sum of:
        0.005779455 = weight(_text_:a in 2687) [ClassicSimilarity], result of:
          0.005779455 = score(doc=2687,freq=4.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.10809815 = fieldWeight in 2687, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046875 = fieldNorm(doc=2687)
      0.2 = coord(1/5)
    
    Abstract
    Ontological structures can aid the understanding and modelling of works of art. Ontology of the aesthetic object, and particularly of the literary work, has been analysed by Hartmann and Ingarden. Application of Dahlberg's ontical 'systematifier' model enabled us to organize the entire structure of the Thesaurus of Italian Literature, and to highlight a number of significant aspects of the literary work. After describing the conclusions arising from the experience of compiling the thesaurus, the paper briefly outlines Hartmann's and Ingarden's theories of levels and seeks to identify commonalities between the ontological analysis of the two theories and the conclusions of the thesaurus.
    Type
    a
  9. Kaula, P.N.: Canons in analytico-synthetic classification (1979) 0.00
    0.0010897844 = product of:
      0.005448922 = sum of:
        0.005448922 = weight(_text_:a in 1428) [ClassicSimilarity], result of:
          0.005448922 = score(doc=1428,freq=2.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.10191591 = fieldWeight in 1428, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=1428)
      0.2 = coord(1/5)
    
    Type
    a
  10. Mai, J.E.: Classification of the Web : challenges and inquiries (2004) 0.00
    0.0010897844 = product of:
      0.005448922 = sum of:
        0.005448922 = weight(_text_:a in 3075) [ClassicSimilarity], result of:
          0.005448922 = score(doc=3075,freq=2.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.10191591 = fieldWeight in 3075, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=3075)
      0.2 = coord(1/5)
    
    Type
    a
  11. Szostak, R.: Interdisciplinarity and the classification of scholarly documents by phenomena, theories and methods (2007) 0.00
    0.0010897844 = product of:
      0.005448922 = sum of:
        0.005448922 = weight(_text_:a in 1135) [ClassicSimilarity], result of:
          0.005448922 = score(doc=1135,freq=2.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.10191591 = fieldWeight in 1135, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=1135)
      0.2 = coord(1/5)
    
    Type
    a
  12. Keilty, P.: Tabulating queer : space, perversion, and belonging (2009) 0.00
    0.0010897844 = product of:
      0.005448922 = sum of:
        0.005448922 = weight(_text_:a in 3253) [ClassicSimilarity], result of:
          0.005448922 = score(doc=3253,freq=2.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.10191591 = fieldWeight in 3253, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=3253)
      0.2 = coord(1/5)
    
    Type
    a
  13. Gödert, W.: Bibliothekarische Klassifikationssysteme und on-line-Kataloge : Grundlagen und Anwendungen (1987) 0.00
    0.0010897844 = product of:
      0.005448922 = sum of:
        0.005448922 = weight(_text_:a in 4576) [ClassicSimilarity], result of:
          0.005448922 = score(doc=4576,freq=2.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.10191591 = fieldWeight in 4576, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=4576)
      0.2 = coord(1/5)
    
    Type
    a
  14. Hjoerland, B.: Classification (2017) 0.00
    0.0010897844 = product of:
      0.005448922 = sum of:
        0.005448922 = weight(_text_:a in 3610) [ClassicSimilarity], result of:
          0.005448922 = score(doc=3610,freq=2.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.10191591 = fieldWeight in 3610, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=3610)
      0.2 = coord(1/5)
    
    Type
    a
  15. Franz, S.; Lopatka, T.; Kunze, G.; Meyn, N.; Strupler, N.: Un/Doing Classification : Bibliothekarische Klassifikationssysteme zwischen Universalitätsanspruch und reduktionistischer Wissensorganisation (2022) 0.00
    0.0010897844 = product of:
      0.005448922 = sum of:
        0.005448922 = weight(_text_:a in 675) [ClassicSimilarity], result of:
          0.005448922 = score(doc=675,freq=2.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.10191591 = fieldWeight in 675, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0625 = fieldNorm(doc=675)
      0.2 = coord(1/5)
    
    Type
    a
  16. Green, R.: Facet analysis and semantic frames (2017) 0.00
    9.6324255E-4 = product of:
      0.0048162127 = sum of:
        0.0048162127 = weight(_text_:a in 3849) [ClassicSimilarity], result of:
          0.0048162127 = score(doc=3849,freq=4.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.090081796 = fieldWeight in 3849, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3849)
      0.2 = coord(1/5)
    
    Abstract
    Various fields, each with its own theories, techniques, and tools, are concerned with identifying and representing the conceptual structure of specific knowledge domains. This paper compares facet analysis, an analytic technique coming out of knowledge organization (especially as undertaken by members of the Classification Research Group (CRG)), with semantic frame analysis, an analytic technique coming out of lexical semantics (especially as undertaken by the developers of FrameNet). The investigation addresses three questions: 1) how do CRG-style facet analysis and semantic frame analysis characterize the conceptual structures that they identify?; 2) how similar are the techniques they use?; and, 3) how similar are the conceptual structures they produce? Facet analysis is concerned with the logical categories underlying the terminology of an entire field, while semantic frame analysis is concerned with the participant-and-prop structure manifest in sentences about a type of situation or event. When their scope of application is similar, as, for example, in the areas of the performing arts or education, the resulting facets and semantic frame elements often bear a striking resemblance, without being the same; facets are more often expressed as semantic types, while frame elements are more often expressed as roles.
    Type
    a
  17. Weinberger, O.: Begriffsstruktur und Klassifikation (1980) 0.00
    9.5356145E-4 = product of:
      0.004767807 = sum of:
        0.004767807 = weight(_text_:a in 1440) [ClassicSimilarity], result of:
          0.004767807 = score(doc=1440,freq=2.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.089176424 = fieldWeight in 1440, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1440)
      0.2 = coord(1/5)
    
    Type
    a
  18. Foskett, D.J.: Systems theory and its relevance to documentary classification (2017) 0.00
    9.5356145E-4 = product of:
      0.004767807 = sum of:
        0.004767807 = weight(_text_:a in 3617) [ClassicSimilarity], result of:
          0.004767807 = score(doc=3617,freq=2.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.089176424 = fieldWeight in 3617, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.0546875 = fieldNorm(doc=3617)
      0.2 = coord(1/5)
    
    Type
    a
  19. Rescheleit, W.; Menner, L.: Vergleich der Wissensrepräsentationssprache FRL mit Dezimalklassifikation und Facettenklassifikation (1986) 0.00
    8.173384E-4 = product of:
      0.004086692 = sum of:
        0.004086692 = weight(_text_:a in 1555) [ClassicSimilarity], result of:
          0.004086692 = score(doc=1555,freq=2.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.07643694 = fieldWeight in 1555, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046875 = fieldNorm(doc=1555)
      0.2 = coord(1/5)
    
    Type
    a
  20. Dousa, T.M.: Empirical observation, rational structures, and pragmatist aims : epistemology and method in Julius Otto Kaiser's theory of systematic indexing (2008) 0.00
    8.173384E-4 = product of:
      0.004086692 = sum of:
        0.004086692 = weight(_text_:a in 2508) [ClassicSimilarity], result of:
          0.004086692 = score(doc=2508,freq=2.0), product of:
            0.053464882 = queryWeight, product of:
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046368346 = queryNorm
            0.07643694 = fieldWeight in 2508, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              1.153047 = idf(docFreq=37942, maxDocs=44218)
              0.046875 = fieldNorm(doc=2508)
      0.2 = coord(1/5)
    
    Type
    a
