Search (23510 results, page 2 of 1176)

  • Filter: language_ss:"e"
  1. Noever, D.; Ciolino, M.: The Turing deception (2022) 0.15
    Abstract
    This research revisits the classic Turing test and compares recent large language models such as ChatGPT for their abilities to reproduce human-level comprehension and compelling text generation. Two task challenges - summary and question answering - prompt ChatGPT to produce original content (98-99%) from a single text entry and sequential questions initially posed by Turing in 1950. We score the original and generated content against the OpenAI GPT-2 Output Detector from 2019, and establish multiple cases where the generated content proves original and undetectable (98%). The question of a machine fooling a human judge recedes in this work relative to the question of "how would one prove it?" The original contribution of the work presents a metric and simple grammatical set for understanding the writing mechanics of chatbots in evaluating their readability and statistical clarity, engagement, delivery, overall quality, and plagiarism risks. While Turing's original prose scores at least 14% below the machine-generated output, whether an algorithm displays hints of Turing's true initial thoughts (the "Lovelace 2.0" test) remains unanswerable.
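    The detection step described in the abstract can be reproduced in outline with the publicly released RoBERTa-based GPT-2 output detector. The sketch below is an illustration under assumptions only: the Hugging Face model id "openai-community/roberta-base-openai-detector", the label names, and the sample passages are not taken from the paper.

        # Illustrative sketch (assumed model id and labels), not the paper's exact setup:
        # score passages with the public GPT-2 output detector via Hugging Face transformers.
        from transformers import pipeline

        detector = pipeline("text-classification",
                            model="openai-community/roberta-base-openai-detector")

        passages = {
            "turing_1950": "I propose to consider the question, 'Can machines think?'",
            "chatbot_output": "The paper revisits the imitation game with modern language models.",
        }

        for name, text in passages.items():
            result = detector(text, truncation=True)[0]
            # result looks like {"label": "Real", "score": 0.97}; a high score on the
            # machine-generated label flags the passage as likely synthetic.
            print(f"{name}: {result['label']} ({result['score']:.2f})")

    A run of this kind yields per-passage labels and confidences, which is the raw material for the originality and detectability percentages reported in the abstract.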
    Source
    https://arxiv.org/abs/2212.06721
  2. Robinson, L.; Bawden, D.: Mind the gap : transitions between concepts of information in varied domains (2014) 0.15
    Series
    Studies in history and philosophy of science ; 34
    Source
    Theories of information, communication and knowledge : a multidisciplinary approach. Eds.: F. Ibekwe-SanJuan and T.M. Dousa
  3. Huo, W.: Automatic multi-word term extraction and its application to Web-page summarization (2012) 0.14
    Abstract
    In this thesis we propose three new word association measures for multi-word term extraction. We combine these association measures with LocalMaxs algorithm in our extraction model and compare the results of different multi-word term extraction methods. Our approach is language and domain independent and requires no training data. It can be applied to such tasks as text summarization, information retrieval, and document classification. We further explore the potential of using multi-word terms as an effective representation for general web-page summarization. We extract multi-word terms from human written summaries in a large collection of web-pages, and generate the summaries by aligning document words with these multi-word terms. Our system applies machine translation technology to learn the aligning process from a training set and focuses on selecting high quality multi-word terms from human written summaries to generate suitable results for web-page summarization.
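    The extraction step described in the abstract pairs an association ("glue") measure with the LocalMaxs selection rule. The thesis's three new measures are not reproduced here; the sketch below is a simplified illustration that uses symmetric conditional probability (SCP) as the bigram glue and keeps a bigram only if it glues more strongly than the trigrams containing it.

        # Simplified illustration of glue-based multi-word term extraction
        # (assumed measures; not the thesis's own): SCP glue for bigrams plus a
        # reduced LocalMaxs test against the containing trigrams.
        from collections import Counter

        def ngrams(tokens, n):
            return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

        def extract_bigram_terms(tokens):
            uni, bi, tri = (Counter(ngrams(tokens, n)) for n in (1, 2, 3))
            total = len(tokens)

            def scp(w1, w2):
                # Symmetric conditional probability: P(w1,w2)^2 / (P(w1) * P(w2)).
                p12 = bi[(w1, w2)] / total
                return (p12 * p12) / ((uni[(w1,)] / total) * (uni[(w2,)] / total))

            def tri_glue(t):
                # Crude glue for a containing trigram, used only for the local-maximum test.
                return (tri[t] / total) ** 2 / ((uni[(t[0],)] / total) * (uni[(t[2],)] / total))

            terms = []
            for (w1, w2), freq in bi.items():
                if freq < 2:
                    continue
                supers = [t for t in tri if (t[0], t[1]) == (w1, w2) or (t[1], t[2]) == (w1, w2)]
                if all(scp(w1, w2) > tri_glue(t) for t in supers):
                    terms.append(" ".join((w1, w2)))
            return terms

        tokens = ("information retrieval systems use information retrieval models "
                  "for web page summarization and web page ranking").split()
        print(extract_bigram_terms(tokens))   # e.g. ['information retrieval', 'web page']

    Because both the glue and the selection rule rely only on corpus counts, the approach stays language- and domain-independent, which is the property the abstract emphasises.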
    Content
    A Thesis presented to The University of Guelph in partial fulfilment of requirements for the degree of Master of Science in Computer Science. See: http://www.inf.ufrgs.br/~ceramisch/download_files/publications/2009/p01.pdf.
    Date
    10. 1.2013 19:22:47
    Imprint
    Guelph, Ontario : University of Guelph
  4. Thornley, C.; Gibb, F.: Meaning in philosophy and meaning in information retrieval (IR) (2009) 0.13
    Abstract
    Purpose - The purpose of this paper is to explore the question of whether the differences between meaning in philosophy and meaning in information retrieval (IR) have implications for the use of philosophy in supporting research in IR. Design/methodology/approach - The approach takes the form of a conceptual analysis and literature review. Findings - There are some differences in the role of meaning in terms of purpose, content and use which should be clarified in order to assist a productive relationship between the philosophy of language and IR. Research limitations/implications - This provides some new theoretical insights into the philosophical context of IR. It suggests that further productive work on the central concepts within IR could be achieved through the use of a methodology which analyses how exactly these concepts are discussed in other disciplines and the implications of any differences in the way in which they may operate in IR. Originality/value - The paper suggests a new perspective on the relationship between philosophy and IR by exploring the role of meaning in these respective disciplines and highlighting differences, as well as similarities, with particular reference to the role of information as well as meaning in IR. This contributes to an understanding of two of the central concepts in IR, meaning and information, and the ways in which they are related. There is a history of work in IR and information science (IS) examining dilemmas and the paper builds on this work by relating it to some similar dilemmas in philosophy. Thus it develops the theory and conceptual understanding of IR by suggesting that philosophy could be used as a way of exploring intractable dilemmas in IR.
    Date
    23. 2.2009 17:22:29
    Source
    Journal of documentation. 65(2009) no.1, S.133-150
  5. Farazi, M.: Faceted lightweight ontologies : a formalization and some experiments (2010) 0.13
    Abstract
    While classifications are heavily used to categorize web content, the evolution of the web foresees a more formal structure - ontology - which can serve this purpose. Ontologies are core artifacts of the Semantic Web which enable machines to use inference rules to conduct automated reasoning on data. Lightweight ontologies bridge the gap between classifications and ontologies. A lightweight ontology (LO) is an ontology representing a backbone taxonomy where the concept of the child node is more specific than the concept of the parent node. Formal lightweight ontologies can be generated from their informal ones. The key applications of formal lightweight ontologies are document classification, semantic search, and data integration. However, these applications suffer from the following problems: the disambiguation accuracy of the state of the art NLP tools used in generating formal lightweight ontologies from their informal ones; the lack of background knowledge needed for the formal lightweight ontologies; and the limitation of ontology reuse. In this dissertation, we propose a novel solution to these problems in formal lightweight ontologies; namely, faceted lightweight ontology (FLO). FLO is a lightweight ontology in which terms, present in each node label, and their concepts, are available in the background knowledge (BK), which is organized as a set of facets. A facet can be defined as a distinctive property of the groups of concepts that can help in differentiating one group from another. Background knowledge can be defined as a subset of a knowledge base, such as WordNet, and often represents a specific domain.
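    As defined in the abstract, a lightweight ontology is a rooted backbone taxonomy in which every child node's concept is more specific than its parent's, with facets supplying the background knowledge behind the node labels. The fragment below is a minimal sketch of that invariant only; the node labels, the subset-based specificity check, and the facet sets are invented for illustration and are not taken from the dissertation.

        # Minimal sketch: a lightweight-ontology node whose children must carry
        # more specific concepts than their parent. Concepts are modelled here as
        # sets of facet labels from an (invented) background knowledge.
        from dataclasses import dataclass, field

        @dataclass
        class LONode:
            label: str
            concept: frozenset                    # facet labels from background knowledge
            children: list = field(default_factory=list)

            def add_child(self, child: "LONode") -> "LONode":
                # "More specific" is approximated as: the child's concept includes
                # every facet label of the parent's concept.
                if not child.concept >= self.concept:
                    raise ValueError(f"{child.label} is not more specific than {self.label}")
                self.children.append(child)
                return child

        root = LONode("publications", frozenset({"document"}))
        books = root.add_child(LONode("books", frozenset({"document", "monograph"})))
        books.add_child(LONode("ontology textbooks",
                               frozenset({"document", "monograph", "ontology"})))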
    Content
    PhD Dissertation at International Doctorate School in Information and Communication Technology. See: https://core.ac.uk/download/pdf/150083013.pdf.
    Imprint
    Trento : University / Department of information engineering and computer science
  6. Stojanovic, N.: Ontology-based Information Retrieval : methods and tools for cooperative query answering (2005) 0.13
    Abstract
    By the explosion of possibilities for a ubiquitous content production, the information overload problem reaches the level of complexity which cannot be managed by traditional modelling approaches anymore. Due to their pure syntactical nature traditional information retrieval approaches did not succeed in treating content itself (i.e. its meaning, and not its representation). This leads to a very low usefulness of the results of a retrieval process for a user's task at hand. In the last ten years ontologies have been emerged from an interesting conceptualisation paradigm to a very promising (semantic) modelling technology, especially in the context of the Semantic Web. From the information retrieval point of view, ontologies enable a machine-understandable form of content description, such that the retrieval process can be driven by the meaning of the content. However, the very ambiguous nature of the retrieval process in which a user, due to the unfamiliarity with the underlying repository and/or query syntax, just approximates his information need in a query, implies a necessity to include the user in the retrieval process more actively in order to close the gap between the meaning of the content and the meaning of a user's query (i.e. his information need). This thesis lays foundation for such an ontology-based interactive retrieval process, in which the retrieval system interacts with a user in order to conceptually interpret the meaning of his query, whereas the underlying domain ontology drives the conceptualisation process. In that way the retrieval process evolves from a query evaluation process into a highly interactive cooperation between a user and the retrieval system, in which the system tries to anticipate the user's information need and to deliver the relevant content proactively. Moreover, the notion of content relevance for a user's query evolves from a content dependent artefact to the multidimensional context-dependent structure, strongly influenced by the user's preferences. This cooperation process is realized as the so-called Librarian Agent Query Refinement Process. In order to clarify the impact of an ontology on the retrieval process (regarding its complexity and quality), a set of methods and tools for different levels of content and query formalisation is developed, ranging from pure ontology-based inferencing to keyword-based querying in which semantics automatically emerges from the results. Our evaluation studies have shown that the possibilities to conceptualize a user's information need in the right manner and to interpret the retrieval results accordingly are key issues for realizing much more meaningful information retrieval systems.
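    The cooperative refinement idea can be pictured with a very small sketch: given a keyword query and a domain ontology, the system proposes narrower concepts as candidate refinements for the user to choose from. The ontology fragment, the suggestion function, and the lack of ranking are invented simplifications; the dissertation's Librarian Agent Query Refinement Process is far more elaborate.

        # Toy illustration (invented ontology, no ranking) of ontology-based query
        # refinement: suggest narrower concepts of the query terms as refinements.
        domain_ontology = {
            # concept -> narrower concepts
            "vehicle": ["car", "bicycle"],
            "car": ["electric car", "sports car"],
            "retrieval": ["information retrieval", "image retrieval"],
        }

        def suggest_refinements(query_terms, ontology, max_suggestions=5):
            suggestions = []
            for term in query_terms:
                for narrower in ontology.get(term, []):
                    # A real system would rank candidates using the ontology structure,
                    # the result set, and the user's preferences; here we only collect them.
                    suggestions.append(f"{term} -> {narrower}")
            return suggestions[:max_suggestions]

        print(suggest_refinements(["car", "retrieval"], domain_ontology))
        # e.g. ['car -> electric car', 'car -> sports car', 'retrieval -> information retrieval', ...]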
    Content
    See: http://digbib.ubka.uni-karlsruhe.de/volltexte/documents/1627.
  7. Blair, D.: Wittgenstein, language and information : "Back to the Rough Ground!" (2006) 0.13
    Abstract
    This book is an extension of the discussions presented in Blair's 1990 book "Language and Representation in Information Retrieval", which was selected as the "Best Information Science Book of the Year" by the American Society for Information Science (ASIS). That work stated that the Philosophy of Language had the best theory for understanding meaning in language, and within the Philosophy of Language, the work of philosopher Ludwig Wittgenstein was found to be most perceptive. The success of that book provided an incentive to look more deeply into Wittgenstein's philosophy of language, and how it can help us to understand how to represent the intellectual content of information. This is what the current title does, and by using this theory it creates a firm foundation for future Information Retrieval research. The work consists of four related parts. Firstly, a brief overview of Wittgenstein's philosophy of language and its relevance to information systems. Secondly, a detailed explanation of Wittgenstein's late philosophy of language and mind. Thirdly, an extended discussion of the relevance of his philosophy to understanding some of the problems inherent in information systems, especially those systems which rely on retrieval based on some representation of the intellectual content of that information. And, fourthly, a series of detailed footnotes which cite the sources of the numerous quotations and provide some discussion of the related issues that the text inspires.
    Footnote
    Review in: Journal of Documentation 63(2007) no.2, S.xxx-xxx (B. Hjoerland)
    LCSH
    Language and languages / Philosophy
    Subject
    Language and languages / Philosophy
  8. Prokop, M.: Hans Jonas and the phenomenological continuity of life and mind (2022) 0.13
    Abstract
    This paper offers a novel interpretation of Hans Jonas' analysis of metabolism, the centrepiece of Jonas' philosophy of organism, in relation to recent controversies regarding the phenomenological dimension of life-mind continuity as understood within 'autopoietic' enactivism (AE). Jonas' philosophy of organism chiefly inspired AE's development of what we might call 'the phenomenological life-mind continuity thesis' (PLMCT), the claim that certain phenomenological features of human experience are central to a proper scientific understanding of both life and mind, and as such central features of all living organisms. After discussing the understanding of PLMCT within AE, and recent criticisms thereof, I develop a reading of Jonas' analysis of metabolism, in light of previous commentators, which emphasizes its systematicity and transcendental flavour. The central thought is that, for Jonas, the attribution of certain phenomenological features is a necessary precondition for our understanding of the possibility of metabolism, rather than being derivable from metabolism itself. I argue that my interpretation strengthens Jonas' contribution to AE's justification for ascribing certain phenomenological features to life across the board. However, it also emphasises the need to complement Jonas' analysis with an explanatory account of organic identity in order to vindicate these phenomenological ascriptions in a scientific context.
  9. Marradi, A.: The concept of concept : concepts and terms (2012) 0.13
    Abstract
    The concept of concept has seldom been examined in its entirety, and the term very seldom defined. The rigidity, or lack thereof, and the homogeneity, or lack thereof, of concepts, are only two of their characteristics that have been debated. These issues are reviewed in this paper, namely: 1) does a concept represent its referent(s), or is it a free creation of the mind?; 2) can a concept be analyzed in parts or elements?; 3) must a concept be general, i.e., refer to a category or a type, or can it refer to a single object, physical or mental?; 4) are concepts as clearly delimited as terms are? Are concepts voiceless terms?; and, 5) what do terms contribute to an individual's and a community's conceptual richness? As regards the relationship of concepts with their referents in the stage of formation, it seems reasonable to conclude that said relationship may be close in some concepts, less close in others, and lacking altogether in some cases. The set of elements of a concept, which varies from individual to individual and across time inside the same individual, is called the intension of a concept. The set of referents of a concept is called the extension of that concept. Most concepts don't have a clearly delimited extension: their referents form a fuzzy set. The aspects of a concept's intension form a scale of generality. A concept is not equal to the term that describes it; rather, many terms are joined to concepts. Language, therefore, renders a gamut of services to the development, consolidation, and communication of conceptual richness.
    Date
    22. 1.2012 13:11:25
    Series
    Forum: The philosophy of classification
  10. Floridi, L.: Open problems in the philosophy of information (2004) 0.12
    Abstract
    The philosophy of information (PI) is a new area of research with its own field of investigation and methodology. This article, based on the Herbert A. Simon Lecture on Computing and Philosophy I gave at Carnegie Mellon University in 2001, analyses the eighteen principal open problems in PI. Section 1 introduces the analysis by outlining Herbert Simon's approach to PI. Section 2 discusses some methodological considerations about what counts as a good philosophical problem. The discussion centers on Hilbert's famous analysis of the central problems in mathematics. The rest of the article is devoted to the eighteen problems. These are organized into five sections: problems in the analysis of the concept of information, in semantics, in the study of intelligence, in the relation between information and nature, and in the investigation of values.
  11. Parrochia, D.; Neuville, D.: Towards a general theory of classifications (2013) 0.12
    Abstract
    This book is an essay on the epistemology of classifications. Its main purpose is not to provide an exposition of an actual mathematical theory of classifications, that is, a general theory which would be available to any kind of them: hierarchical or non-hierarchical, ordinary or fuzzy, overlapping or not overlapping, finite or infinite, and so on, establishing a basis for all possible divisions of the real world. For the moment, such a theory remains nothing but a dream. Instead, the authors essentially put forward a number of key questions. Their aim is rather to reveal the "state of the art" of this dynamic field and the philosophy one may eventually adopt to go further. To this end they present some advances made in the course of the last century, discuss a few tricky problems that remain to be solved, and show the avenues open to those who no longer wish to stay on the wrong track. Researchers and professionals interested in the epistemology and philosophy of science, library science, logic and set theory, order theory or cluster analysis will find this book a comprehensive, original and progressive introduction to the main questions in this field.
    Content
    Philosophical problems / Information / data structures / Empirical clustering and classic hierarchies / Algebra of trees / Generalized classifications / Topology of generalized classifications / Metaclassification / For an axiomatic theory of classifications / Alternative theories and higher infinite / Postscript.
    Date
    8. 9.2016 22:04:09
    LCSH
    Categories (Philosophy)
    Mathematics / Philosophy
    Subject
    Categories (Philosophy)
    Mathematics / Philosophy
  12. Broadfield, A.: The philosophy of classification (1956) 0.12
  13. Ranganathan, S.R.: Philosophy of library classification (1989) 0.12
  14. Theories of information, communication and knowledge : a multidisciplinary approach (2014) 0.12
    Abstract
    This book addresses some of the key questions that scientists have been asking themselves for centuries: what is knowledge? What is information? How do we know that we know something? How do we construct meaning from the perceptions of things? Although no consensus exists on a common definition of the concepts of information and communication, few can reject the hypothesis that information - whether perceived as « object » or as « process » - is a pre-condition for knowledge. Epistemology is the study of how we know things (anglophone meaning) or the study of how scientific knowledge is arrived at and validated (francophone conception). To adopt an epistemological stance is to commit oneself to render an account of what constitutes knowledge or in procedural terms, to render an account of when one can claim to know something. An epistemological theory imposes constraints on the interpretation of human cognitive interaction with the world. It goes without saying that different epistemological theories will have more or less restrictive criteria to distinguish what constitutes knowledge from what is not. If information is a pre-condition for knowledge acquisition, giving an account of how knowledge is acquired should impact our comprehension of information and communication as concepts. While a lot has been written on the definition of these concepts, less research has attempted to establish explicit links between differing theoretical conceptions of these concepts and the underlying epistemological stances. This is what this volume attempts to do. It offers a multidisciplinary exploration of information and communication as perceived in different disciplines and how those perceptions affect theories of knowledge.
    Content
    Introduction; 1. Fidelia Ibekwe-SanJuan and Thomas Dousa.- 2. Cybersemiotics: A new foundation for transdisciplinary theory of information, cognition, meaning, communication and consciousness; Soren Brier.- 3. Epistemology and the Study of Social Information within the Perspective of a Unified Theory of Information; Wolfgang Hofkirchner.- 4. Perception and Testimony as Data Providers; Luciano Floridi.- 5. Human communication from the semiotic perspective; Winfried Noth.- 6. Mind the gap: transitions between concepts of information in varied domains; Lyn Robinson and David Bawden.- 7. Information and the disciplines: A conceptual meta-analysis; Jonathan Furner.- 8. Epistemological Challenges for Information Science; Ian Cornelius.- 9. The nature of information science and its core concepts; Birger Hjorland.- 10. Visual information construing: bistability as a revealer of mediating patterns; Sylvie Leleu-Merviel.- 11. Understanding users' informational constructs via a triadic method approach: a case study; Michel Labour.- 12. Documentary languages and the demarcation of information units in textual information: the case of Julius O. Kaiser's Systematic Indexing
    LCSH
    Knowledge, Theory of
    Semantics (Philosophy)
    Philosophy (General)
    Science / Philosophy
    Social sciences / Philosophy
    Series
    Studies in history and philosophy of science ; 34
    Subject
    Knowledge, Theory of
    Semantics (Philosophy)
    Philosophy (General)
    Science / Philosophy
    Social sciences / Philosophy
  15. Bettella, C.; Carrara, M.: The philosophy of classifying philosophy : preface to special issue (2009) 0.12
    Abstract
    Most of the articles published in this special issue are a selection of the talks given at the workshop on Classifying the Human Sciences: The Case of Philosophy held on February 2, 2007, at the University of Padua (Padua, Italy). The conference was organized by the Library of the Department of Philosophy (University of Padua), in association with the Italian ISKO Chapter, and sponsored by the University Library System of the University of Padua. The aim of the workshop was to discuss themes of knowledge organization for philosophy and classification of philosophical data, specifically in libraries of philosophy. For these reasons experts on classification theory, philosophers, and philosophy librarians were invited to the event. We would like to thank the participants and the organizers of the workshop; special thanks to Prof. Francesca Menegoni and Prof. Luca Illetterati, Directors of the Library of the Department of Philosophy at the University of Padua; Claudio Gnoli, chair of the Italian ISKO Chapter, and Pio Liverotti, Coordinator of the Humanities Libraries at the University of Padua.
    Footnote
    Introduction to a special issue "The philosophy of classifying philosophy"
  16. Lemos, N.M.: An introduction to the theory of knowledge (2007) 0.12
    Abstract
    Epistemology or the theory of knowledge is one of the cornerstones of analytic philosophy, and this book provides a clear and accessible introduction to the subject. It discusses some of the main theories of justification, including foundationalism, coherentism, reliabilism, and virtue epistemology. Other topics include the Gettier problem, internalism and externalism, skepticism, the problem of epistemic circularity, the problem of the criterion, a priori knowledge, and naturalized epistemology. Intended primarily for students taking a first class in epistemology, this lucid and well-written text would also provide an excellent introduction for anyone interested in knowing more about this important area of philosophy.
    Content
    Knowledge, truth, and justification -- The traditional analysis and the Gettier problem -- Foundationalism -- The coherence theory of justification -- Reliabilism and virtue epistemology -- Internalism, externalism, and epistemic circularity -- Skepticism -- The problem of the criterion -- The a priori -- Naturalized epistemology
    LCSH
    Knowledge, Theory of
    Series
    Cambridge introductions to philosophy
    Subject
    Knowledge, Theory of
  17. Abrahamson, J.R.: Mind, evolution, and computers (1994) 0.11
    Abstract
    Science deals with knowledge of the material world based on objective reality, and is under constant attack by those who need magic, that is, concepts based on imagination and desire. Roger Penrose in 'The emperor's new mind' attempts to look beyond objective reality to questions concerning the machinery and method of the operation of the human mind, using the theory that computers will never be able to duplicate the human experience. Shows where Penrose is wrong by reviewing the evolution of men and computers and speculating about where computers might and might not imitate human perception. Warns against the danger of passive acceptance when respected scientists venture into the occult.
    Source
    AI magazine. 15(1994) no.1, S.19-22
  18. Hjoerland, B.: Arguments for philosophical realism in library and information science (2004) 0.11
    Abstract
    The basic realist claim is that a mind-independent reality exists. It should be common sense knowledge to accept this claim, not least because any theory that tries to deny it soon becomes inconsistent: reality strikes back. In spite of this, antirealist philosophies flourish, not only in philosophy but also in the behavioral and cognitive sciences and in information science. This is highly problematic because it removes the attention from reality to subjective phenomena with no real explanatory power. Realism should not be confused with the view that all scientific claims are true or with any other kind of naiveté concerning knowledge claims. The opposite of realism may be termed antirealism, idealism, or nominalism. Although many people confuse empiricism and positivism with realism, these traditions are by nature strongly antirealist, which is why a sharp distinction should be made between empiricism and realism. Empirical research should not be founded on assumptions about "the given" of observations, but should recognize the theory-laden nature of observations. Domain analysis represents an attempt to reintroduce a realist perspective in library and information science. A realist conception of relevance, information seeking, information retrieval, and knowledge organization is outlined. Information systems of all kinds, including research libraries and public libraries, should be informed by a realist philosophy and a realist information science.
    Footnote
    Article in a special issue: The philosophy of information
  19. Bruneau, A.-P.: Geometrical patterns underlying human intelligence : implications in information retrieval (1994) 0.11
    0.108414285 = product of:
      0.18069047 = sum of:
        0.098406665 = weight(_text_:philosophy in 8275) [ClassicSimilarity], result of:
          0.098406665 = score(doc=8275,freq=2.0), product of:
            0.23055021 = queryWeight, product of:
              5.5189433 = idf(docFreq=481, maxDocs=44218)
              0.04177434 = queryNorm
            0.426834 = fieldWeight in 8275, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.5189433 = idf(docFreq=481, maxDocs=44218)
              0.0546875 = fieldNorm(doc=8275)
        0.01935205 = weight(_text_:of in 8275) [ClassicSimilarity], result of:
          0.01935205 = score(doc=8275,freq=12.0), product of:
            0.06532493 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.04177434 = queryNorm
            0.29624295 = fieldWeight in 8275, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.0546875 = fieldNorm(doc=8275)
        0.062931746 = product of:
          0.12586349 = sum of:
            0.12586349 = weight(_text_:mind in 8275) [ClassicSimilarity], result of:
              0.12586349 = score(doc=8275,freq=2.0), product of:
                0.2607373 = queryWeight, product of:
                  6.241566 = idf(docFreq=233, maxDocs=44218)
                  0.04177434 = queryNorm
                0.48272148 = fieldWeight in 8275, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  6.241566 = idf(docFreq=233, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=8275)
          0.5 = coord(1/2)
      0.6 = coord(3/5)
    
    Abstract
    The author argues that there are underlying structures to the mind which may be described as a form of visual intelligence. This idea favors artificial intelligence research directed at studies of geometrical patterns in cognition. He hypothesizes that such patterns may be compared with geographical maps as well as with topological or spatial entities present in most written languages, especially spatially based scripts such as Chinese. A philosophical approach is employed to discuss these issues, most notably the German philosophy of Gestalt and an epistemological critique of the foundation of knowledge. He concludes that such entities may provide the basis for a solid model of intelligence based on formalized geometrical patterns and that this model may be used effectively in a connectionist environment.
  20. Channon, M.: ¬The Stowe table as the definitive periodic system (2011) 0.11
    0.108414285 = product of:
      0.18069047 = sum of:
        0.098406665 = weight(_text_:philosophy in 3445) [ClassicSimilarity], result of:
          0.098406665 = score(doc=3445,freq=2.0), product of:
            0.23055021 = queryWeight, product of:
              5.5189433 = idf(docFreq=481, maxDocs=44218)
              0.04177434 = queryNorm
            0.426834 = fieldWeight in 3445, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.5189433 = idf(docFreq=481, maxDocs=44218)
              0.0546875 = fieldNorm(doc=3445)
        0.01935205 = weight(_text_:of in 3445) [ClassicSimilarity], result of:
          0.01935205 = score(doc=3445,freq=12.0), product of:
            0.06532493 = queryWeight, product of:
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.04177434 = queryNorm
            0.29624295 = fieldWeight in 3445, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              1.5637573 = idf(docFreq=25162, maxDocs=44218)
              0.0546875 = fieldNorm(doc=3445)
        0.062931746 = product of:
          0.12586349 = sum of:
            0.12586349 = weight(_text_:mind in 3445) [ClassicSimilarity], result of:
              0.12586349 = score(doc=3445,freq=2.0), product of:
                0.2607373 = queryWeight, product of:
                  6.241566 = idf(docFreq=233, maxDocs=44218)
                  0.04177434 = queryNorm
                0.48272148 = fieldWeight in 3445, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  6.241566 = idf(docFreq=233, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=3445)
          0.5 = coord(1/2)
      0.6 = coord(3/5)
    
    Abstract
    In the debate between Professors Hjørland and Scerri (Hjørland 2008 and 2011; Scerri 2011), a question is raised as to which of the many periodic tables is best. Perhaps the confusion results in part from an excessively narrow focus on atoms. To answer this question, then, it might help to broaden one's view of classification tables. With this in mind, we should first note that there are some ten major categories of particle phenomena, and there are now classification tables for most of these, notably elementary particles, hadrons (e.g., spin 3/2 baryons), hadron systems (table of nuclides), galaxies, and, interestingly, universes.
    Series
    Forum: The Philosophy of Classification
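
  Note on the relevance figures: the explanations attached to the entries above follow the Lucene ClassicSimilarity pattern, where each term weight is queryWeight x fieldWeight, with queryWeight = idf x queryNorm and fieldWeight = tf x idf x fieldNorm. A minimal Python sketch, assuming the classic formulas tf = sqrt(freq) and idf = 1 + ln(maxDocs / (docFreq + 1)) and taking queryNorm and fieldNorm as reported, reproduces the philosophy weight shown for doc 832 (entry 18):

    import math

    def tf(freq):
        # term-frequency factor: square root of the raw in-document frequency
        return math.sqrt(freq)

    def idf(doc_freq, max_docs):
        # inverse document frequency: 1 + ln(maxDocs / (docFreq + 1))
        return 1.0 + math.log(max_docs / (doc_freq + 1))

    max_docs   = 44218        # maxDocs reported throughout the explanations
    query_norm = 0.04177434   # queryNorm reported throughout the explanations
    field_norm = 0.0390625    # fieldNorm stored for doc 832

    t = tf(6.0)                          # ~2.4494898 (freq=6.0)
    i = idf(481, max_docs)               # ~5.5189433 (docFreq=481)
    query_weight = i * query_norm        # ~0.23055021
    field_weight = t * i * field_norm    # ~0.52807015
    score = query_weight * field_weight  # ~0.12174669

    print(f"{score:.6f}")                # 0.121747, matching the explanation up to rounding

  The same identities account for the mind and philosophy weights in the other entries; only freq, docFreq, and fieldNorm change from document to document.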
