Search (402 results, page 1 of 21)

  • Filter: type_ss:"el"
  1. Kleineberg, M.: Context analysis and context indexing : formal pragmatics in knowledge organization (2014) 0.41
    0.40794837 = product of:
      0.81589675 = sum of:
        0.08158968 = product of:
          0.24476902 = sum of:
            0.24476902 = weight(_text_:3a in 1826) [ClassicSimilarity], result of:
              0.24476902 = score(doc=1826,freq=2.0), product of:
                0.26131085 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.030822188 = queryNorm
                0.93669677 = fieldWeight in 1826, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.078125 = fieldNorm(doc=1826)
          0.33333334 = coord(1/3)
        0.24476902 = weight(_text_:2f in 1826) [ClassicSimilarity], result of:
          0.24476902 = score(doc=1826,freq=2.0), product of:
            0.26131085 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.030822188 = queryNorm
            0.93669677 = fieldWeight in 1826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.078125 = fieldNorm(doc=1826)
        0.24476902 = weight(_text_:2f in 1826) [ClassicSimilarity], result of:
          0.24476902 = score(doc=1826,freq=2.0), product of:
            0.26131085 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.030822188 = queryNorm
            0.93669677 = fieldWeight in 1826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.078125 = fieldNorm(doc=1826)
        0.24476902 = weight(_text_:2f in 1826) [ClassicSimilarity], result of:
          0.24476902 = score(doc=1826,freq=2.0), product of:
            0.26131085 = queryWeight, product of:
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.030822188 = queryNorm
            0.93669677 = fieldWeight in 1826, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              8.478011 = idf(docFreq=24, maxDocs=44218)
              0.078125 = fieldNorm(doc=1826)
      0.5 = coord(4/8)
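     Read as a Lucene ClassicSimilarity "explain" tree, the factors above multiply exactly as listed: tf = sqrt(termFreq), idf = 1 + ln(maxDocs/(docFreq+1)), queryWeight = idf * queryNorm, fieldWeight = tf * idf * fieldNorm, and the clause scores are combined by the coord() factors. A minimal sketch reproducing the top-ranked score from the factors listed above (queryNorm and fieldNorm taken verbatim from the listing):

```python
import math

# ClassicSimilarity factors as shown in the breakdown for result 1 (doc 1826)
max_docs, doc_freq = 44218, 24
idf = 1 + math.log(max_docs / (doc_freq + 1))    # ~ 8.478011
query_norm = 0.030822188                         # from the listing
query_weight = idf * query_norm                  # ~ 0.26131085

tf = math.sqrt(2.0)                              # tf = sqrt(termFreq), freq = 2
field_norm = 0.078125
field_weight = tf * idf * field_norm             # ~ 0.93669677

term_weight = query_weight * field_weight        # ~ 0.24476902

# one "3a" clause scaled by coord(1/3), three "2f" clauses, then coord(4/8)
score = 0.5 * (term_weight * (1 / 3) + 3 * term_weight)
```

The same arithmetic, with different fieldNorm values, accounts for every score on this page.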
    
    Source
    http://digbib.ubka.uni-karlsruhe.de/volltexte/documents/3131107
  2. Popper, K.R.: Three worlds : the Tanner lecture on human values. Delivered at the University of Michigan, April 7, 1978 (1978) 0.33
    
    Source
    https://tannerlectures.utah.edu/_documents/a-to-z/p/popper80.pdf
  3. Shala, E.: Die Autonomie des Menschen und der Maschine : gegenwärtige Definitionen von Autonomie zwischen philosophischem Hintergrund und technologischer Umsetzbarkeit (2014) 0.20
    
    Footnote
    Cf.: https://www.researchgate.net/publication/271200105_Die_Autonomie_des_Menschen_und_der_Maschine_-_gegenwartige_Definitionen_von_Autonomie_zwischen_philosophischem_Hintergrund_und_technologischer_Umsetzbarkeit_Redigierte_Version_der_Magisterarbeit_Karls
  4. Schleim, S.: Körper ist Geist (2017) 0.20
    
    Abstract
    In reference to: Lehmann, K.: Denken mit Leib und Seele. At: https://www.heise.de/tp/features/Denken-mit-Leib-und-Seele-3593478.html. In his article of 27 May, Konrad Lehmann pointed to the problems of explaining consciousness scientifically. He criticized the unfulfilled promise of materialism, or physicalism, and discussed alternative approaches such as panpsychism, idealism, and practical materialism. In my view, statements about being (that is, ontological statements) must be separated more clearly from statements about our knowledge (that is, epistemic statements) than Lehmann has done. Moreover, a problem should be defined more clearly before it is discussed. After a reply to Lehmann's article, I would like to put my own thoughts on the mind-body problem up for discussion. In doing so, I also address claims about what the neurosciences contribute to an understanding of consciousness. My proposal is to abandon the traditional mind-body problem, since it is a problem of our thinking rather than a problem of the world. Rejoinder: Lehmann, K.: Geist? Welcher Geist?. [4 June 2017]. At: https://www.heise.de/tp/features/Geist-Welcher-Geist-3733426.html.
  5. Koch, M.: Bewusstsein ohne Gehirn : kann man den Geist eines Menschen auf eine Maschine übertragen? (2018) 0.08
    
    Abstract
    The question of how body and soul, or matter and mind, relate to one another has long occupied scientists and philosophers. A much-cited answer was given in the mid-19th century by the German-Swiss naturalist Carl Vogt. In his "Physiologische Briefe" he declared "that all those faculties we comprehend under the name of mental activities are merely functions of the brain substance, or, to put it somewhat crudely, that thoughts stand in the same relation to the brain as bile does to the liver or urine to the kidneys." One who firmly contradicted Vogt on this point was Vladimir I. Lenin. In his book "Materialism and Empirio-criticism" he labeled Vogt's view with the unflattering adjective "vulgar-materialist": sensations, and the thoughts derived from them, are not material but subjective images of the objective world, according to Lenin. Yet he held on to one idea, describing mind and consciousness as the "highest products of matter organized in a particular way", namely the brain.
  6. Wilke, M.; Pauen, M.; Ayan, S.: »Wir überschätzen die Rolle des Bewusstseins systematisch« : Leib-Seele-Problem (2022) 0.07
    
    Abstract
    Over the past 20 years, scientists have learned a great deal about consciousness. One of the biggest advances: consciousness is now an established subject of empirical research, say the neuroscientist Melanie Wilke and the philosopher Michael Pauen. In this interview, the two explain which hurdles researchers still face and how they intend to finally crack the "hard nut" of the mind-body problem: a conversation about mind, brain, and their relation to one another.
    Source
    https://www.spektrum.de/news/leib-seele-problem-was-wissen-wir-ueber-das-bewusstsein/1974235?utm_source=pocket-newtab-global-de-DE
  7. Lehmann, K.: Geist? Welcher Geist? (2017) 0.06
    
    Abstract
    Reply to the reply by Stephan Schleim: before one can talk about the mind-body problem, one must first agree on what one is talking about at all. Response to: Schleim, S.: Körper ist Geist. [31.05.2017]. At: https://www.heise.de/tp/features/Koerper-ist-Geist-3729372.html.
  8. Lehmann, K.: Denken mit Leib und Seele (2017) 0.05
    
    Abstract
    In 1999, at a conference of neuroscientists and philosophers of mind, I heard a talk by the well-known neurophilosopher Patricia Churchland. In the discussion that followed, a member of the audience asked her the Big Question: how, then, are brain and consciousness connected? Churchland answered (at least that is how my creative memory has stored it): well, that is what we were here for, to learn that from the neurobiologists. It was philosophy's declaration of bankruptcy: consciousness as a biological problem. At the time, however, the capitulation of the philosophers of mind was understandable, for the neurosciences were demonstrating almost daily the tight coupling of brain function and experiential content, the bizarre deficits of neurological patients with precisely defined lesions, and biochemical, drug-induced alterations of consciousness. Response: Schleim, S.: Körper ist Geist. [31.05.2017]. At: https://www.heise.de/tp/features/Koerper-ist-Geist-3729372.html, and a further rejoinder by Lehmann at: https://www.heise.de/tp/features/Geist-Welcher-Geist-3733426.html.
    Source
    https://www.heise.de/tp/features/Denken-mit-Leib-und-Seele-3593478.html
  9. Horgan, J.: Hängt das Universum von uns ab? (2017) 0.02
    
    Content
    "Der kühle, harte Skeptiker in mir lehnt jedoch den Neo-Geozentrismus als genau die Art von geistig verschwommenem Mystizismus ab, von dem uns die Wissenschaft befreit hat. Der Neo-Geozentrismus verkörpert die Projektion unserer Ängste und Hoffnungen, unsere Sehnsucht nach Bedeutung. Seine wachsende Popularität ist vielleicht auch ein Symptom für die durch soziale Medien verursachte Selbstverliebtheit unserer Zeit. Aber nicht allein der Neo-Geozentrismus geht mir auf die Nerven. Ich lehne auch militanten Materialismus und Atheismus ab, die unser Verlangen nach transzendenter Bedeutung herabsetzen und das außergewöhnlich Unwahrscheinliche unserer Existenz nicht wahrzunehmen scheinen. Denn letztlich: Ohne einen bewussten Geist, der nachdenkt, könnte unser Universum ebenso gut nicht existieren. Wofür also spreche ich mich aus? Für die einfache Anerkennung der Tatsache, dass keine Theorie oder Theologie dem Mysterium unserer Existenz gerecht werden kann. Ein solcher moderater Agnostizismus ist, so scheint es mir, das, was ein Homo sapiens wählen würde."
  10. Schmid, E.: Variationen zu Poppers Drei-Welten-Lehre : Gedanken zu einer phänomenologischen und kulturellen Basis von Poppers drei Welten in Handlungsgemeinschaften (2018) 0.02
    
    Abstract
    The guiding idea of the approach pursued in this work is to combine Popper's three-worlds doctrine with the program of so-called "methodical culturalism" (MK). In this work I want to explore the potential of MK for clarifying questions about Popper's worlds. I regard the discussion of the first-person perspective (1P) and its relation to the third-person perspective (3P) as a further important facet of these questions. I want to show that an approach like that of MK, extended by phenomenological analyses, partly sharpens Popper's three worlds and partly lets them appear in a somewhat different light. #Kultur #Erst-Person-Perspektive #1P #Dritt-Person-Perspektive #3P #Leib-Seele-Problematik
  11. Lehmann, K.: Neues vom Gehirn : Essays zu Erkenntnissen der Neurobiologie (2017) 0.02
    
    Content
    - Was wollen wir tun?: Warum nicht einfach aufgeben? -- Das war schön! Nochmal! -- Warum wollen Sie diesen Artikel lesen? -- Einsamkeit ist ein Grundgefühl -- Das mütterliche Gehirn
    - Was können wir wissen? Wie entsteht Wissen?: Was Hirnchen nicht lernt ... -- Schlaf, der Gedächtnisgärtner -- Unser Gehirn kartiert auch Beziehungen räumlich -- Die leuchtende Spur der Erinnerung -- Neuronales Upgrade -- Warum erzeugt das Gehirn neue Neuronen? -- Frischzellen fürs Gehirn
    - Was ist der Mensch?: Gleichgeschaltete Meisen -- Von wegen: "Spatzenhirn" -- Ich verdaue, also bin ich -- Ideen aus dem neuronalen Untergrund -- Homo musicus
    - Was können wir hoffen?: Denken mit Leib und Seele
    Footnote
    Cf. also the contribution at: https://www.heise.de/tp/features/Denken-mit-Leib-und-Seele-3593478.html.
  12. Widmann, V.: Die Antiquiertheit der Seele (2019) 0.01
    
    Abstract
    The transhumanists are on the march. This expressly technophile movement is causing a stir with outrageous theses and provocative slogans. Now comes the attack on the human soul.
    Source
    https://www.heise.de/tp/features/Die-Antiquiertheit-der-Seele-4408867.html?view=print
  13. Dietz, K.: en.wikipedia.org > 6 Mio. Artikel (2020) 0.01
    
    Content
    "Die Englischsprachige Wikipedia verfügt jetzt über mehr als 6 Millionen Artikel. An zweiter Stelle kommt die deutschsprachige Wikipedia mit 2.3 Millionen Artikeln, an dritter Stelle steht die französischsprachige Wikipedia mit 2.1 Millionen Artikeln (via Researchbuzz: Firehose <https://rbfirehose.com/2020/01/24/techcrunch-wikipedia-now-has-more-than-6-million-articles-in-english/> und Techcrunch <https://techcrunch.com/2020/01/23/wikipedia-english-six-million-articles/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29&guccounter=1&guce_referrer=aHR0cHM6Ly9yYmZpcmVob3NlLmNvbS8yMDIwLzAxLzI0L3RlY2hjcnVuY2gtd2lraXBlZGlhLW5vdy1oYXMtbW9yZS10aGFuLTYtbWlsbGlvbi1hcnRpY2xlcy1pbi1lbmdsaXNoLw&guce_referrer_sig=AQAAAK0zHfjdDZ_spFZBF_z-zDjtL5iWvuKDumFTzm4HvQzkUfE2pLXQzGS6FGB_y-VISdMEsUSvkNsg2U_NWQ4lwWSvOo3jvXo1I3GtgHpP8exukVxYAnn5mJspqX50VHIWFADHhs5AerkRn3hMRtf_R3F1qmEbo8EROZXp328HMC-o>). 250120 via digithek ch = #fineBlog s.a.: Angesichts der Veröffentlichung des 6-millionsten Artikels vergangene Woche in der englischsprachigen Wikipedia hat die Community-Zeitungsseite "Wikipedia Signpost" ein Moratorium bei der Veröffentlichung von Unternehmensartikeln gefordert. Das sei kein Vorwurf gegen die Wikimedia Foundation, aber die derzeitigen Maßnahmen, um die Enzyklopädie gegen missbräuchliches undeklariertes Paid Editing zu schützen, funktionierten ganz klar nicht. *"Da die ehrenamtlichen Autoren derzeit von Werbung in Gestalt von Wikipedia-Artikeln überwältigt werden, und da die WMF nicht in der Lage zu sein scheint, dem irgendetwas entgegenzusetzen, wäre der einzige gangbare Weg für die Autoren, fürs erste die Neuanlage von Artikeln über Unternehmen zu untersagen"*, schreibt der Benutzer Smallbones in seinem Editorial <https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2020-01-27/From_the_editor> zur heutigen Ausgabe."
  14. Heflin, J.; Hendler, J.: Semantic interoperability on the Web (2000) 0.01
    
    Abstract
    XML will have a profound impact on the way data is exchanged on the Internet. An important feature of this language is the separation of content from presentation, which makes it easier to select and/or reformat the data. However, due to the likelihood of numerous industry and domain specific DTDs, those who wish to integrate information will still be faced with the problem of semantic interoperability. In this paper we discuss why this problem is not solved by XML, and then discuss why the Resource Description Framework is only a partial solution. We then present the SHOE language, which we feel has many of the features necessary to enable a semantic web, and describe an existing set of tools that make it easy to use the language.
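The gap the abstract describes can be illustrated with a toy example (the documents, tag names, and mapping table below are invented for illustration; this is not SHOE syntax). XML parses both documents happily, but the shared meaning has to be supplied by hand:

```python
import xml.etree.ElementTree as ET

# Two hypothetical industry vocabularies express the same fact differently.
doc_a = "<publication><creator>J. Heflin</creator></publication>"
doc_b = "<paper><author>J. Heflin</author></paper>"

# XML alone carries no shared semantics: an integrator must maintain
# a mapping table between the vocabularies by hand.
MAPPING = {
    ("publication", "creator"): "author",
    ("paper", "author"): "author",
}

def extract_author(xml_text):
    """Read the author out of either vocabulary via the hand-built mapping."""
    root = ET.fromstring(xml_text)
    for child in root:
        if MAPPING.get((root.tag, child.tag)) == "author":
            return child.text
    return None

print(extract_author(doc_a))  # J. Heflin
print(extract_author(doc_b))  # J. Heflin
```

Every new DTD grows this table quadratically, which is the maintenance burden that schemes like RDF and SHOE aim to reduce.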
    Date
    11. 5.2013 19:22:18
  15. Nicholson, D.: High-Level Thesaurus (HILT) project : interoperability and cross-searching distributed services (200?) 0.01
    
    Abstract
    My presentation is about the HILT, High Level Thesaurus Project, which is looking, very roughly speaking, at how we might deal with interoperability problems relating to cross-searching distributed services by subject. The aims of HILT are to study and report on the problem of cross-searching and browsing by subject across a range of communities, services, and service or resource types in the UK given the wide range of subject schemes and associated practices in place
    Date
    13. 4.2008 12:29:16
  16. Faro, S.; Francesconi, E.; Marinai, E.; Sandrucci, V.: Report on execution and results of the interoperability tests (2008) 0.01
    
    Abstract
    - Formal characterization given to the thesaurus mapping problem
    - Interoperability workflow
      - Thesauri SKOS Core transformation
      - Thesaurus Mapping algorithms implementation
    - The "gold standard" data set and the THALEN application
    - Thesaurus interoperability assessment measures
    - Experimental results
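The SKOS Core transformation step listed above can be sketched as follows; the record fields, identifiers, and base URI are invented assumptions, not the project's actual data model:

```python
# Serialize one thesaurus record as SKOS Core in Turtle syntax.
# The record and the namespace base are hypothetical examples.
record = {"id": "c_123", "prefLabel": "data protection",
          "altLabels": ["privacy protection"], "broader": "c_100"}

def to_skos_turtle(rec, base="http://example.org/thes/"):
    """Emit a skos:Concept for a flat thesaurus record."""
    lines = [f"<{base}{rec['id']}> a skos:Concept ;",
             f'    skos:prefLabel "{rec["prefLabel"]}"@en ;']
    for alt in rec.get("altLabels", []):
        lines.append(f'    skos:altLabel "{alt}"@en ;')
    if rec.get("broader"):
        lines.append(f"    skos:broader <{base}{rec['broader']}> ;")
    # Turtle statements end with "." rather than ";".
    lines[-1] = lines[-1].rstrip(" ;") + " ."
    return "\n".join(lines)

print(to_skos_turtle(record))
```

Once both thesauri are in a common SKOS representation, the mapping algorithms can operate on uniform concept/label structures instead of heterogeneous source formats.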
    Date
    7.11.2008 10:40:22
  17. Faro, S.; Francesconi, E.; Sandrucci, V.: Thesauri KOS analysis and selected thesaurus mapping methodology on the project case-study (2007) 0.01
    
    Abstract
    - Introduction to the Thesaurus Interoperability problem
    - Analysis of the thesauri for the project case study
    - Overview of Schema/Ontology Mapping methodologies
    - The proposed approach for thesaurus mapping
    - Standards for implementing the proposed methodology
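The mapping step outlined above can be illustrated with a deliberately naive exact-label matcher; the terms and identifiers are invented, and the project's actual methodology is considerably more sophisticated (handling synonyms, hierarchy, and partial matches):

```python
# A minimal candidate-mapping generator between two thesauri:
# match concepts whose labels are identical up to case.
thesaurus_a = {"T1": "data protection", "T2": "copyright"}
thesaurus_b = {"K9": "Copyright", "K10": "privacy"}

def map_by_label(a, b):
    """Return exact (case-insensitive) label matches as candidate mappings."""
    index = {label.lower(): tid for tid, label in b.items()}
    return {tid: index[label.lower()]
            for tid, label in a.items() if label.lower() in index}

print(map_by_label(thesaurus_a, thesaurus_b))  # {'T2': 'K9'}
```

Against a gold-standard mapping set, the assessment measures the report mentions (e.g. precision and recall of such candidate pairs) can then be computed.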
    Date
    7.11.2008 10:40:22
  18. Rötzer, F.: Psychologen für die Künstliche Intelligenz (2018) 0.01
    
    Abstract
    Artificial intelligence can learn from experience and from specifications. The problem, however, is that while machine learning can be steered in a particular direction to solve particular problems, the programmer sometimes no longer knows what is going on inside the "artificial brain", i.e. the neural network, which with growing complexity and accumulated learning becomes a black box for humans: one can only see what data goes in and what comes out.
    Date
    22. 1.2018 11:08:27
  19. Mayo, D.; Bowers, K.: ¬The devil's shoehorn : a case study of EAD to ArchivesSpace migration at a large university (2017) 0.01
    
    Abstract
    A band of archivists and IT professionals at Harvard took on a project to convert nearly two million descriptions of archival collection components from marked-up text into the ArchivesSpace archival metadata management system. Starting in the mid-1990s, Harvard was an alpha implementer of EAD, an SGML (later XML) text markup language for electronic inventories, indexes, and finding aids that archivists use to wend their way through the sometimes quirky filing systems that bureaucracies establish for their records or the utter chaos in which some individuals keep their personal archives. These pathfinder documents, designed to cope with messy reality, can themselves be difficult to classify. Portions of them are rigorously structured, while other parts are narrative. Early documents predate the establishment of the standard; many feature idiosyncratic encoding that had been through several machine conversions, while others were freshly encoded and fairly consistent. In this paper, we will cover the practical and technical challenges involved in preparing a large (900MiB) corpus of XML for ingest into an open-source archival information system (ArchivesSpace). This case study will give an overview of the project, discuss problem discovery and problem solving, and address the technical challenges, analysis, solutions, and decisions and provide information on the tools produced and lessons learned. The authors of this piece are Kate Bowers, Collections Services Archivist for Metadata, Systems, and Standards at the Harvard University Archive, and Dave Mayo, a Digital Library Software Engineer for Harvard's Library and Technology Services. Kate was heavily involved in both metadata analysis and later problem solving, while Dave was the sole full-time developer assigned to the migration project.
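One preparation task the case study implies (enumerating component descriptions in an EAD finding aid prior to ingest) can be sketched as follows. The element names follow EAD conventions, but the fragment is invented and far more regular than the corpus the authors describe:

```python
import xml.etree.ElementTree as ET

# A hypothetical, well-formed EAD-style fragment; real finding aids in the
# Harvard corpus were larger and encoded far less consistently.
ead = """<ead><archdesc><dsc>
  <c01><did><unittitle>Correspondence</unittitle></did>
    <c02><did><unittitle>Letters, 1901</unittitle></did></c02>
  </c01>
</dsc></archdesc></ead>"""

root = ET.fromstring(ead)
# Walk every component description and collect its title, a first step
# toward mapping markup into structured archival metadata records.
titles = [ut.text for ut in root.iter("unittitle")]
print(titles)  # ['Correspondence', 'Letters, 1901']
```

At two million components, even this simple traversal has to contend with documents that fail to parse, which is where the problem discovery and problem solving the paper describes begins.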
    Date
    31. 1.2017 13:29:56
  20. Teal, W.: Alma enumerator : automating repetitive cataloging tasks with Python (2018) 0.01
    
    Abstract
    In June 2016, the Wartburg College library migrated to a new integrated library system, Alma. In the process, we lost the enumeration and chronology data for roughly 79,000 print serial item records. Re-entering all this data by hand seemed an unthinkable task. Fortunately, the information was recorded as free text in each item's description field. By using Python, Alma's API and much trial and error, the Wartburg College library was able to parse the serial item descriptions into enumeration and chronology data that was uploaded back into Alma. This paper discusses the design and feasibility considerations addressed in trying to solve this problem, the complications encountered during development, and the highlights and shortcomings of the collection of Python scripts that became Alma Enumerator.
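The core parsing idea can be sketched roughly as follows; the description format, regular expression, and field names are simplified assumptions, not the actual Alma Enumerator code, and the real Wartburg data required much more trial and error:

```python
import re

# Assume descriptions like "v.12 no.3 (Mar 1998)"; real data was messier.
DESC_RE = re.compile(r"v\.(?P<vol>\d+)\s+no\.(?P<no>\d+)\s+\((?P<chron>[^)]+)\)")

def parse_description(desc):
    """Split a free-text serial description into enumeration and chronology."""
    m = DESC_RE.match(desc)
    if not m:
        return None  # flag for manual review instead of guessing
    return {"enum_a": m.group("vol"),   # volume
            "enum_b": m.group("no"),    # issue number
            "chron": m.group("chron")}  # chronology, e.g. "Mar 1998"

print(parse_description("v.12 no.3 (Mar 1998)"))
# {'enum_a': '12', 'enum_b': '3', 'chron': 'Mar 1998'}
```

Records the pattern cannot match are left untouched and reported, so only confidently parsed enumeration/chronology values are written back through the API.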
    Date
    10.11.2018 16:29:37

Languages

  • e 199
  • d 196
  • el 2
  • a 1
  • nl 1

Types

  • a 203
  • i 21
  • m 8
  • s 6
  • r 5
  • b 3
  • p 3
  • x 3
  • n 1