Search (270 results, page 1 of 14)

  • Filter: year_i:[2020 TO 2030}
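  Note: in Lucene/Solr range syntax this filter selects years from 2020 (inclusive) up to, but not including, 2030; the square bracket marks an inclusive bound, the curly brace an exclusive one. A minimal sketch of issuing the same filtered query against a Solr-style endpoint is shown below; the host, collection name, and the assumption of a Solr backend are illustrative, since this page does not reveal the actual system.

    import requests

    # Hypothetical Solr endpoint; the real installation behind this page is not known.
    SOLR_SELECT = "http://localhost:8983/solr/literature/select"

    params = {
        "q": "*:*",                     # placeholder main query
        "fq": "year_i:[2020 TO 2030}",  # filter: 2020 inclusive, 2030 exclusive
        "rows": 20,                     # 20 hits per page, as on this page
        "start": 0,                     # offset 0 = page 1
        "debugQuery": "true",           # ask for per-document score explanations
    }

    resp = requests.get(SOLR_SELECT, params=params)
    resp.raise_for_status()
    print(resp.json()["response"]["numFound"], "results found")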
  1. Barth, T.: Inverse Panopticon : Digitalisierung & Transhumanismus [Transhumanismus II] (2020) 0.07
    0.06929384 = product of:
      0.103940755 = sum of:
        0.0542295 = product of:
          0.108459 = sum of:
            0.108459 = weight(_text_:t in 5592) [ClassicSimilarity], result of:
              0.108459 = score(doc=5592,freq=4.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.6155326 = fieldWeight in 5592, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.078125 = fieldNorm(doc=5592)
          0.5 = coord(1/2)
        0.049711253 = product of:
          0.09942251 = sum of:
            0.09942251 = weight(_text_:i in 5592) [ClassicSimilarity], result of:
              0.09942251 = score(doc=5592,freq=4.0), product of:
                0.16870351 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04472842 = queryNorm
                0.58933276 = fieldWeight in 5592, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.078125 = fieldNorm(doc=5592)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Content
    Cf. Barth, T.: Digitalisierung und Lobby: Transhumanismus I. [12 January 2020]. At: https://www.heise.de/tp/features/Digitalisierung-und-Lobby-Transhumanismus-I-4633314.html?view=print.
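    The score breakdowns shown for each hit are Lucene ClassicSimilarity (TF-IDF) explanations: per matching term, tf = sqrt(freq), idf = ln(maxDocs/(docFreq+1)) + 1, queryWeight = idf * queryNorm, fieldWeight = tf * idf * fieldNorm, and the term score is queryWeight * fieldWeight; coord(m/n) then scales each level by the fraction of query clauses matched. The sketch below recomputes the figures printed for result 1 (doc 5592) to make the arithmetic explicit; it is an illustration only, not code from the retrieval system.

      import math

      QUERY_NORM = 0.04472842          # queryNorm from the explanation above
      MAX_DOCS   = 44218
      FIELD_NORM = 0.078125            # fieldNorm(doc=5592)

      def term_score(freq, doc_freq, field_norm):
          """One term's ClassicSimilarity contribution, as in the explain tree."""
          tf  = math.sqrt(freq)                          # 2.0 for freq=4.0
          idf = math.log(MAX_DOCS / (doc_freq + 1)) + 1  # 3.9394085 for docFreq=2338
          query_weight = idf * QUERY_NORM                # 0.17620352
          field_weight = tf * idf * field_norm           # 0.6155326
          return query_weight * field_weight             # 0.108459

      # The terms "t" and "i" each sit inside a coord(1/2) sub-clause:
      part_t = term_score(4.0, 2338, FIELD_NORM) * 0.5   # 0.0542295
      part_i = term_score(4.0, 2765, FIELD_NORM) * 0.5   # 0.0497113
      total  = (part_t + part_i) * (2 / 3)               # outer coord(2/3)
      print(round(total, 8))                             # ~0.06929384, the listed score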
  2. Gabler, S.: Vergabe von DDC-Sachgruppen mittels eines Schlagwort-Thesaurus (2021) 0.05
    0.05118405 = product of:
      0.07677607 = sum of:
        0.059200495 = product of:
          0.17760149 = sum of:
            0.17760149 = weight(_text_:3a in 1000) [ClassicSimilarity], result of:
              0.17760149 = score(doc=1000,freq=2.0), product of:
                0.37920806 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.04472842 = queryNorm
                0.46834838 = fieldWeight in 1000, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1000)
          0.33333334 = coord(1/3)
        0.01757558 = product of:
          0.03515116 = sum of:
            0.03515116 = weight(_text_:i in 1000) [ClassicSimilarity], result of:
              0.03515116 = score(doc=1000,freq=2.0), product of:
                0.16870351 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04472842 = queryNorm
                0.20836058 = fieldWeight in 1000, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1000)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Content
    Master's thesis, Master of Science (Library and Information Studies) (MSc), Universität Wien. Advisor: Christoph Steiner. Cf. https://www.researchgate.net/publication/371680244_Vergabe_von_DDC-Sachgruppen_mittels_eines_Schlagwort-Thesaurus. DOI: 10.25365/thesis.70030. See also the presentation at: https://wiki.dnb.de/download/attachments/252121510/DA3%20Workshop-Gabler.pdf?version=1&modificationDate=1671093170000&api=v2.
  3. Barth, T.: Digitalisierung und Lobby : Transhumanismus I (2020) 0.05
    0.048505686 = product of:
      0.072758526 = sum of:
        0.037960652 = product of:
          0.075921305 = sum of:
            0.075921305 = weight(_text_:t in 5665) [ClassicSimilarity], result of:
              0.075921305 = score(doc=5665,freq=4.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.4308728 = fieldWeight in 5665, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5665)
          0.5 = coord(1/2)
        0.034797877 = product of:
          0.069595754 = sum of:
            0.069595754 = weight(_text_:i in 5665) [ClassicSimilarity], result of:
              0.069595754 = score(doc=5665,freq=4.0), product of:
                0.16870351 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04472842 = queryNorm
                0.41253293 = fieldWeight in 5665, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5665)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Content
    Cf. the continuation: Barth, T.: Inverse Panopticon: Digitalisierung & Transhumanismus [Transhumanismus II]. [25 January 2020]. At: https://www.heise.de/tp/features/Inverse-Panopticon-Digitalisierung-Transhumanismus-4645668.html?seite=all.
    Source
    https://www.heise.de/tp/features/Digitalisierung-und-Lobby-Transhumanismus-I-4633314.html?view=print
  4. Tappenbeck, I.; Michel, A.; Wittich, A.; Werr, N.; Gäde, M.; Spree, U.; Gläser, C.; Griesbaum, J.; Mandl, T.; Keller-Loibl, K.; Stang, R.: Framework Informationskompetenz : Ein gemeinsamer Standard für die Qualifikation in den bibliotheks- und informationswissenschaftlichen Studiengängen in Deutschland (2022) 0.03
    0.0342987 = product of:
      0.051448047 = sum of:
        0.026842235 = product of:
          0.05368447 = sum of:
            0.05368447 = weight(_text_:t in 540) [ClassicSimilarity], result of:
              0.05368447 = score(doc=540,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.30467308 = fieldWeight in 540, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=540)
          0.5 = coord(1/2)
        0.024605814 = product of:
          0.04921163 = sum of:
            0.04921163 = weight(_text_:i in 540) [ClassicSimilarity], result of:
              0.04921163 = score(doc=540,freq=2.0), product of:
                0.16870351 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04472842 = queryNorm
                0.29170483 = fieldWeight in 540, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=540)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
  5. Eibl, M.; Haupt, J.; Kahl, S.; Taubert, S.; Wilhelm-Stein, T.: Audio- und Musik-Retrieval (2023) 0.03
    0.0342987 = product of:
      0.051448047 = sum of:
        0.026842235 = product of:
          0.05368447 = sum of:
            0.05368447 = weight(_text_:t in 802) [ClassicSimilarity], result of:
              0.05368447 = score(doc=802,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.30467308 = fieldWeight in 802, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=802)
          0.5 = coord(1/2)
        0.024605814 = product of:
          0.04921163 = sum of:
            0.04921163 = weight(_text_:i in 802) [ClassicSimilarity], result of:
              0.04921163 = score(doc=802,freq=2.0), product of:
                0.16870351 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04472842 = queryNorm
                0.29170483 = fieldWeight in 802, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=802)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    The field of audio retrieval can be roughly divided into three areas: music retrieval, retrieval of spoken language, and retrieval of acoustic events. As a rule, all three areas start from the audio signal as the source, which is further processed by signal analysis, usually a spectral analysis via a Fourier transform, and converted into a description suitable for retrieval. There are also alternative approaches, such as the use of MIDI encoding in music retrieval (not discussed here), which manages without an acoustic signal and is already based on a form of encoding suitable for retrieval.
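    As a rough illustration of the spectral-analysis step mentioned in the abstract, the sketch below takes one analysis frame of a signal and computes its magnitude spectrum with a Fourier transform; the sample rate, frame length, window, and the synthetic test tone are illustrative assumptions, not details from the chapter.

      import numpy as np

      SAMPLE_RATE  = 16_000        # assumed sample rate in Hz
      FRAME_LENGTH = 1_024         # assumed analysis frame length in samples

      # A synthetic 440 Hz tone stands in for one frame of a real audio signal.
      t = np.arange(FRAME_LENGTH) / SAMPLE_RATE
      frame = np.sin(2 * np.pi * 440.0 * t)

      # Window the frame and take the magnitude spectrum via an FFT.
      windowed = frame * np.hanning(FRAME_LENGTH)
      spectrum = np.abs(np.fft.rfft(windowed))
      freqs = np.fft.rfftfreq(FRAME_LENGTH, d=1.0 / SAMPLE_RATE)

      peak = freqs[np.argmax(spectrum)]
      print(f"dominant frequency: {peak:.1f} Hz")   # close to 440 Hz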
  6. Krüger, N.; Pianos, T.: Lernmaterialien für junge Forschende in den Wirtschaftswissenschaften als Open Educational Resources (OER) (2021) 0.03
    0.03203502 = product of:
      0.048052527 = sum of:
        0.026842235 = product of:
          0.05368447 = sum of:
            0.05368447 = weight(_text_:t in 252) [ClassicSimilarity], result of:
              0.05368447 = score(doc=252,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.30467308 = fieldWeight in 252, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=252)
          0.5 = coord(1/2)
        0.021210292 = product of:
          0.042420585 = sum of:
            0.042420585 = weight(_text_:22 in 252) [ClassicSimilarity], result of:
              0.042420585 = score(doc=252,freq=2.0), product of:
                0.1566313 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04472842 = queryNorm
                0.2708308 = fieldWeight in 252, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=252)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Date
    22. 5.2021 12:43:05
  2. Tay, A.: The next generation discovery citation indexes : a review of the landscape in 2020 (2020) 0.03
    0.030544072 = product of:
      0.09163222 = sum of:
        0.09163222 = sum of:
          0.04921163 = weight(_text_:i in 40) [ClassicSimilarity], result of:
            0.04921163 = score(doc=40,freq=2.0), product of:
              0.16870351 = queryWeight, product of:
                3.7717297 = idf(docFreq=2765, maxDocs=44218)
                0.04472842 = queryNorm
              0.29170483 = fieldWeight in 40, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.7717297 = idf(docFreq=2765, maxDocs=44218)
                0.0546875 = fieldNorm(doc=40)
          0.042420585 = weight(_text_:22 in 40) [ClassicSimilarity], result of:
            0.042420585 = score(doc=40,freq=2.0), product of:
              0.1566313 = queryWeight, product of:
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.04472842 = queryNorm
              0.2708308 = fieldWeight in 40, product of:
                1.4142135 = tf(freq=2.0), with freq of:
                  2.0 = termFreq=2.0
                3.5018296 = idf(docFreq=3622, maxDocs=44218)
                0.0546875 = fieldNorm(doc=40)
      0.33333334 = coord(1/3)
    
    Abstract
    Conclusion: There is a reason why Google Scholar and Web of Science/Scopus are kings of the hill in their respective arenas. They have strong brand recognition, a head start in development, and a mass of eyeballs and users that leads to an almost virtuous cycle of improvement. Competing against such well-established incumbents is not easy even when one has deep pockets (Microsoft) or a killer idea (scite). It will be interesting to see what the landscape will look like in 2030. Stay tuned for part II, where I review each particular index.
    Date
    17.11.2020 12:22:59
  8. Reichmann, S.; Klebel, T.; Hasani-Mavriqi, I.; Ross-Hellauer, T.: Between administration and research : understanding data management practices in an institutional context (2021) 0.03
    0.029793557 = product of:
      0.044690333 = sum of:
        0.02711475 = product of:
          0.0542295 = sum of:
            0.0542295 = weight(_text_:t in 384) [ClassicSimilarity], result of:
              0.0542295 = score(doc=384,freq=4.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.3077663 = fieldWeight in 384, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=384)
          0.5 = coord(1/2)
        0.01757558 = product of:
          0.03515116 = sum of:
            0.03515116 = weight(_text_:i in 384) [ClassicSimilarity], result of:
              0.03515116 = score(doc=384,freq=2.0), product of:
                0.16870351 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04472842 = queryNorm
                0.20836058 = fieldWeight in 384, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=384)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
  9. Lee, Y.-Y.; Ke, H.; Yen, T.-Y.; Huang, H.-H.; Chen, H.-H.: Combining and learning word embedding with WordNet for semantic relatedness and similarity measurement (2020) 0.03
    0.029398886 = product of:
      0.04409833 = sum of:
        0.02300763 = product of:
          0.04601526 = sum of:
            0.04601526 = weight(_text_:t in 5871) [ClassicSimilarity], result of:
              0.04601526 = score(doc=5871,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.26114836 = fieldWeight in 5871, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.046875 = fieldNorm(doc=5871)
          0.5 = coord(1/2)
        0.0210907 = product of:
          0.0421814 = sum of:
            0.0421814 = weight(_text_:i in 5871) [ClassicSimilarity], result of:
              0.0421814 = score(doc=5871,freq=2.0), product of:
                0.16870351 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04472842 = queryNorm
                0.25003272 = fieldWeight in 5871, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.046875 = fieldNorm(doc=5871)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    In this research, we propose 3 different approaches to measure the semantic relatedness between 2 words: (i) boost the performance of the GloVe word embedding model by removing or transforming abnormal dimensions; (ii) linearly combine the information extracted from WordNet and word embeddings; and (iii) utilize word embeddings and 12 types of linguistic information extracted from WordNet as features for Support Vector Regression. We conducted our experiments on 8 benchmark data sets and computed Spearman correlations between the outputs of our methods and the ground truth. We report our results alongside those of 3 state-of-the-art approaches. The experimental results show that our method can outperform the state-of-the-art approaches on all the selected English benchmark data sets.
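    A minimal sketch of the kind of pipeline described in approach (iii), regressing relatedness scores from combined features with Support Vector Regression and evaluating with Spearman correlation, might look as follows; the feature matrix and gold ratings are random placeholders, not data from the paper.

      import numpy as np
      from scipy.stats import spearmanr
      from sklearn.svm import SVR

      rng = np.random.default_rng(0)

      # Placeholder features per word pair: e.g. an embedding similarity plus
      # WordNet-derived features (values are random here, purely illustrative).
      n_pairs, n_features = 200, 13
      X = rng.normal(size=(n_pairs, n_features))
      gold = rng.uniform(0, 10, size=n_pairs)    # placeholder human relatedness ratings

      # Train/test split and Support Vector Regression, as in the described approach.
      split = 150
      model = SVR(kernel="rbf")
      model.fit(X[:split], gold[:split])
      predictions = model.predict(X[split:])

      rho, _ = spearmanr(predictions, gold[split:])
      print(f"Spearman correlation with gold ratings: {rho:.3f}")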
  10. Lowe, D.B.; Dollinger, I.; Koster, T.; Herbert, B.E.: Text mining for type of research classification (2021) 0.03
    0.029398886 = product of:
      0.04409833 = sum of:
        0.02300763 = product of:
          0.04601526 = sum of:
            0.04601526 = weight(_text_:t in 720) [ClassicSimilarity], result of:
              0.04601526 = score(doc=720,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.26114836 = fieldWeight in 720, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.046875 = fieldNorm(doc=720)
          0.5 = coord(1/2)
        0.0210907 = product of:
          0.0421814 = sum of:
            0.0421814 = weight(_text_:i in 720) [ClassicSimilarity], result of:
              0.0421814 = score(doc=720,freq=2.0), product of:
                0.16870351 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04472842 = queryNorm
                0.25003272 = fieldWeight in 720, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.046875 = fieldNorm(doc=720)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
  11. Slota, S.C.; Fleischmann, K.R.; Lee, M.K.; Greenberg, S.R.; Nigam, I.; Zimmerman, T.; Rodriguez, S.; Snow, J.: A feeling for the data : how government and nonprofit stakeholders negotiate value conflicts in data science approaches to ending homelessness (2023) 0.03
    0.029398886 = product of:
      0.04409833 = sum of:
        0.02300763 = product of:
          0.04601526 = sum of:
            0.04601526 = weight(_text_:t in 969) [ClassicSimilarity], result of:
              0.04601526 = score(doc=969,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.26114836 = fieldWeight in 969, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.046875 = fieldNorm(doc=969)
          0.5 = coord(1/2)
        0.0210907 = product of:
          0.0421814 = sum of:
            0.0421814 = weight(_text_:i in 969) [ClassicSimilarity], result of:
              0.0421814 = score(doc=969,freq=2.0), product of:
                0.16870351 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04472842 = queryNorm
                0.25003272 = fieldWeight in 969, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.046875 = fieldNorm(doc=969)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
  12. Parapar, J.; Losada, D.E.; Presedo-Quindimil, M.A.; Barreiro, A.: Using score distributions to compare statistical significance tests for information retrieval evaluation (2020) 0.03
    0.029352438 = product of:
      0.044028655 = sum of:
        0.019173026 = product of:
          0.038346052 = sum of:
            0.038346052 = weight(_text_:t in 5506) [ClassicSimilarity], result of:
              0.038346052 = score(doc=5506,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.21762364 = fieldWeight in 5506, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5506)
          0.5 = coord(1/2)
        0.024855627 = product of:
          0.049711253 = sum of:
            0.049711253 = weight(_text_:i in 5506) [ClassicSimilarity], result of:
              0.049711253 = score(doc=5506,freq=4.0), product of:
                0.16870351 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04472842 = queryNorm
                0.29466638 = fieldWeight in 5506, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5506)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    Statistical significance tests can provide evidence that the observed difference in performance between 2 methods is not due to chance. In information retrieval (IR), some studies have examined the validity and suitability of such tests for comparing search systems. We argue here that current methods for assessing the reliability of statistical tests suffer from some methodological weaknesses, and we propose a novel way to study significance tests for retrieval evaluation. Using Score Distributions, we model the output of multiple search systems, produce simulated search results from such models, and compare them using various significance tests. A key strength of this approach is that we assess statistical tests under perfect knowledge about the truth or falseness of the null hypothesis. This new method for studying the power of significance tests in IR evaluation is formal and innovative. Following this type of analysis, we found that both the sign test and Wilcoxon signed test have more power than the permutation test and the t-test. The sign test and Wilcoxon signed test also have good behavior in terms of type I errors. The bootstrap test shows few type I errors, but it has less power than the other methods tested.
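    For orientation, the sketch below applies three of the tests compared above, a paired t-test, a Wilcoxon signed-rank test, and a sign test, to per-topic effectiveness scores of two simulated systems; the simulated scores are simple placeholders and do not reproduce the paper's score-distribution models.

      import numpy as np
      from scipy.stats import ttest_rel, wilcoxon, binomtest

      rng = np.random.default_rng(42)

      # Placeholder per-topic effectiveness scores (e.g. AP) for two systems.
      n_topics = 50
      system_a = rng.beta(2, 5, size=n_topics)
      system_b = np.clip(system_a + rng.normal(0.02, 0.05, size=n_topics), 0, 1)
      diff = system_b - system_a

      _, t_p = ttest_rel(system_b, system_a)     # paired t-test
      _, w_p = wilcoxon(system_b, system_a)      # Wilcoxon signed-rank test
      sign_p = binomtest(int((diff > 0).sum()),  # sign test on the direction of differences
                         n=int((diff != 0).sum()), p=0.5).pvalue

      print(f"paired t-test p={t_p:.4f}, Wilcoxon p={w_p:.4f}, sign test p={sign_p:.4f}")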
  13. Midtgarden, T.: Peirce's Classification of the Sciences (2020) 0.02
    0.024499072 = product of:
      0.036748607 = sum of:
        0.019173026 = product of:
          0.038346052 = sum of:
            0.038346052 = weight(_text_:t in 5885) [ClassicSimilarity], result of:
              0.038346052 = score(doc=5885,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.21762364 = fieldWeight in 5885, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5885)
          0.5 = coord(1/2)
        0.01757558 = product of:
          0.03515116 = sum of:
            0.03515116 = weight(_text_:i in 5885) [ClassicSimilarity], result of:
              0.03515116 = score(doc=5885,freq=2.0), product of:
                0.16870351 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04472842 = queryNorm
                0.20836058 = fieldWeight in 5885, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5885)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    Charles Peirce's classification of the sciences was designed shortly after the turn of the twentieth century. The classification has two main sources of inspiration: Comte's science classification and Kant's theoretical philosophy. Peirce's classification, like that of Comte, is hierarchically organised in that the more general and abstract sciences provide principles for the less general and more concrete sciences. However, Peirce includes and assigns a superordinate role to philosophical disciplines which analyse and provide logical, methodological and ontological principles for the specialised sciences, and which are based on everyday life experience. Moreover, Peirce recognises two main branches of specialised empirical science: the natural sciences, on the one hand, and the social sciences, the humanities and psychology on the other. While both branches share logical and methodological principles, they are based on different ontological principles in studying physical nature and the human mind and its products, respectively. Peirce's most basic philosophical discipline, phenomenology, transforms his early engagement with Kant. Peirce's classification of aesthetics, ethics and logic as normative sub-disciplines of philosophy relates to his philosophical pragmatism. Yet his more overarching division between theoretical (philosophical and specialised) sciences and practical sciences may be seen as problematic. Taking Peirce's historical account of scientific developments into consideration, however, I argue that his science classification and its emphasis on the interdependencies between the sciences could be seen as sustaining and supporting interdisciplinarity and interaction across fields of research, even across the divide between theoretical and practical sciences.
  14. Oliphant, T.: Emerging (information) realities and epistemic injustice (2021) 0.02
    0.024499072 = product of:
      0.036748607 = sum of:
        0.019173026 = product of:
          0.038346052 = sum of:
            0.038346052 = weight(_text_:t in 358) [ClassicSimilarity], result of:
              0.038346052 = score(doc=358,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.21762364 = fieldWeight in 358, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=358)
          0.5 = coord(1/2)
        0.01757558 = product of:
          0.03515116 = sum of:
            0.03515116 = weight(_text_:i in 358) [ClassicSimilarity], result of:
              0.03515116 = score(doc=358,freq=2.0), product of:
                0.16870351 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04472842 = queryNorm
                0.20836058 = fieldWeight in 358, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=358)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Abstract
    Emergent realities such as the COVID-19 pandemic and the corresponding "infodemic," the resurgence of Black Lives Matter, climate catastrophe, and fake news, misinformation, disinformation, and so on challenge information researchers to reconsider the limitations and potential of the user-centered paradigm that has guided much library and information studies (LIS) research. In order to engage with these emergent realities, understanding who people are in terms of their social identities and social power, and as epistemic agents (that is, knowers, speakers, listeners, and informants), may provide insight into human information interactions. These are matters of epistemic injustice. Drawing heavily from Miranda Fricker's work Epistemic Injustice: Power & the Ethics of Knowing, I use the concept of epistemic injustice (testimonial, systematic, and hermeneutical injustice) to consider people as epistemic beings rather than "users" in order to potentially illuminate new understandings of the subfields of information behavior and information literacy. Focusing on people as knowers, speakers, listeners, and informants rather than "users" presents an opportunity for information researchers, practitioners, and LIS educators to work in service of the epistemic interests of people and in alignment with liberatory aims.
  15. Zhang, Y.; Wu, D.; Hagen, L.; Song, I.-Y.; Mostafa, J.; Oh, S.; Anderson, T.; Shah, C.; Bishop, B.W.; Hopfgartner, F.; Eckert, K.; Federer, L.; Saltz, J.S.: Data science curriculum in the iField (2023) 0.02
    0.024499072 = product of:
      0.036748607 = sum of:
        0.019173026 = product of:
          0.038346052 = sum of:
            0.038346052 = weight(_text_:t in 964) [ClassicSimilarity], result of:
              0.038346052 = score(doc=964,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.21762364 = fieldWeight in 964, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=964)
          0.5 = coord(1/2)
        0.01757558 = product of:
          0.03515116 = sum of:
            0.03515116 = weight(_text_:i in 964) [ClassicSimilarity], result of:
              0.03515116 = score(doc=964,freq=2.0), product of:
                0.16870351 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04472842 = queryNorm
                0.20836058 = fieldWeight in 964, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=964)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
  16. Noever, D.; Ciolino, M.: The Turing deception (2022) 0.02
    0.023680199 = product of:
      0.07104059 = sum of:
        0.07104059 = product of:
          0.21312177 = sum of:
            0.21312177 = weight(_text_:3a in 862) [ClassicSimilarity], result of:
              0.21312177 = score(doc=862,freq=2.0), product of:
                0.37920806 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.04472842 = queryNorm
                0.56201804 = fieldWeight in 862, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.046875 = fieldNorm(doc=862)
          0.33333334 = coord(1/3)
      0.33333334 = coord(1/3)
    
    Source
    https://arxiv.org/abs/2212.06721
  17. Bergman, O.; Israeli, T.; Whittaker, S.: Factors hindering shared files retrieval (2020) 0.02
    0.022882156 = product of:
      0.034323234 = sum of:
        0.019173026 = product of:
          0.038346052 = sum of:
            0.038346052 = weight(_text_:t in 5843) [ClassicSimilarity], result of:
              0.038346052 = score(doc=5843,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.21762364 = fieldWeight in 5843, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5843)
          0.5 = coord(1/2)
        0.015150209 = product of:
          0.030300418 = sum of:
            0.030300418 = weight(_text_:22 in 5843) [ClassicSimilarity], result of:
              0.030300418 = score(doc=5843,freq=2.0), product of:
                0.1566313 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04472842 = queryNorm
                0.19345059 = fieldWeight in 5843, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5843)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Date
    20. 1.2015 18:30:22
  18. Gorichanaz, T.: Sanctuary : an institutional vision for the digital age (2021) 0.02
    0.022882156 = product of:
      0.034323234 = sum of:
        0.019173026 = product of:
          0.038346052 = sum of:
            0.038346052 = weight(_text_:t in 107) [ClassicSimilarity], result of:
              0.038346052 = score(doc=107,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.21762364 = fieldWeight in 107, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=107)
          0.5 = coord(1/2)
        0.015150209 = product of:
          0.030300418 = sum of:
            0.030300418 = weight(_text_:22 in 107) [ClassicSimilarity], result of:
              0.030300418 = score(doc=107,freq=2.0), product of:
                0.1566313 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04472842 = queryNorm
                0.19345059 = fieldWeight in 107, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=107)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Date
    22. 1.2021 14:20:55
  19. Huang, T.; Nie, R.; Zhao, Y.: Archival knowledge in the field of personal archiving : an exploratory study based on grounded theory (2021) 0.02
    0.022882156 = product of:
      0.034323234 = sum of:
        0.019173026 = product of:
          0.038346052 = sum of:
            0.038346052 = weight(_text_:t in 173) [ClassicSimilarity], result of:
              0.038346052 = score(doc=173,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.21762364 = fieldWeight in 173, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=173)
          0.5 = coord(1/2)
        0.015150209 = product of:
          0.030300418 = sum of:
            0.030300418 = weight(_text_:22 in 173) [ClassicSimilarity], result of:
              0.030300418 = score(doc=173,freq=2.0), product of:
                0.1566313 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04472842 = queryNorm
                0.19345059 = fieldWeight in 173, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=173)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Date
    22. 1.2021 14:20:27
  20. Haimson, O.L.; Carter, A.J.; Corvite, S.; Wheeler, B.; Wang, L.; Liu, T.; Lige, A.: The major life events taxonomy : social readjustment, social media information sharing, and online network separation during times of life transition (2021) 0.02
    0.022882156 = product of:
      0.034323234 = sum of:
        0.019173026 = product of:
          0.038346052 = sum of:
            0.038346052 = weight(_text_:t in 263) [ClassicSimilarity], result of:
              0.038346052 = score(doc=263,freq=2.0), product of:
                0.17620352 = queryWeight, product of:
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.04472842 = queryNorm
                0.21762364 = fieldWeight in 263, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.9394085 = idf(docFreq=2338, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=263)
          0.5 = coord(1/2)
        0.015150209 = product of:
          0.030300418 = sum of:
            0.030300418 = weight(_text_:22 in 263) [ClassicSimilarity], result of:
              0.030300418 = score(doc=263,freq=2.0), product of:
                0.1566313 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.04472842 = queryNorm
                0.19345059 = fieldWeight in 263, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=263)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    Date
    10. 6.2021 19:22:47

Languages

  • e 193
  • d 76
  • m 1

Types

  • a 243
  • el 68
  • m 8
  • p 4
  • s 3
  • x 2