Search (1366 results, page 2 of 69)

  • Active filter: type_ss:"el"
  1. Gutknecht, C.: Zahlungen der ETH Zürich an Elsevier, Springer und Wiley nun öffentlich (2015) 0.04
    0.03870084 = product of:
      0.1290028 = sum of:
        0.02829376 = weight(_text_:23 in 4324) [ClassicSimilarity], result of:
          0.02829376 = score(doc=4324,freq=4.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.39200652 = fieldWeight in 4324, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4324)
        0.02829376 = weight(_text_:23 in 4324) [ClassicSimilarity], result of:
          0.02829376 = score(doc=4324,freq=4.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.39200652 = fieldWeight in 4324, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4324)
        0.017107777 = weight(_text_:und in 4324) [ClassicSimilarity], result of:
          0.017107777 = score(doc=4324,freq=10.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.38329202 = fieldWeight in 4324, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4324)
        0.02829376 = weight(_text_:23 in 4324) [ClassicSimilarity], result of:
          0.02829376 = score(doc=4324,freq=4.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.39200652 = fieldWeight in 4324, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4324)
        0.017377444 = weight(_text_:der in 4324) [ClassicSimilarity], result of:
          0.017377444 = score(doc=4324,freq=10.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.38630107 = fieldWeight in 4324, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4324)
        0.009636286 = product of:
          0.019272571 = sum of:
            0.019272571 = weight(_text_:29 in 4324) [ClassicSimilarity], result of:
              0.019272571 = score(doc=4324,freq=2.0), product of:
                0.070840135 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.02013827 = queryNorm
                0.27205724 = fieldWeight in 4324, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=4324)
          0.5 = coord(1/2)
      0.3 = coord(6/20)
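
     The score breakdowns on this page all follow the same TF-IDF pattern. As a reading aid (not part of the original result listing), the following minimal Python sketch reproduces the figures of this first entry from the values printed in its explain tree, assuming the standard Lucene ClassicSimilarity formulas: tf = sqrt(freq), idf = 1 + ln(maxDocs/(docFreq+1)), queryWeight = idf * queryNorm, fieldWeight = tf * idf * fieldNorm, clause score = queryWeight * fieldWeight, and document score = coord * sum of the matching clause scores.

```python
import math

# Values copied from the explain tree of result 1 (doc 4324), term "23".
max_docs   = 44218
doc_freq   = 3336
query_norm = 0.02013827
field_norm = 0.0546875
freq       = 4.0

idf = 1 + math.log(max_docs / (doc_freq + 1))   # 3.5840597
tf  = math.sqrt(freq)                           # 2.0

query_weight = idf * query_norm                 # 0.07217676
field_weight = tf * idf * field_norm            # 0.39200652
clause_score = query_weight * field_weight      # 0.02829376

# Document score: sum of the six matching clause scores times coord(6/20).
clause_sum = 3 * 0.02829376 + 0.017107777 + 0.017377444 + 0.009636286
doc_score  = clause_sum * (6 / 20)              # 0.03870084

print(round(clause_score, 8), round(doc_score, 8))
```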
    
    Abstract
     What does the ETH Library pay to Elsevier, Springer and Wiley? The answer to this simple question is now available, after a good 14 months and a decision by the first appeal instance (EDÖB). So let us take a look at these data, which are now publicly accessible for the first time (also available as XLSX). As I had requested, the ETH Library broke the expenditure down into databases, e-books and journals.
    Content
     See also: https://wisspub.net/2018/06/23/transparenz-bei-subskriptionskosten-in-der-schweiz-bilanz-nach-vier-jahren/.
    Source
    http://wisspub.net/2015/08/29/zahlungen-der-eth-zuerich-an-elsevier-springer-und-wiley-nun-oeffentlich/
  2. Kende, J.: Software as a Service : Herausforderungen bei der Einführung des Bibliothekssystems Alma in der Freien Universität Berlin (2015) 0.04
    0.038560487 = product of:
      0.15424195 = sum of:
        0.03961775 = weight(_text_:software in 2475) [ClassicSimilarity], result of:
          0.03961775 = score(doc=2475,freq=4.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.49589399 = fieldWeight in 2475, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.0625 = fieldNorm(doc=2475)
        0.008743806 = weight(_text_:und in 2475) [ClassicSimilarity], result of:
          0.008743806 = score(doc=2475,freq=2.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.19590102 = fieldWeight in 2475, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=2475)
        0.03961775 = weight(_text_:software in 2475) [ClassicSimilarity], result of:
          0.03961775 = score(doc=2475,freq=4.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.49589399 = fieldWeight in 2475, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.0625 = fieldNorm(doc=2475)
        0.026644897 = weight(_text_:der in 2475) [ClassicSimilarity], result of:
          0.026644897 = score(doc=2475,freq=18.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.59231687 = fieldWeight in 2475, product of:
              4.2426405 = tf(freq=18.0), with freq of:
                18.0 = termFreq=18.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0625 = fieldNorm(doc=2475)
        0.03961775 = weight(_text_:software in 2475) [ClassicSimilarity], result of:
          0.03961775 = score(doc=2475,freq=4.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.49589399 = fieldWeight in 2475, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.0625 = fieldNorm(doc=2475)
      0.25 = coord(5/20)
    
    Abstract
     Modern library systems are increasingly offered as Software as a Service (SaaS). The Berlin libraries of Freie Universität Berlin, Humboldt-Universität, Technische Universität and Universität der Künste have jointly decided to migrate to the cloud-based library system Alma for 2016. The article reports on the challenges encountered during the two years of contract negotiations, with particular attention to data protection.
  3. Jörs, B.: Informationskompetenz oder Information Literacy : Das große Missverständnis und Versäumnis der Bibliotheks- und Informationswissenschaft im Zeitalter der Desinformation. Teil 2: Wie sich "Informationskompetenz" methodisch-operativ untersuchen lässt. Ergänzende Anmerkungen zum "16th International Symposium of Information Science" ("ISI 2021", Regensburg 8. März - 10. März 2021) (2021) 0.04
    0.03704079 = product of:
      0.12346929 = sum of:
        0.017148608 = weight(_text_:23 in 346) [ClassicSimilarity], result of:
          0.017148608 = score(doc=346,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.23759183 = fieldWeight in 346, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.046875 = fieldNorm(doc=346)
        0.017148608 = weight(_text_:23 in 346) [ClassicSimilarity], result of:
          0.017148608 = score(doc=346,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.23759183 = fieldWeight in 346, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.046875 = fieldNorm(doc=346)
        0.018548414 = weight(_text_:und in 346) [ClassicSimilarity], result of:
          0.018548414 = score(doc=346,freq=16.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.41556883 = fieldWeight in 346, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.046875 = fieldNorm(doc=346)
        0.017148608 = weight(_text_:23 in 346) [ClassicSimilarity], result of:
          0.017148608 = score(doc=346,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.23759183 = fieldWeight in 346, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.046875 = fieldNorm(doc=346)
        0.035851102 = weight(_text_:methoden in 346) [ClassicSimilarity], result of:
          0.035851102 = score(doc=346,freq=2.0), product of:
            0.10436003 = queryWeight, product of:
              5.1821747 = idf(docFreq=674, maxDocs=44218)
              0.02013827 = queryNorm
            0.3435329 = fieldWeight in 346, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.1821747 = idf(docFreq=674, maxDocs=44218)
              0.046875 = fieldNorm(doc=346)
        0.017623944 = weight(_text_:der in 346) [ClassicSimilarity], result of:
          0.017623944 = score(doc=346,freq=14.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.3917808 = fieldWeight in 346, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.046875 = fieldNorm(doc=346)
      0.3 = coord(6/20)
    
    Abstract
     Part 1 of this series on information literacy pointed to the failure of library and information science, which claims the key qualification of information and media literacy for solving disinformation problems in its own research and teaching, but does not, or cannot, meet the obligations it thereby implicitly takes on. If, for example, information literacy is to be taught in schools, teachers would have to be given concrete instructions and methods to make the degree of (personal) information literacy operationally measurable, to test it, and above all to specify what the allegedly omnipotent concept of information literacy is supposed to mean. This is not delivered.
    Date
    20. 8.2021 19:23:12
    Series
    Zukunft der Bibliotheks- und Informationswissenschaft
  4. Lezius, W.: Morphy - Morphologie und Tagging für das Deutsche (2013) 0.04
    0.036485195 = product of:
      0.12161731 = sum of:
        0.028013978 = weight(_text_:software in 1490) [ClassicSimilarity], result of:
          0.028013978 = score(doc=1490,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.35064998 = fieldWeight in 1490, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.0625 = fieldNorm(doc=1490)
        0.021417862 = weight(_text_:und in 1490) [ClassicSimilarity], result of:
          0.021417862 = score(doc=1490,freq=12.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.47985753 = fieldWeight in 1490, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=1490)
        0.028013978 = weight(_text_:software in 1490) [ClassicSimilarity], result of:
          0.028013978 = score(doc=1490,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.35064998 = fieldWeight in 1490, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.0625 = fieldNorm(doc=1490)
        0.008881632 = weight(_text_:der in 1490) [ClassicSimilarity], result of:
          0.008881632 = score(doc=1490,freq=2.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.19743896 = fieldWeight in 1490, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0625 = fieldNorm(doc=1490)
        0.028013978 = weight(_text_:software in 1490) [ClassicSimilarity], result of:
          0.028013978 = score(doc=1490,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.35064998 = fieldWeight in 1490, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.0625 = fieldNorm(doc=1490)
        0.007275887 = product of:
          0.02182766 = sum of:
            0.02182766 = weight(_text_:22 in 1490) [ClassicSimilarity], result of:
              0.02182766 = score(doc=1490,freq=2.0), product of:
                0.07052079 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.02013827 = queryNorm
                0.30952093 = fieldWeight in 1490, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0625 = fieldNorm(doc=1490)
          0.33333334 = coord(1/3)
      0.3 = coord(6/20)
    
    Abstract
     Morphy is a freely available software package for morphological analysis and synthesis and for context-sensitive part-of-speech tagging of German. Use of the software is not subject to any restrictions. Since development has been discontinued, you use Morphy as is, i.e. at your own risk, without any liability or warranty and, above all, without support. Morphy is available only for the Windows platform and runs only on standalone PCs.
    Date
    22. 3.2015 9:30:24
  5. Lehmann, M.: Neue Tools für die Online-Recherche : EU-Projekt EEXCESS veröffentlicht vier neue Prototypen (2015) 0.04
    0.035705876 = product of:
      0.11901958 = sum of:
        0.024260817 = weight(_text_:software in 2324) [ClassicSimilarity], result of:
          0.024260817 = score(doc=2324,freq=6.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.3036718 = fieldWeight in 2324, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.03125 = fieldNorm(doc=2324)
        0.009775872 = weight(_text_:und in 2324) [ClassicSimilarity], result of:
          0.009775872 = score(doc=2324,freq=10.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.219024 = fieldWeight in 2324, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.03125 = fieldNorm(doc=2324)
        0.024260817 = weight(_text_:software in 2324) [ClassicSimilarity], result of:
          0.024260817 = score(doc=2324,freq=6.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.3036718 = fieldWeight in 2324, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.03125 = fieldNorm(doc=2324)
        0.023900734 = weight(_text_:methoden in 2324) [ClassicSimilarity], result of:
          0.023900734 = score(doc=2324,freq=2.0), product of:
            0.10436003 = queryWeight, product of:
              5.1821747 = idf(docFreq=674, maxDocs=44218)
              0.02013827 = queryNorm
            0.22902192 = fieldWeight in 2324, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              5.1821747 = idf(docFreq=674, maxDocs=44218)
              0.03125 = fieldNorm(doc=2324)
        0.012560525 = weight(_text_:der in 2324) [ClassicSimilarity], result of:
          0.012560525 = score(doc=2324,freq=16.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.27922085 = fieldWeight in 2324, product of:
              4.0 = tf(freq=16.0), with freq of:
                16.0 = termFreq=16.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.03125 = fieldNorm(doc=2324)
        0.024260817 = weight(_text_:software in 2324) [ClassicSimilarity], result of:
          0.024260817 = score(doc=2324,freq=6.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.3036718 = fieldWeight in 2324, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.03125 = fieldNorm(doc=2324)
      0.3 = coord(6/20)
    
    Abstract
     Announcement to Inetbib, 17 Nov 2015: "The EEXCESS (http://www.eexcess.eu/) project partners, among them the ZBW - Leibniz Informationszentrum Wirtschaft, present four prototypes for personalised data and literature search. The aim of the new software is to give users, in their scholarly work, access to exactly those special resources that match their profile and their current search behaviour. Novel recommendation technologies were developed for this purpose. The EU project EEXCESS has been working on new methods of information provision on the internet since the beginning of 2013. Prof. Dr. Klaus Tochtermann, director of the ZBW: "I am pleased that after three years of research we can make these new search tools available. With this we are ushering in a fundamental paradigm shift: bring the content to the users, not the users to the content!" Via a user interface that can be integrated into the Google Chrome browser, the Google Docs editor, the blog software WordPress or the learning management system Moodle, users automatically receive recommendations tailored to them personally, drawn from cultural-heritage databases such as www.europeana.eu as well as from scholarly literature databases such as www.mendeley.com (scholarly articles) or www.econbiz.de (economics literature). If needed, they can then embed these suggestions directly as references in their texts. The number of databases connected to the software is being extended continuously. After a first beta phase last year, new versions of the prototypes are now available for free download; the links can be found at www.eexcess.eu. A total of ten international project partners are involved in the research project, among them the ZBW - Leibniz Informationszentrum Wirtschaft (http://www.zbw.eu/), where the plug-in for the blog software WordPress was largely co-developed. With it, bloggers can search for suitable references or images while still writing their post. In addition, with its large online user base the ZBW can offer a very good test environment for the various tools. Individual visualisation elements developed in the project will also be used in the ZBW search portal EconBiz in the future and can be tested there. Beyond that, EconBiz content is also embedded in arbitrary other environments via the EEXCESS modules. The other project partners come from Austria, the United Kingdom, France, Switzerland and Germany. The scientific lead lies with Professor Michael Granitzer of the University of Passau; overall project management lies with the research institute Joanneum Research in Graz. The total project volume is about 6.9 million euros, and the project runs until July 2016."
    Theme
    Bibliographische Software
  6. Bittner, T.; Donnelly, M.; Winter, S.: Ontology and semantic interoperability (2006) 0.04
    0.035469428 = product of:
      0.14187771 = sum of:
        0.036391225 = weight(_text_:software in 4820) [ClassicSimilarity], result of:
          0.036391225 = score(doc=4820,freq=6.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.4555077 = fieldWeight in 4820, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=4820)
        0.036391225 = weight(_text_:software in 4820) [ClassicSimilarity], result of:
          0.036391225 = score(doc=4820,freq=6.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.4555077 = fieldWeight in 4820, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=4820)
        0.036391225 = weight(_text_:software in 4820) [ClassicSimilarity], result of:
          0.036391225 = score(doc=4820,freq=6.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.4555077 = fieldWeight in 4820, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=4820)
        0.027247135 = product of:
          0.05449427 = sum of:
            0.05449427 = weight(_text_:engineering in 4820) [ClassicSimilarity], result of:
              0.05449427 = score(doc=4820,freq=4.0), product of:
                0.10819342 = queryWeight, product of:
                  5.372528 = idf(docFreq=557, maxDocs=44218)
                  0.02013827 = queryNorm
                0.5036745 = fieldWeight in 4820, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.372528 = idf(docFreq=557, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4820)
          0.5 = coord(1/2)
        0.005456915 = product of:
          0.016370745 = sum of:
            0.016370745 = weight(_text_:22 in 4820) [ClassicSimilarity], result of:
              0.016370745 = score(doc=4820,freq=2.0), product of:
                0.07052079 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.02013827 = queryNorm
                0.23214069 = fieldWeight in 4820, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4820)
          0.33333334 = coord(1/3)
      0.25 = coord(5/20)
    
    Abstract
     One of the major problems facing systems for Computer Aided Design (CAD), Architecture Engineering and Construction (AEC) and Geographic Information Systems (GIS) applications today is the lack of interoperability among the various systems. When integrating software applications, substantial difficulties can arise in translating information from one application to the other. In this paper, we focus on semantic difficulties that arise in software integration. Applications may use different terminologies to describe the same domain. Even when applications use the same terminology, they often associate different semantics with the terms. This obstructs information exchange among applications. To circumvent this obstacle, we need some way of explicitly specifying the semantics for each terminology in an unambiguous fashion. Ontologies can provide such specification. It will be the task of this paper to explain what ontologies are and how they can be used to facilitate interoperability between software systems used in computer aided design, architecture engineering and construction, and geographic information processing.
    Date
    3.12.2016 18:39:22
  7. Klußmann, N.: Lexikon der Kommunikations- und Informationstechnik : Telekommunikation, Datenkommunikation, Multimedia, Computer (1999) 0.04
    0.035070214 = product of:
      0.14028086 = sum of:
        0.034297217 = weight(_text_:23 in 7096) [ClassicSimilarity], result of:
          0.034297217 = score(doc=7096,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 7096, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=7096)
        0.034297217 = weight(_text_:23 in 7096) [ClassicSimilarity], result of:
          0.034297217 = score(doc=7096,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 7096, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=7096)
        0.018548414 = weight(_text_:und in 7096) [ClassicSimilarity], result of:
          0.018548414 = score(doc=7096,freq=4.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.41556883 = fieldWeight in 7096, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.09375 = fieldNorm(doc=7096)
        0.034297217 = weight(_text_:23 in 7096) [ClassicSimilarity], result of:
          0.034297217 = score(doc=7096,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 7096, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=7096)
        0.018840788 = weight(_text_:der in 7096) [ClassicSimilarity], result of:
          0.018840788 = score(doc=7096,freq=4.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.4188313 = fieldWeight in 7096, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.09375 = fieldNorm(doc=7096)
      0.25 = coord(5/20)
    
    Date
    17. 7.2002 14:59:23
    Issue
     CD-ROM edition of the 2nd, expanded and updated edition
  8. Si, L.E.; O'Brien, A.; Probets, S.: Integration of distributed terminology resources to facilitate subject cross-browsing for library portal systems (2009) 0.03
    0.034980807 = product of:
      0.099945165 = sum of:
        0.014290508 = weight(_text_:23 in 3628) [ClassicSimilarity], result of:
          0.014290508 = score(doc=3628,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.1979932 = fieldWeight in 3628, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3628)
        0.014290508 = weight(_text_:23 in 3628) [ClassicSimilarity], result of:
          0.014290508 = score(doc=3628,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.1979932 = fieldWeight in 3628, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3628)
        0.017508736 = weight(_text_:software in 3628) [ClassicSimilarity], result of:
          0.017508736 = score(doc=3628,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.21915624 = fieldWeight in 3628, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3628)
        0.014290508 = weight(_text_:23 in 3628) [ClassicSimilarity], result of:
          0.014290508 = score(doc=3628,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.1979932 = fieldWeight in 3628, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3628)
        0.017508736 = weight(_text_:software in 3628) [ClassicSimilarity], result of:
          0.017508736 = score(doc=3628,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.21915624 = fieldWeight in 3628, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3628)
        0.017508736 = weight(_text_:software in 3628) [ClassicSimilarity], result of:
          0.017508736 = score(doc=3628,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.21915624 = fieldWeight in 3628, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.0390625 = fieldNorm(doc=3628)
        0.0045474293 = product of:
          0.013642288 = sum of:
            0.013642288 = weight(_text_:22 in 3628) [ClassicSimilarity], result of:
              0.013642288 = score(doc=3628,freq=2.0), product of:
                0.07052079 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.02013827 = queryNorm
                0.19345059 = fieldWeight in 3628, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3628)
          0.33333334 = coord(1/3)
      0.35 = coord(7/20)
    
    Abstract
     Purpose: To develop a prototype middleware framework between different terminology resources in order to provide a subject cross-browsing service for library portal systems. Design/methodology/approach: Nine terminology experts were interviewed to collect appropriate knowledge to support the development of a theoretical framework for the research. Based on this, a simplified software-based prototype system was constructed incorporating the knowledge acquired. The prototype involved mappings between the computer science schedule of the Dewey Decimal Classification (which acted as a spine) and two controlled vocabularies, UKAT and the ACM Computing Classification. Subsequently, six further experts in the field were invited to evaluate the prototype system and provide feedback to improve the framework. Findings: The major findings showed that, given the large variety of terminology resources distributed on the web, the proposed middleware service is essential to integrate the different terminology resources technically and semantically in order to facilitate subject cross-browsing. A set of recommendations is also made outlining the important approaches and features that support such a cross-browsing middleware service.
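     To make the spine idea concrete, here is a small, hedged Python sketch: classes of the spine vocabulary map to terms from two other vocabularies, and a term from one vocabulary is cross-browsed to the other via the shared spine class. The notations, terms and the SPINE_MAPPINGS / cross_browse names are invented for illustration; they are not the project's actual mappings or code.

```python
# Hypothetical spine-based cross-browsing: spine notations (here DDC-style
# strings) each map to terms from two other vocabularies. All data invented.
SPINE_MAPPINGS = {
    "006.3": {"UKAT": ["Artificial intelligence"], "ACM": ["I.2 Artificial Intelligence"]},
    "005.1": {"UKAT": ["Computer programming"], "ACM": ["D.1 Programming Techniques"]},
}

def cross_browse(term: str, source: str, target: str) -> list[str]:
    """Return target-vocabulary terms that share a spine class with `term`."""
    hits: list[str] = []
    for vocabularies in SPINE_MAPPINGS.values():
        if term in vocabularies.get(source, []):
            hits.extend(vocabularies.get(target, []))
    return hits

print(cross_browse("Artificial intelligence", "UKAT", "ACM"))
# ['I.2 Artificial Intelligence']
```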
    Content
    This paper is a pre-print version presented at the ISKO UK 2009 conference, 22-23 June, prior to peer review and editing. For published proceedings see special issue of Aslib Proceedings journal.
  9. Deutsch Korrekt : Das Prüfprogramm für Texte (1996) 0.03
    0.034489855 = product of:
      0.13795942 = sum of:
        0.034297217 = weight(_text_:23 in 5968) [ClassicSimilarity], result of:
          0.034297217 = score(doc=5968,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 5968, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=5968)
        0.034297217 = weight(_text_:23 in 5968) [ClassicSimilarity], result of:
          0.034297217 = score(doc=5968,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 5968, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=5968)
        0.018548414 = weight(_text_:und in 5968) [ClassicSimilarity], result of:
          0.018548414 = score(doc=5968,freq=4.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.41556883 = fieldWeight in 5968, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.09375 = fieldNorm(doc=5968)
        0.034297217 = weight(_text_:23 in 5968) [ClassicSimilarity], result of:
          0.034297217 = score(doc=5968,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 5968, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=5968)
        0.016519347 = product of:
          0.033038694 = sum of:
            0.033038694 = weight(_text_:29 in 5968) [ClassicSimilarity], result of:
              0.033038694 = score(doc=5968,freq=2.0), product of:
                0.070840135 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.02013827 = queryNorm
                0.46638384 = fieldWeight in 5968, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.09375 = fieldNorm(doc=5968)
          0.5 = coord(1/2)
      0.25 = coord(5/20)
    
    Abstract
     Automatic checking program for German orthography; checks spelling according to the new rules, including correct hyphenation, compound words and word derivations.
    Date
    21.12.1996 10:23:29
    Issue
     For Windows 3.x and Windows 95
  10. Korthof, G.: Information Content, Compressibility and Meaning : Published: 18 June 2000. Updated 31 May 2006. Postscript 20 Oct 2009. (2000) 0.03
    0.034343183 = product of:
      0.11447728 = sum of:
        0.017148608 = weight(_text_:23 in 4245) [ClassicSimilarity], result of:
          0.017148608 = score(doc=4245,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.23759183 = fieldWeight in 4245, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.046875 = fieldNorm(doc=4245)
        0.017148608 = weight(_text_:23 in 4245) [ClassicSimilarity], result of:
          0.017148608 = score(doc=4245,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.23759183 = fieldWeight in 4245, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.046875 = fieldNorm(doc=4245)
        0.021010485 = weight(_text_:software in 4245) [ClassicSimilarity], result of:
          0.021010485 = score(doc=4245,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.2629875 = fieldWeight in 4245, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=4245)
        0.017148608 = weight(_text_:23 in 4245) [ClassicSimilarity], result of:
          0.017148608 = score(doc=4245,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.23759183 = fieldWeight in 4245, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.046875 = fieldNorm(doc=4245)
        0.021010485 = weight(_text_:software in 4245) [ClassicSimilarity], result of:
          0.021010485 = score(doc=4245,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.2629875 = fieldWeight in 4245, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=4245)
        0.021010485 = weight(_text_:software in 4245) [ClassicSimilarity], result of:
          0.021010485 = score(doc=4245,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.2629875 = fieldWeight in 4245, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=4245)
      0.3 = coord(6/20)
    
    Abstract
     In New Scientist, issue of 18 Sept 1999, "Life force", pp. 27-30, Paul Davies writes: "an apparently random sequence such as 110101001010010111... cannot be condensed into a simple set of instructions, so it has a high information content." (p. 29). This notion of 'information content' leads to paradoxes. Consider random number generator software. Let it generate 100 and then 1000 random numbers. According to the above definition, the second sequence of numbers has an information content ten times higher than the first, because its description would be ten times longer. However, both are generated by the same simple set of instructions, so they should have exactly the same 'information content'. There is the paradox. It seems clear that this measure of 'information content' misses the point: it measures the compressibility of a sequence, not its information content. One needs the meaning of a sequence to capture information content.
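     A small illustration of the point, using only Python's standard library; the random_bits helper and the fixed seed are assumptions of this demo, not part of the article. Two sequences produced by the same short generator differ tenfold in length, and a generic compressor accordingly treats the longer one as carrying roughly ten times more 'information content', although both are fully determined by the same simple set of instructions.

```python
import random
import zlib

def random_bits(n_bytes: int, seed: int = 42) -> bytes:
    """Pseudo-random bytes produced by a short, fixed 'set of instructions'."""
    rng = random.Random(seed)
    return bytes(rng.getrandbits(8) for _ in range(n_bytes))

short = random_bits(100)    # "100 random numbers"
long_ = random_bits(1000)   # "1000 random numbers"

# Compressed size grows with sequence length, although both sequences are
# generated by the same program and seed: compressibility measures redundancy,
# not the length of the generating instructions.
print(len(zlib.compress(short)), len(zlib.compress(long_)))
```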
    Date
    23. 1.2011 18:12:04
  11. Giunchiglia, F.; Zaihrayeu, I.; Farazi, F.: Converting classifications into OWL ontologies (2009) 0.03
    0.034343183 = product of:
      0.11447728 = sum of:
        0.017148608 = weight(_text_:23 in 4690) [ClassicSimilarity], result of:
          0.017148608 = score(doc=4690,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.23759183 = fieldWeight in 4690, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.046875 = fieldNorm(doc=4690)
        0.017148608 = weight(_text_:23 in 4690) [ClassicSimilarity], result of:
          0.017148608 = score(doc=4690,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.23759183 = fieldWeight in 4690, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.046875 = fieldNorm(doc=4690)
        0.021010485 = weight(_text_:software in 4690) [ClassicSimilarity], result of:
          0.021010485 = score(doc=4690,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.2629875 = fieldWeight in 4690, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=4690)
        0.017148608 = weight(_text_:23 in 4690) [ClassicSimilarity], result of:
          0.017148608 = score(doc=4690,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.23759183 = fieldWeight in 4690, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.046875 = fieldNorm(doc=4690)
        0.021010485 = weight(_text_:software in 4690) [ClassicSimilarity], result of:
          0.021010485 = score(doc=4690,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.2629875 = fieldWeight in 4690, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=4690)
        0.021010485 = weight(_text_:software in 4690) [ClassicSimilarity], result of:
          0.021010485 = score(doc=4690,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.2629875 = fieldWeight in 4690, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=4690)
      0.3 = coord(6/20)
    
    Abstract
     Classification schemes, such as the DMoZ web directory, provide a convenient and intuitive way for humans to access classified contents. While easy for humans to deal with, classification schemes remain hard for automated software agents to reason about. Among other things, this hardness is conditioned by the ambiguous nature of the natural language used to describe classification categories. In this paper we describe how classification schemes can be converted into OWL ontologies, thus enabling reasoning on them by Semantic Web applications. The proposed solution is based on a two-phase approach in which category names are first encoded in a concept language and then, together with the structure of the classification scheme, are converted into an OWL ontology. We demonstrate the practical applicability of our approach by showing how the results of reasoning on these OWL ontologies can help improve the organization and use of web directories.
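     As a rough sketch of the conversion step described above (assuming the rdflib library; the namespace, category path and the plain rdfs:subClassOf encoding are illustrative simplifications, not the paper's concept-language encoding), a toy category path is turned into OWL classes:

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS
from rdflib.namespace import OWL

EX = Namespace("http://example.org/classification#")   # invented namespace
path = ["Computers", "Programming", "Languages"]        # DMoZ-style category path

g = Graph()
g.bind("ex", EX)
g.bind("owl", OWL)

parent = None
for label in path:
    cls = EX[label]
    g.add((cls, RDF.type, OWL.Class))          # each category becomes a class
    g.add((cls, RDFS.label, Literal(label)))
    if parent is not None:
        g.add((cls, RDFS.subClassOf, parent))  # child category under its parent
    parent = cls

print(g.serialize(format="turtle"))
```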
    Date
    23. 8.2011 19:43:39
  12. Gore, E.; Bitta, M.D.; Cohen, D.: ¬The Digital Public Library of America and the National Digital Platform (2017) 0.03
    0.034343183 = product of:
      0.11447728 = sum of:
        0.017148608 = weight(_text_:23 in 3655) [ClassicSimilarity], result of:
          0.017148608 = score(doc=3655,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.23759183 = fieldWeight in 3655, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.046875 = fieldNorm(doc=3655)
        0.017148608 = weight(_text_:23 in 3655) [ClassicSimilarity], result of:
          0.017148608 = score(doc=3655,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.23759183 = fieldWeight in 3655, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.046875 = fieldNorm(doc=3655)
        0.021010485 = weight(_text_:software in 3655) [ClassicSimilarity], result of:
          0.021010485 = score(doc=3655,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.2629875 = fieldWeight in 3655, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=3655)
        0.017148608 = weight(_text_:23 in 3655) [ClassicSimilarity], result of:
          0.017148608 = score(doc=3655,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.23759183 = fieldWeight in 3655, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.046875 = fieldNorm(doc=3655)
        0.021010485 = weight(_text_:software in 3655) [ClassicSimilarity], result of:
          0.021010485 = score(doc=3655,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.2629875 = fieldWeight in 3655, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=3655)
        0.021010485 = weight(_text_:software in 3655) [ClassicSimilarity], result of:
          0.021010485 = score(doc=3655,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.2629875 = fieldWeight in 3655, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.046875 = fieldNorm(doc=3655)
      0.3 = coord(6/20)
    
    Abstract
    The Digital Public Library of America brings together the riches of America's libraries, archives, and museums, and makes them freely available to the world. In order to do this, DPLA has had to build elements of the national digital platform to connect to those institutions and to serve their digitized materials to audiences. In this article, we detail the construction of two critical elements of our work: the decentralized national network of "hubs," which operate in states across the country; and a version of the Hydra repository software that is tailored to the needs of our community. This technology and the organizations that make use of it serve as the foundation of the future of DPLA and other projects that seek to take advantage of the national digital platform.
    Source
    D-Lib magazine. 23(2017) nos.5/6, xx S
  13. Knorz, G.; Rein, B.: Semantische Suche in einer Hochschulontologie : Ontologie-basiertes Information-Filtering und -Retrieval mit relationalen Datenbanken (2005) 0.03
    0.034255974 = product of:
      0.11418658 = sum of:
        0.02451223 = weight(_text_:software in 4324) [ClassicSimilarity], result of:
          0.02451223 = score(doc=4324,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.30681872 = fieldWeight in 4324, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4324)
        0.01874063 = weight(_text_:und in 4324) [ClassicSimilarity], result of:
          0.01874063 = score(doc=4324,freq=12.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.41987535 = fieldWeight in 4324, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4324)
        0.02451223 = weight(_text_:software in 4324) [ClassicSimilarity], result of:
          0.02451223 = score(doc=4324,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.30681872 = fieldWeight in 4324, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4324)
        0.015542857 = weight(_text_:der in 4324) [ClassicSimilarity], result of:
          0.015542857 = score(doc=4324,freq=8.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.34551817 = fieldWeight in 4324, product of:
              2.828427 = tf(freq=8.0), with freq of:
                8.0 = termFreq=8.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4324)
        0.02451223 = weight(_text_:software in 4324) [ClassicSimilarity], result of:
          0.02451223 = score(doc=4324,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.30681872 = fieldWeight in 4324, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.0546875 = fieldNorm(doc=4324)
        0.006366401 = product of:
          0.019099202 = sum of:
            0.019099202 = weight(_text_:22 in 4324) [ClassicSimilarity], result of:
              0.019099202 = score(doc=4324,freq=2.0), product of:
                0.07052079 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.02013827 = queryNorm
                0.2708308 = fieldWeight in 4324, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=4324)
          0.33333334 = coord(1/3)
      0.3 = coord(6/20)
    
    Abstract
     Ontologies are used to provide, through semantic grounding, a fundamentally better basis for document retrieval in particular than the current state of the art offers. The paper presents an ontology developed and used at the FH Darmstadt that is intended to cover the subject area of higher education broadly while at the same time describing it semantically in a differentiated way. The problem of semantic search is that it should be as easy for information seekers to use as common search engines, while at the same time delivering high-quality results on the basis of the elaborate information model. The paper describes which capabilities the software K-Infinity provides and with which concept these capabilities are used for a semantic search for documents and other information units (persons, events, projects, etc.).
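     A minimal, hypothetical sketch of how ontology-based retrieval can sit on top of a relational database, in the spirit described above: a broader/narrower table and a document index, queried with a recursive SQL CTE so that a search for a concept also returns documents indexed under its narrower concepts. The table layout, concept names and data are invented and are not the K-Infinity data model.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE narrower(broader TEXT, narrower TEXT);
CREATE TABLE indexed(doc TEXT, concept TEXT);
INSERT INTO narrower VALUES ('Hochschule', 'Fachbereich'), ('Fachbereich', 'Studiengang');
INSERT INTO indexed VALUES ('doc1', 'Hochschule'), ('doc2', 'Studiengang');
""")

# Recursive CTE: expand the query concept to itself plus all narrower concepts,
# then join against the document index.
rows = con.execute("""
WITH RECURSIVE expanded(concept) AS (
    SELECT ?
    UNION
    SELECT n.narrower FROM narrower n JOIN expanded e ON n.broader = e.concept
)
SELECT DISTINCT i.doc FROM indexed i JOIN expanded e ON i.concept = e.concept
""", ("Hochschule",)).fetchall()

print(rows)   # doc1 (direct hit) and doc2 (via narrower concepts)
```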
    Date
    11. 2.2011 18:22:25
  14. Dietz, K.: en.wikipedia.org > 6 Mio. Artikel (2020) 0.03
    0.03420702 = product of:
      0.09773435 = sum of:
        0.026654093 = product of:
          0.079962276 = sum of:
            0.079962276 = weight(_text_:3a in 5669) [ClassicSimilarity], result of:
              0.079962276 = score(doc=5669,freq=2.0), product of:
                0.17073247 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.02013827 = queryNorm
                0.46834838 = fieldWeight in 5669, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5669)
          0.33333334 = coord(1/3)
        0.014290508 = weight(_text_:23 in 5669) [ClassicSimilarity], result of:
          0.014290508 = score(doc=5669,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.1979932 = fieldWeight in 5669, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5669)
        0.014290508 = weight(_text_:23 in 5669) [ClassicSimilarity], result of:
          0.014290508 = score(doc=5669,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.1979932 = fieldWeight in 5669, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5669)
        0.0077285054 = weight(_text_:und in 5669) [ClassicSimilarity], result of:
          0.0077285054 = score(doc=5669,freq=4.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.17315367 = fieldWeight in 5669, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5669)
        0.014290508 = weight(_text_:23 in 5669) [ClassicSimilarity], result of:
          0.014290508 = score(doc=5669,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.1979932 = fieldWeight in 5669, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5669)
        0.013597167 = weight(_text_:der in 5669) [ClassicSimilarity], result of:
          0.013597167 = score(doc=5669,freq=12.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.30226544 = fieldWeight in 5669, product of:
              3.4641016 = tf(freq=12.0), with freq of:
                12.0 = termFreq=12.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5669)
        0.0068830615 = product of:
          0.013766123 = sum of:
            0.013766123 = weight(_text_:29 in 5669) [ClassicSimilarity], result of:
              0.013766123 = score(doc=5669,freq=2.0), product of:
                0.070840135 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.02013827 = queryNorm
                0.19432661 = fieldWeight in 5669, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5669)
          0.5 = coord(1/2)
      0.35 = coord(7/20)
    
    Content
    "Die Englischsprachige Wikipedia verfügt jetzt über mehr als 6 Millionen Artikel. An zweiter Stelle kommt die deutschsprachige Wikipedia mit 2.3 Millionen Artikeln, an dritter Stelle steht die französischsprachige Wikipedia mit 2.1 Millionen Artikeln (via Researchbuzz: Firehose <https://rbfirehose.com/2020/01/24/techcrunch-wikipedia-now-has-more-than-6-million-articles-in-english/> und Techcrunch <https://techcrunch.com/2020/01/23/wikipedia-english-six-million-articles/?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+Techcrunch+%28TechCrunch%29&guccounter=1&guce_referrer=aHR0cHM6Ly9yYmZpcmVob3NlLmNvbS8yMDIwLzAxLzI0L3RlY2hjcnVuY2gtd2lraXBlZGlhLW5vdy1oYXMtbW9yZS10aGFuLTYtbWlsbGlvbi1hcnRpY2xlcy1pbi1lbmdsaXNoLw&guce_referrer_sig=AQAAAK0zHfjdDZ_spFZBF_z-zDjtL5iWvuKDumFTzm4HvQzkUfE2pLXQzGS6FGB_y-VISdMEsUSvkNsg2U_NWQ4lwWSvOo3jvXo1I3GtgHpP8exukVxYAnn5mJspqX50VHIWFADHhs5AerkRn3hMRtf_R3F1qmEbo8EROZXp328HMC-o>). 250120 via digithek ch = #fineBlog s.a.: Angesichts der Veröffentlichung des 6-millionsten Artikels vergangene Woche in der englischsprachigen Wikipedia hat die Community-Zeitungsseite "Wikipedia Signpost" ein Moratorium bei der Veröffentlichung von Unternehmensartikeln gefordert. Das sei kein Vorwurf gegen die Wikimedia Foundation, aber die derzeitigen Maßnahmen, um die Enzyklopädie gegen missbräuchliches undeklariertes Paid Editing zu schützen, funktionierten ganz klar nicht. *"Da die ehrenamtlichen Autoren derzeit von Werbung in Gestalt von Wikipedia-Artikeln überwältigt werden, und da die WMF nicht in der Lage zu sein scheint, dem irgendetwas entgegenzusetzen, wäre der einzige gangbare Weg für die Autoren, fürs erste die Neuanlage von Artikeln über Unternehmen zu untersagen"*, schreibt der Benutzer Smallbones in seinem Editorial <https://en.wikipedia.org/wiki/Wikipedia:Wikipedia_Signpost/2020-01-27/From_the_editor> zur heutigen Ausgabe."
  15. Retti, G.: "Schlagwortnormdatei" und "Regeln für den Schlagwortkatalog" (1995) 0.03
    0.03387031 = product of:
      0.112901025 = sum of:
        0.02000671 = weight(_text_:23 in 1354) [ClassicSimilarity], result of:
          0.02000671 = score(doc=1354,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.27719048 = fieldWeight in 1354, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1354)
        0.02000671 = weight(_text_:23 in 1354) [ClassicSimilarity], result of:
          0.02000671 = score(doc=1354,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.27719048 = fieldWeight in 1354, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1354)
        0.010819908 = weight(_text_:und in 1354) [ClassicSimilarity], result of:
          0.010819908 = score(doc=1354,freq=4.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.24241515 = fieldWeight in 1354, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1354)
        0.02000671 = weight(_text_:23 in 1354) [ClassicSimilarity], result of:
          0.02000671 = score(doc=1354,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.27719048 = fieldWeight in 1354, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1354)
        0.021499714 = product of:
          0.042999428 = sum of:
            0.042999428 = weight(_text_:allgemein in 1354) [ClassicSimilarity], result of:
              0.042999428 = score(doc=1354,freq=2.0), product of:
                0.10581345 = queryWeight, product of:
                  5.254347 = idf(docFreq=627, maxDocs=44218)
                  0.02013827 = queryNorm
                0.40637016 = fieldWeight in 1354, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.254347 = idf(docFreq=627, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=1354)
          0.5 = coord(1/2)
        0.020561269 = weight(_text_:der in 1354) [ClassicSimilarity], result of:
          0.020561269 = score(doc=1354,freq=14.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.4570776 = fieldWeight in 1354, product of:
              3.7416575 = tf(freq=14.0), with freq of:
                14.0 = termFreq=14.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0546875 = fieldNorm(doc=1354)
      0.3 = coord(6/20)
    
    Abstract
    It seems obvious that the uncertainty triggered in the indexer by the "variety of possible linguistic expressions for one and the same subject" should be countered by "always choosing only one of the possible forms". The call for a "standardization of subject headings" goes hand in hand with this. Two results of these standardization efforts are presented below; one point, however, should not be overlooked: "A generally accepted procedure for content analysis does not yet exist, and it remains open whether the problem can be solved at all." Accordingly, only the subject heading system can be standardized, not its concrete application in the subject indexing of documents.
    Date
    23. 8.2014 10:48:30
  16. Pumuckl musiziert (1996) 0.03
    0.03374264 = product of:
      0.1687132 = sum of:
        0.04902446 = weight(_text_:software in 5477) [ClassicSimilarity], result of:
          0.04902446 = score(doc=5477,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.61363745 = fieldWeight in 5477, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.109375 = fieldNorm(doc=5477)
        0.021639816 = weight(_text_:und in 5477) [ClassicSimilarity], result of:
          0.021639816 = score(doc=5477,freq=4.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.4848303 = fieldWeight in 5477, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.109375 = fieldNorm(doc=5477)
        0.04902446 = weight(_text_:software in 5477) [ClassicSimilarity], result of:
          0.04902446 = score(doc=5477,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.61363745 = fieldWeight in 5477, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.109375 = fieldNorm(doc=5477)
        0.04902446 = weight(_text_:software in 5477) [ClassicSimilarity], result of:
          0.04902446 = score(doc=5477,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.61363745 = fieldWeight in 5477, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.109375 = fieldNorm(doc=5477)
      0.2 = coord(4/20)
    
    Abstract
    Pumuckl musiziert is a great program that playfully introduces children to reading music and to composing by way of well-known and popular children's songs.
    Imprint
    ? : ESCAL Software
  17. Schubert, C.; Kinkeldey, C.; Reich, H.: Handbuch Datenbankanwendung zur Wissensrepräsentation im Verbundprojekt DeCOVER (2006) 0.03
    0.033549864 = product of:
      0.11183288 = sum of:
        0.022864813 = weight(_text_:23 in 4256) [ClassicSimilarity], result of:
          0.022864813 = score(doc=4256,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.31678912 = fieldWeight in 4256, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0625 = fieldNorm(doc=4256)
        0.022864813 = weight(_text_:23 in 4256) [ClassicSimilarity], result of:
          0.022864813 = score(doc=4256,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.31678912 = fieldWeight in 4256, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0625 = fieldNorm(doc=4256)
        0.0123656085 = weight(_text_:und in 4256) [ClassicSimilarity], result of:
          0.0123656085 = score(doc=4256,freq=4.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.27704588 = fieldWeight in 4256, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.0625 = fieldNorm(doc=4256)
        0.022864813 = weight(_text_:23 in 4256) [ClassicSimilarity], result of:
          0.022864813 = score(doc=4256,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.31678912 = fieldWeight in 4256, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.0625 = fieldNorm(doc=4256)
        0.019859934 = weight(_text_:der in 4256) [ClassicSimilarity], result of:
          0.019859934 = score(doc=4256,freq=10.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.44148692 = fieldWeight in 4256, product of:
              3.1622777 = tf(freq=10.0), with freq of:
                10.0 = termFreq=10.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.0625 = fieldNorm(doc=4256)
        0.011012898 = product of:
          0.022025796 = sum of:
            0.022025796 = weight(_text_:29 in 4256) [ClassicSimilarity], result of:
              0.022025796 = score(doc=4256,freq=2.0), product of:
                0.070840135 = queryWeight, product of:
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.02013827 = queryNorm
                0.31092256 = fieldWeight in 4256, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5176873 = idf(docFreq=3565, maxDocs=44218)
                  0.0625 = fieldNorm(doc=4256)
          0.5 = coord(1/2)
      0.3 = coord(6/20)
    
    Abstract
    The database-based object class description serves the property-based recording of all object classes of the catalogues BNTK, CLC, GMES M 2.1 and ATKIS as well as of the DeCOVER proposal. The aim of the database application is the 'manual' evaluation and presentation of the relationships among all object classes with respect to the knowledge representation that has been built. On the basis of a hierarchically structured knowledge representation, ontologies allow mappings between object classes to be realized, which is the objective pursued in the joint project DeCOVER with regard to semantic interoperability.
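
    A minimal sketch of the ontology-mediated mapping between object classes that the abstract describes: classes from two catalogues are linked to shared concepts, and a correspondence is proposed wherever the concept sets overlap. The class names, concept labels and matching rule are illustrative assumptions, not the actual DeCOVER knowledge representation.

      # Illustrative only: two catalogues whose classes point at shared ontology concepts.
      ATKIS_CLASSES = {"Wald": {"wald", "vegetation"}, "Strasse": {"verkehr"}}
      CLC_CLASSES   = {"311 Laubwald": {"wald"}, "122 Strassen- und Schienennetz": {"verkehr"}}

      def propose_mappings(source, target):
          """Return (source_class, target_class) pairs whose concept sets overlap."""
          return [
              (s, t)
              for s, s_concepts in source.items()
              for t, t_concepts in target.items()
              if s_concepts & t_concepts
          ]

      print(propose_mappings(ATKIS_CLASSES, CLC_CLASSES))
      # [('Wald', '311 Laubwald'), ('Strasse', '122 Strassen- und Schienennetz')]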
    Date
    29. 1.2011 18:45:23
  18. Klarmann, S.: easydb. Flexibles Framework zum Aufbau von Metadaten- und Medien-Repositorien : Anwendungsfall: Forschungsdaten (2020) 0.03
    0.032902867 = product of:
      0.13161147 = sum of:
        0.03501747 = weight(_text_:software in 58) [ClassicSimilarity], result of:
          0.03501747 = score(doc=58,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.43831247 = fieldWeight in 58, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.078125 = fieldNorm(doc=58)
        0.015457011 = weight(_text_:und in 58) [ClassicSimilarity], result of:
          0.015457011 = score(doc=58,freq=4.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.34630734 = fieldWeight in 58, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.078125 = fieldNorm(doc=58)
        0.03501747 = weight(_text_:software in 58) [ClassicSimilarity], result of:
          0.03501747 = score(doc=58,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.43831247 = fieldWeight in 58, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.078125 = fieldNorm(doc=58)
        0.01110204 = weight(_text_:der in 58) [ClassicSimilarity], result of:
          0.01110204 = score(doc=58,freq=2.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.2467987 = fieldWeight in 58, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.078125 = fieldNorm(doc=58)
        0.03501747 = weight(_text_:software in 58) [ClassicSimilarity], result of:
          0.03501747 = score(doc=58,freq=2.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.43831247 = fieldWeight in 58, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.078125 = fieldNorm(doc=58)
      0.25 = coord(5/20)
    
    Abstract
    Presentation slides for the very fine talk given last Thursday by Sebastian Klarmann (easydb software) on the topic "easydb. Flexibles Framework zum Aufbau von Metadaten- und Medien-Repositorien. Anwendungsfall: Forschungsdaten" (cf. the mail from A. Strauch to Inetbib of 15.12.2020).
  19. Pro-Cite 2.0 for the IBM and Biblio-Link to USMARC communications format records (1993) 0.03
    0.032752104 = product of:
      0.21834734 = sum of:
        0.07278245 = weight(_text_:software in 5618) [ClassicSimilarity], result of:
          0.07278245 = score(doc=5618,freq=6.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.9110154 = fieldWeight in 5618, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.09375 = fieldNorm(doc=5618)
        0.07278245 = weight(_text_:software in 5618) [ClassicSimilarity], result of:
          0.07278245 = score(doc=5618,freq=6.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.9110154 = fieldWeight in 5618, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.09375 = fieldNorm(doc=5618)
        0.07278245 = weight(_text_:software in 5618) [ClassicSimilarity], result of:
          0.07278245 = score(doc=5618,freq=6.0), product of:
            0.07989157 = queryWeight, product of:
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.02013827 = queryNorm
            0.9110154 = fieldWeight in 5618, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.9671519 = idf(docFreq=2274, maxDocs=44218)
              0.09375 = fieldNorm(doc=5618)
      0.15 = coord(3/20)
    
    Imprint
    Ann Arbor, MI 48106 : Personal Bibliographic Software, P.O. box 4250
    Issue
    [Software]
    Theme
    Bibliographische Software
  20. Gigerenzer, G.; Jahberg, H.: "Deutschland wird eine Überwachungsgesellschaft" (2019) 0.03
    0.03233245 = product of:
      0.1293298 = sum of:
        0.034297217 = weight(_text_:23 in 4691) [ClassicSimilarity], result of:
          0.034297217 = score(doc=4691,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 4691, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=4691)
        0.034297217 = weight(_text_:23 in 4691) [ClassicSimilarity], result of:
          0.034297217 = score(doc=4691,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 4691, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=4691)
        0.013115709 = weight(_text_:und in 4691) [ClassicSimilarity], result of:
          0.013115709 = score(doc=4691,freq=2.0), product of:
            0.044633795 = queryWeight, product of:
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.02013827 = queryNorm
            0.29385152 = fieldWeight in 4691, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.216367 = idf(docFreq=13101, maxDocs=44218)
              0.09375 = fieldNorm(doc=4691)
        0.034297217 = weight(_text_:23 in 4691) [ClassicSimilarity], result of:
          0.034297217 = score(doc=4691,freq=2.0), product of:
            0.07217676 = queryWeight, product of:
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.02013827 = queryNorm
            0.47518367 = fieldWeight in 4691, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.5840597 = idf(docFreq=3336, maxDocs=44218)
              0.09375 = fieldNorm(doc=4691)
        0.013322448 = weight(_text_:der in 4691) [ClassicSimilarity], result of:
          0.013322448 = score(doc=4691,freq=2.0), product of:
            0.044984195 = queryWeight, product of:
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.02013827 = queryNorm
            0.29615843 = fieldWeight in 4691, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              2.2337668 = idf(docFreq=12875, maxDocs=44218)
              0.09375 = fieldNorm(doc=4691)
      0.25 = coord(5/20)
    
    Abstract
    The psychologist and former government adviser sounds the alarm: voice assistants are spies in the home, and Barbie gives away secrets from the children's room.
    Date
    17. 1.2019 14:23:41

Languages

  • d 880
  • e 446
  • a 12
  • m 3
  • el 2
  • i 1
  • nl 1
  • no 1
  • s 1

Types

  • a 595
  • i 115
  • b 23
  • m 23
  • r 22
  • x 16
  • n 11
  • s 9
  • p 6
  • l 3

Themes