Search (86 results, page 1 of 5)

  • Active filter: type_ss:"a"
  • Active filter: type_ss:"el"
  1. Kuzma, M.: Are you able to find the maps you need? (2019) 0.06
    0.06188125 = product of:
      0.1237625 = sum of:
        0.1237625 = product of:
          0.247525 = sum of:
            0.247525 = weight(_text_:maps in 4569) [ClassicSimilarity], result of:
              0.247525 = score(doc=4569,freq=2.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.86918265 = fieldWeight in 4569, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.109375 = fieldNorm(doc=4569)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
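    The scoring breakdown above is Lucene's ClassicSimilarity (TF-IDF) explain output. As a reading aid, the short Python sketch below reproduces the arithmetic for this first hit from the numbers shown in the tree; the variable names are ours, and only the printed values are used.

      # Reproduces the ClassicSimilarity arithmetic from the explain tree of
      # hit 1 (doc 4569, term "maps"). Values are copied from the output above;
      # variable names are our own.
      tf_raw     = 2.0                    # termFreq
      tf         = tf_raw ** 0.5          # 1.4142135 = sqrt(freq)
      idf        = 5.619245               # idf(docFreq=435, maxDocs=44218)
      query_norm = 0.050679237
      field_norm = 0.109375               # fieldNorm(doc=4569)

      query_weight = idf * query_norm             # queryWeight in the tree above
      field_weight = tf * idf * field_norm        # fieldWeight in the tree above
      term_score   = query_weight * field_weight  # weight(_text_:maps in 4569)
      coord        = 0.5 * 0.5                    # the two coord(1/2) factors

      print(term_score * coord)                   # ~0.06188125, the reported score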
    
  2. Pacek, Z.M.: Cataloguing and presentation tools for old maps and map series (2019) 0.06
    0.06188125 = product of:
      0.1237625 = sum of:
        0.1237625 = product of:
          0.247525 = sum of:
            0.247525 = weight(_text_:maps in 5607) [ClassicSimilarity], result of:
              0.247525 = score(doc=5607,freq=2.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.86918265 = fieldWeight in 5607, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.109375 = fieldNorm(doc=5607)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  3. Crom, W.: Digitisation and georeferencing of maps : key terms or stimulus words? (2019) 0.06
    0.06188125 = product of:
      0.1237625 = sum of:
        0.1237625 = product of:
          0.247525 = sum of:
            0.247525 = weight(_text_:maps in 5623) [ClassicSimilarity], result of:
              0.247525 = score(doc=5623,freq=2.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.86918265 = fieldWeight in 5623, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.109375 = fieldNorm(doc=5623)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
  4. Pepper, S.; Groenmo, G.O.: Towards a general theory of scope (2002) 0.05
    0.054134816 = product of:
      0.10826963 = sum of:
        0.10826963 = product of:
          0.21653926 = sum of:
            0.21653926 = weight(_text_:maps in 539) [ClassicSimilarity], result of:
              0.21653926 = score(doc=539,freq=12.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.7603764 = fieldWeight in 539, product of:
                  3.4641016 = tf(freq=12.0), with freq of:
                    12.0 = termFreq=12.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=539)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This paper is concerned with the issue of scope in topic maps. Topic maps are a form of knowledge representation suitable for solving a number of complex problems in the area of information management, ranging from findability (navigation and querying) to knowledge management and enterprise application integration (EAI). The topic map paradigm has its roots in efforts to understand the essential semantics of back-of-book indexes in order that they might be captured in a form suitable for computer processing. Once understood, the model of a back-of-book index was generalised in order to cover the needs of digital information, and extended to encompass glossaries and thesauri, as well as indexes. The resulting core model, of typed topics, associations, and occurrences, has many similarities with the semantic networks developed by the artificial intelligence community for representing knowledge structures. One key requirement of topic maps from the earliest days was to be able to merge indexes from disparate origins. This requirement accounts for two further concepts that greatly enhance the power of topic maps: subject identity and scope. This paper concentrates on scope, but also includes a brief discussion of the feature known as the topic naming constraint, to which it is closely related. It is based on the authors' experience in creating topic maps (in particular, the Italian Opera Topic Map) and in implementing processing systems for topic maps (in particular, the Ontopia Topic Map Engine and Navigator).
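    The core model described in this abstract (typed topics, associations, occurrences, and scope) can be made concrete in a few lines of code. The Python sketch below is our own illustration of that data model, not any particular Topic Maps API; all class, field, and example names are invented.

      # Minimal illustration of the topic map core model: typed topics,
      # associations, occurrences, and scope. Names are invented for the example.
      from dataclasses import dataclass, field

      @dataclass
      class Topic:
          id: str
          type: str                                    # e.g. "composer", "opera"
          names: dict = field(default_factory=dict)    # scope -> base name

      @dataclass
      class Occurrence:
          topic_id: str
          kind: str                                    # e.g. "article", "web page"
          locator: str                                 # URI of the resource
          scope: frozenset = frozenset()

      @dataclass
      class Association:
          type: str                                    # e.g. "composed-by"
          members: dict = field(default_factory=dict)  # role -> topic id
          scope: frozenset = frozenset()

      # Scope limits where a name or association is valid, e.g. an association
      # that only holds within one merged source:
      puccini = Topic("t1", "composer", names={"default": "Giacomo Puccini"})
      tosca   = Topic("t2", "opera",    names={"it": "Tosca"})
      wrote   = Association("composed-by", {"work": "t2", "composer": "t1"},
                            scope=frozenset({"italian-opera"}))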
  5. Popper, K.R.: Three worlds : the Tanner lecture on human values. Delivered at the University of Michigan, April 7, 1978 (1978) 0.05
    0.053661376 = product of:
      0.10732275 = sum of:
        0.10732275 = product of:
          0.32196826 = sum of:
            0.32196826 = weight(_text_:3a in 230) [ClassicSimilarity], result of:
              0.32196826 = score(doc=230,freq=2.0), product of:
                0.42965913 = queryWeight, product of:
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.050679237 = queryNorm
                0.7493574 = fieldWeight in 230, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  8.478011 = idf(docFreq=24, maxDocs=44218)
                  0.0625 = fieldNorm(doc=230)
          0.33333334 = coord(1/3)
      0.5 = coord(1/2)
    
    Source
    https://tannerlectures.utah.edu/_documents/a-to-z/p/popper80.pdf
  6. Kraker, P.; Kittel, C.; Enkhbayar, A.: Open Knowledge Maps : creating a visual interface to the world's scientific knowledge based on natural language processing (2016) 0.05
    0.053041074 = product of:
      0.10608215 = sum of:
        0.10608215 = product of:
          0.2121643 = sum of:
            0.2121643 = weight(_text_:maps in 3205) [ClassicSimilarity], result of:
              0.2121643 = score(doc=3205,freq=8.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.7450137 = fieldWeight in 3205, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3205)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    The goal of Open Knowledge Maps is to create a visual interface to the world's scientific knowledge. The base for this visual interface consists of so-called knowledge maps, which enable the exploration of existing knowledge and the discovery of new knowledge. Our open source knowledge mapping software applies a mixture of summarization techniques and similarity measures on article metadata, which are iteratively chained together. After processing, the representation is saved in a database for use in a web visualization. In the future, we want to create a space for collective knowledge mapping that brings together individuals and communities involved in exploration and discovery. We want to enable people to guide each other in their discovery by collaboratively annotating and modifying the automatically created maps.
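    The abstract outlines a pipeline of summarization techniques and similarity measures chained over article metadata. As a rough, hedged sketch of that kind of chain (our own illustration, not the Open Knowledge Maps code base), one can compute TF-IDF similarities over titles and then group the articles:

      # Rough sketch: similarity over article metadata followed by grouping.
      # Illustration only; not the Open Knowledge Maps implementation.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity
      from sklearn.cluster import AgglomerativeClustering

      titles = [
          "Cataloguing and presentation tools for old maps and map series",
          "Digitisation and georeferencing of maps",
          "Visualizing non-metric similarities in multiple maps",
      ]

      vectors = TfidfVectorizer(stop_words="english").fit_transform(titles)
      similarity = cosine_similarity(vectors)          # article-to-article similarity
      groups = AgglomerativeClustering(n_clusters=2).fit_predict(vectors.toarray())
      print(similarity.round(2))
      print(groups)                                    # cluster label per article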
  7. Maaten, L. van den; Hinton, G.: Visualizing non-metric similarities in multiple maps (2012) 0.05
    0.053041074 = product of:
      0.10608215 = sum of:
        0.10608215 = product of:
          0.2121643 = sum of:
            0.2121643 = weight(_text_:maps in 3884) [ClassicSimilarity], result of:
              0.2121643 = score(doc=3884,freq=8.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.7450137 = fieldWeight in 3884, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3884)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Techniques for multidimensional scaling visualize objects as points in a low-dimensional metric map. As a result, the visualizations are subject to the fundamental limitations of metric spaces. These limitations prevent multidimensional scaling from faithfully representing non-metric similarity data such as word associations or event co-occurrences. In particular, multidimensional scaling cannot faithfully represent intransitive pairwise similarities in a visualization, and it cannot faithfully visualize "central" objects. In this paper, we present an extension of a recently proposed multidimensional scaling technique called t-SNE. The extension aims to address the problems of traditional multidimensional scaling techniques when these techniques are used to visualize non-metric similarities. The new technique, called multiple maps t-SNE, alleviates these problems by constructing a collection of maps that reveal complementary structure in the similarity data. We apply multiple maps t-SNE to a large data set of word association data and to a data set of NIPS co-authorships, demonstrating its ability to successfully visualize non-metric similarities.
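    The extension described here gives every object an importance weight in each of several maps, and pairwise similarity becomes a mixture over maps. The sketch below is our reading of that formulation, written out in NumPy for illustration; the array shapes and names are assumptions, not the authors' code.

      # Sketch of the multiple-maps similarity: each object i has an importance
      # weight pi[m, i] in map m, and the pairwise similarity q mixes a
      # heavy-tailed kernel over all maps. Illustration only.
      import numpy as np

      def multiple_maps_q(Y, pi):
          """Y: (M, N, d) point positions per map; pi: (M, N) importance weights,
          with pi.sum(axis=0) == 1 for every object."""
          M, N, _ = Y.shape
          q = np.zeros((N, N))
          for m in range(M):
              diff = Y[m][:, None, :] - Y[m][None, :, :]     # (N, N, d)
              kernel = 1.0 / (1.0 + (diff ** 2).sum(-1))     # Student-t style kernel
              q += np.outer(pi[m], pi[m]) * kernel
          np.fill_diagonal(q, 0.0)
          return q / q.sum()                                 # normalize over all pairs

      rng = np.random.default_rng(0)
      Y  = rng.normal(size=(2, 5, 2))              # 2 maps, 5 objects, 2-D points
      pi = rng.dirichlet(np.ones(2), size=5).T     # weights per object sum to 1
      print(multiple_maps_q(Y, pi).round(3))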
  8. Bold, N.; Kim, W.-J.; Yang, J.-D.: Converting object-based thesauri into XML Topic Maps (2010) 0.05
    0.045934916 = product of:
      0.09186983 = sum of:
        0.09186983 = product of:
          0.18373966 = sum of:
            0.18373966 = weight(_text_:maps in 4799) [ClassicSimilarity], result of:
              0.18373966 = score(doc=4799,freq=6.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.6452008 = fieldWeight in 4799, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.046875 = fieldNorm(doc=4799)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Constructing an ontology is in general a considerably time-consuming process. Since a vast number of thesauri are currently available, exploiting them may be a feasible way to construct an ontology in a short period of time. This paper designs and implements an XTM (XML Topic Maps) code converter that generates XTM-coded ontology from an object-based thesaurus. An object-based thesaurus is an extended thesaurus that enriches conventional thesauri with user-defined associations and a notion of instances and occurrences associated with them. The reason we adopt XTM is that it is a verified and practical methodology for semantically reorganizing the conceptual structure of extant web applications with minimal effort. Moreover, since XTM is conceptually similar to our object-based thesauri, the recommendation and inference mechanisms already developed in our system can be easily applied to the generated XTM ontology. To show that the generated XTM ontology is correct, we also verify it with the Ontopia Omnigator and Vizigator, components of the Ontopia Knowledge Suite (OKS) tool.
    Object
    Topic maps
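    The converter described in the abstract of this entry emits XTM-coded ontology from an object-based thesaurus. The sketch below is our own, much-simplified illustration of that direction of conversion; the element names only approximate XTM, and the thesaurus record is invented.

      # Simplified illustration: turn one thesaurus term with a broader term
      # into XTM-like XML fragments. Element names approximate XTM; the record
      # is invented for the example.
      import xml.etree.ElementTree as ET

      def term_to_xtm(term_id, label, broader_id=None):
          topic = ET.Element("topic", id=term_id)
          base = ET.SubElement(topic, "baseName")
          ET.SubElement(base, "baseNameString").text = label
          fragments = [topic]
          if broader_id:
              assoc = ET.Element("association")
              for role, ref in (("broader", broader_id), ("narrower", term_id)):
                  member = ET.SubElement(assoc, "member")
                  ET.SubElement(member, "roleSpec").text = role
                  ET.SubElement(member, "topicRef", href="#" + ref)
              fragments.append(assoc)
          return [ET.tostring(f, encoding="unicode") for f in fragments]

      for fragment in term_to_xtm("t-maps", "Maps", broader_id="t-documents"):
          print(fragment)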
  9. Hook, P.A.; Gantchev, A.: Using combined metadata sources to visualize a small library (OBL's English Language Books) (2017) 0.04
    0.038279098 = product of:
      0.076558195 = sum of:
        0.076558195 = product of:
          0.15311639 = sum of:
            0.15311639 = weight(_text_:maps in 3870) [ClassicSimilarity], result of:
              0.15311639 = score(doc=3870,freq=6.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.53766733 = fieldWeight in 3870, product of:
                  2.4494898 = tf(freq=6.0), with freq of:
                    6.0 = termFreq=6.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=3870)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Data from multiple knowledge organization systems are combined to provide a global overview of the content holdings of a small personal library. Subject headings and classification data are used to effectively map the combined book and topic space of the library. Although the data were harvested and manipulated by hand, the work reveals issues and potential solutions for using automated techniques to produce topic maps of much larger libraries. The small library visualized consists of the thirty-nine digital, English-language books found in the Osama Bin Laden (OBL) compound in Abbottabad, Pakistan upon his death. As this list of books has garnered considerable media attention, it is worth providing a visual overview of the subject content of these books - some of which is not readily apparent from the titles. Metadata from subject headings and classification numbers was combined to create book-subject maps. Tree maps of the classification data were also produced. The books contain 328 subject headings. In order to enhance the base map with a meaningful thematic overlay, library holding count data was also harvested (and aggregated from duplicates). This additional data revealed the relative scarcity or popularity of individual books.
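    One concrete step behind the book-subject maps and tree maps mentioned above is rolling the classification metadata up into a nested structure that a tree map can display. The toy sketch below shows that step with invented records and groupings; it is an illustration, not the authors' workflow.

      # Toy sketch: roll classification numbers up into a nested structure
      # suitable for a tree map. Records and groupings are invented.
      from collections import defaultdict

      books = [
          ("Book A", "327.12"),   # one classification number per book
          ("Book B", "327.73"),
          ("Book C", "909.82"),
      ]

      tree = defaultdict(lambda: defaultdict(list))
      for title, number in books:
          hundreds, division = number[0] + "00", number.split(".")[0]   # "300", "327"
          tree[hundreds][division].append(title)

      for hundreds, divisions in tree.items():
          print(hundreds, dict(divisions))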
  10. Wu, Y.; Bai, R.: ¬An event relationship model for knowledge organization and visualization (2017) 0.04
    0.0375057 = product of:
      0.0750114 = sum of:
        0.0750114 = product of:
          0.1500228 = sum of:
            0.1500228 = weight(_text_:maps in 3867) [ClassicSimilarity], result of:
              0.1500228 = score(doc=3867,freq=4.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.5268042 = fieldWeight in 3867, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3867)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    An event is a specific occurrence involving participants: a typed, n-ary association of entities or other events, each identified as a participant in a specific semantic role in the event (Pyysalo et al. 2012; Linguistic Data Consortium 2005). Event types may vary across domains. Representing relationships between events can facilitate the understanding of knowledge in complex systems (such as economic systems, the human body, social systems). In its simplest form, an event can be represented as Entity A <Relation> Entity B. This paper evaluates several knowledge organization and visualization models and tools, such as concept maps (Cmap), topic maps (Ontopia), network analysis models (Gephi), and ontologies (Protégé), and then proposes an event relationship model that aims to integrate the strengths of these models and can represent complex knowledge expressed in events and their relationships.
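    The event structure defined at the start of this abstract, a typed, n-ary association whose participants fill semantic roles and may themselves be events, is easy to state as a data structure. The sketch below is our own illustration; class names, role names, and the example events are invented.

      # Minimal sketch of the event structure described in the abstract:
      # a typed, n-ary association of participants in semantic roles, where a
      # participant can itself be an event. Names are invented for the example.
      from dataclasses import dataclass, field

      @dataclass
      class Entity:
          name: str

      @dataclass
      class Event:
          type: str
          participants: dict = field(default_factory=dict)   # role -> Entity or Event

      # Simplest form, Entity A <Relation> Entity B:
      acquisition = Event("acquires", {"agent": Entity("Company A"),
                                       "theme": Entity("Company B")})
      # An event participating in another event:
      report = Event("reports", {"agent": Entity("Newspaper"),
                                 "content": acquisition})
      print(report)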
  11. Jackson, R.: Information Literacy and its relationship to cognitive development and reflective judgment (2008) 0.04
    0.035360713 = product of:
      0.070721425 = sum of:
        0.070721425 = product of:
          0.14144285 = sum of:
            0.14144285 = weight(_text_:maps in 111) [ClassicSimilarity], result of:
              0.14144285 = score(doc=111,freq=2.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.4966758 = fieldWeight in 111, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.0625 = fieldNorm(doc=111)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    This chapter maps the Association of College and Research Libraries' Information Competency Standards for Higher Education to the cognitive development levels developed by William G. Perry and Patricia King and Karen Kitchener to suggest which competencies are appropriate for which level of cognitive development.
  12. Williams, B.: Dimensions & VOSviewer bibliometrics in the reference interview (2020) 0.03
    0.030940626 = product of:
      0.06188125 = sum of:
        0.06188125 = product of:
          0.1237625 = sum of:
            0.1237625 = weight(_text_:maps in 5719) [ClassicSimilarity], result of:
              0.1237625 = score(doc=5719,freq=2.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.43459132 = fieldWeight in 5719, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.0546875 = fieldNorm(doc=5719)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    The VOSviewer software provides easy access to bibliometric mapping using data from Dimensions, Scopus and Web of Science. The properly formatted and structured citation data, and the ease with which it can be exported, open up new avenues for use during citation searches and reference interviews. This paper details specific techniques for using advanced searches in Dimensions, exporting the citation data, and drawing insights from the maps produced in VOSviewer. These search techniques and data export practices are fast and accurate enough to build into reference interviews for graduate students, faculty, and post-PhD researchers. The search results derived from them are accurate and allow a more comprehensive view of citation networks embedded in ordinary complex Boolean searches.
  13. Barbaresi, A.: Toponyms as entry points into a digital edition : mapping Die Fackel (2018) 0.03
    0.026520537 = product of:
      0.053041074 = sum of:
        0.053041074 = product of:
          0.10608215 = sum of:
            0.10608215 = weight(_text_:maps in 5058) [ClassicSimilarity], result of:
              0.10608215 = score(doc=5058,freq=2.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.37250686 = fieldWeight in 5058, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.046875 = fieldNorm(doc=5058)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    The emergence of Spatial Humanities has prompted interdisciplinary work on digitized texts, especially since the significance of place names exceeds the usually admitted frame of deictic and indexical functions. In this perspective, I present a visualization of toponym co-occurrences in the literary journal Die Fackel ("The Torch"), published by the satirist and language critic Karl Kraus in Vienna from 1899 until 1936. The distant reading experiments consist of drawing lines on maps in order to uncover patterns which are not easily retraceable during close reading. I discuss their status in the context of a digital humanities study. This is not an authoritative cartography of the work but rather an indirect depiction of the viewpoint of Kraus and his contemporaries. Drawing on Kraus' vitriolic recording of political life, toponyms in Die Fackel tell a story about the ongoing reconfiguration of Europe.
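    The lines on maps described above encode how often two place names occur together. As a hedged sketch of that counting step (the segments and the toponym list are invented, and this is not the author's pipeline), one can tally toponym pairs per text segment:

      # Sketch of the co-occurrence counting behind drawing lines on a map:
      # count pairs of toponyms that appear in the same text segment.
      # Segments and the toponym list are invented for illustration.
      from itertools import combinations
      from collections import Counter

      toponyms = {"Wien", "Berlin", "Prag", "Paris"}
      segments = [
          "Ein Brief aus Wien nach Berlin",
          "Von Wien über Prag nach Berlin",
      ]

      pairs = Counter()
      for segment in segments:
          found = sorted(set(segment.split()) & toponyms)
          pairs.update(combinations(found, 2))

      print(pairs.most_common())   # ('Berlin', 'Wien') counted twice, etc.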
  14. Van der Veer Martens, B.: Do citation systems represent theories of truth? (2001) 0.02
    0.024276167 = product of:
      0.048552334 = sum of:
        0.048552334 = product of:
          0.09710467 = sum of:
            0.09710467 = weight(_text_:22 in 3925) [ClassicSimilarity], result of:
              0.09710467 = score(doc=3925,freq=4.0), product of:
                0.17747006 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.050679237 = queryNorm
                0.54716086 = fieldWeight in 3925, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3925)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 7.2006 15:22:28
  15. Wolchover, N.: Wie ein Aufsehen erregender Beweis kaum Beachtung fand (2017) 0.02
    0.024276167 = product of:
      0.048552334 = sum of:
        0.048552334 = product of:
          0.09710467 = sum of:
            0.09710467 = weight(_text_:22 in 3582) [ClassicSimilarity], result of:
              0.09710467 = score(doc=3582,freq=4.0), product of:
                0.17747006 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.050679237 = queryNorm
                0.54716086 = fieldWeight in 3582, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=3582)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 4.2017 10:42:05
    22. 4.2017 10:48:38
  16. Dunning, A.: Do we still need search engines? (1999) 0.02
    0.02403218 = product of:
      0.04806436 = sum of:
        0.04806436 = product of:
          0.09612872 = sum of:
            0.09612872 = weight(_text_:22 in 6021) [ClassicSimilarity], result of:
              0.09612872 = score(doc=6021,freq=2.0), product of:
                0.17747006 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.050679237 = queryNorm
                0.5416616 = fieldWeight in 6021, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.109375 = fieldNorm(doc=6021)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Ariadne. 1999, no.22
  17. Qin, J.; Paling, S.: Converting a controlled vocabulary into an ontology : the case of GEM (2001) 0.02
    0.020599011 = product of:
      0.041198023 = sum of:
        0.041198023 = product of:
          0.082396045 = sum of:
            0.082396045 = weight(_text_:22 in 3895) [ClassicSimilarity], result of:
              0.082396045 = score(doc=3895,freq=2.0), product of:
                0.17747006 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.050679237 = queryNorm
                0.46428138 = fieldWeight in 3895, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=3895)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    24. 8.2005 19:20:22
  18. Jaeger, L.: Wissenschaftler versus Wissenschaft (2020) 0.02
    0.020599011 = product of:
      0.041198023 = sum of:
        0.041198023 = product of:
          0.082396045 = sum of:
            0.082396045 = weight(_text_:22 in 4156) [ClassicSimilarity], result of:
              0.082396045 = score(doc=4156,freq=2.0), product of:
                0.17747006 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.050679237 = queryNorm
                0.46428138 = fieldWeight in 4156, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.09375 = fieldNorm(doc=4156)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    2. 3.2020 14:08:22
  19. Buckland, M.; Lancaster, L.: Combining place, time, and topic : the Electronic Cultural Atlas Initiative (2004) 0.02
    0.017680356 = product of:
      0.035360713 = sum of:
        0.035360713 = product of:
          0.070721425 = sum of:
            0.070721425 = weight(_text_:maps in 1194) [ClassicSimilarity], result of:
              0.070721425 = score(doc=1194,freq=2.0), product of:
                0.28477904 = queryWeight, product of:
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.050679237 = queryNorm
                0.2483379 = fieldWeight in 1194, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  5.619245 = idf(docFreq=435, maxDocs=44218)
                  0.03125 = fieldNorm(doc=1194)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    The Electronic Cultural Atlas Initiative was formed to encourage scholarly communication and the sharing of data among researchers who emphasize the relationships between place, time, and topic in the study of culture and history. In an effort to develop better tools and practices, the Electronic Cultural Atlas Initiative has sponsored the collaborative development of software for downloading and editing geo-temporal data to create dynamic maps, a clearinghouse of shared datasets accessible through a map-based interface, projects on format and content standards for gazetteers and time period directories, studies to improve geo-temporal aspects in online catalogs, good practice guidelines for preparing e-publications with dynamic geo-temporal displays, and numerous international conferences.
    The Electronic Cultural Atlas Initiative (ECAI) grew out of discussions among an international group of scholars interested in religious history and area studies. It was established as a unit under the Dean of International and Area Studies at the University of California, Berkeley in 1997. ECAI's mission is to promote an international collaborative effort to transform humanities scholarship through use of the digital environment to share data and by placing greater emphasis on the notions of place and time. Professor Lewis Lancaster is the Director. Professor Michael Buckland, with a library and information studies background, joined the effort as Co-Director in 2000. Assistance from the Lilly Foundation, the California Digital Library (University of California), and other sources has enabled ECAI to nurture a community; to develop a catalog ("clearinghouse") of Internet-accessible georeferenced resources; to support the development of software for obtaining, editing, manipulating, and dynamically visualizing geo-temporally encoded data; and to undertake research and development projects as needs and resources determine.
    Several hundred scholars worldwide, from a wide range of disciplines, are informally affiliated with ECAI, all interested in shared use of historical and cultural data. The Academia Sinica (Taiwan), The British Library, and the Arts and Humanities Data Service (UK) are among the well-known affiliates. However, ECAI mainly comprises individual scholars and small teams working on their own small projects on a very wide range of cultural, social, and historical topics. Numerous specialist committees have been fostering standardization and collaboration by area and by themes such as trade-routes, cities, religion, and sacred sites.
  20. Guidi, F.; Sacerdoti Coen, C.: ¬A survey on retrieval of mathematical knowledge (2015) 0.02
    0.017165843 = product of:
      0.034331687 = sum of:
        0.034331687 = product of:
          0.06866337 = sum of:
            0.06866337 = weight(_text_:22 in 5865) [ClassicSimilarity], result of:
              0.06866337 = score(doc=5865,freq=2.0), product of:
                0.17747006 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.050679237 = queryNorm
                0.38690117 = fieldWeight in 5865, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.078125 = fieldNorm(doc=5865)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Date
    22. 2.2017 12:51:57

Languages

  • e 43
  • d 42
  • a 1