Search (5 results, page 1 of 1)

  • × theme_ss:"Informationsethik"
  • × year_i:[2020 TO 2030}
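The two active filters above correspond to Solr `fq` (filter query) parameters; in `year_i:[2020 TO 2030}` the square bracket makes the lower bound inclusive and the curly brace makes the upper bound exclusive. A minimal sketch of how such a request URL could be assembled — the host and core name are assumptions, only the field names and values come from the filters shown:

```python
from urllib.parse import urlencode

# Hypothetical Solr host and core name; the two fq values are the
# facet filters shown above. Repeated ("fq", ...) tuples become
# repeated fq parameters in the query string.
params = [
    ("q", "*:*"),
    ("fq", 'theme_ss:"Informationsethik"'),
    ("fq", "year_i:[2020 TO 2030}"),   # inclusive 2020, exclusive 2030
    ("debugQuery", "true"),            # asks Solr for explain trees like those below
]
query_string = urlencode(params)
url = "http://localhost:8983/solr/catalog/select?" + query_string
```

With `debugQuery=true`, Solr returns per-document score explanations of the kind reproduced under each hit in this result list.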
  1. Rubel, A.; Castro, C.; Pham, A.: Algorithms and autonomy : the ethics of automated decision systems (2021) 0.01
    0.0070961965 = product of:
      0.014192393 = sum of:
        0.014192393 = product of:
          0.028384786 = sum of:
            0.028384786 = weight(_text_:science in 671) [ClassicSimilarity], result of:
              0.028384786 = score(doc=671,freq=4.0), product of:
                0.13793045 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.052363027 = queryNorm
                0.20579056 = fieldWeight in 671, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=671)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    LCSH
    Expert systems (Computer science) / Moral and ethical aspects
    Subject
    Expert systems (Computer science) / Moral and ethical aspects
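The explain tree above is Lucene's ClassicSimilarity (TF-IDF) output. Its per-term weight can be reproduced with a small sketch — the function name and parameter names are illustrative, not Lucene API, but each step mirrors a line of the tree:

```python
import math

def classic_similarity_score(freq, doc_freq, max_docs, query_norm, field_norm):
    """One term's weight, as printed in the explain tree (ClassicSimilarity)."""
    tf = math.sqrt(freq)                               # 2.0 = tf(freq=4.0)
    idf = 1.0 + math.log(max_docs / (doc_freq + 1.0))  # 2.6341193 = idf(docFreq=8627, maxDocs=44218)
    query_weight = idf * query_norm                    # 0.13793045 = queryWeight
    field_weight = tf * idf * field_norm               # 0.20579056 = fieldWeight
    return query_weight * field_weight                 # 0.028384786 = weight(_text_:science ...)
```

For the first hit: `classic_similarity_score(4.0, 8627, 44218, 0.052363027, 0.0390625)` gives 0.028384786, and the two `coord(1/2)` factors halve it twice to the displayed document score, 0.028384786 × 0.5 × 0.5 ≈ 0.0070961965.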
  2. Bawden, D.; Robinson, L.: "The dearest of our possessions" : applying Floridi's information privacy concept in models of information behavior and information literacy (2020) 0.01
    0.006021322 = product of:
      0.012042644 = sum of:
        0.012042644 = product of:
          0.024085289 = sum of:
            0.024085289 = weight(_text_:science in 5939) [ClassicSimilarity], result of:
              0.024085289 = score(doc=5939,freq=2.0), product of:
                0.13793045 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.052363027 = queryNorm
                0.17461908 = fieldWeight in 5939, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.046875 = fieldNorm(doc=5939)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Journal of the Association for Information Science and Technology. 71(2020) no.9, pp. 1030-1043
  3. Slota, S.C.; Fleischmann, K.R.; Greenberg, S.; Verma, N.; Cummings, B.; Li, L.; Shenefiel, C.: Locating the work of artificial intelligence ethics (2023) 0.01
    0.006021322 = product of:
      0.012042644 = sum of:
        0.012042644 = product of:
          0.024085289 = sum of:
            0.024085289 = weight(_text_:science in 899) [ClassicSimilarity], result of:
              0.024085289 = score(doc=899,freq=2.0), product of:
                0.13793045 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.052363027 = queryNorm
                0.17461908 = fieldWeight in 899, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.046875 = fieldNorm(doc=899)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Journal of the Association for Information Science and Technology. 74(2023) no.3, pp. 311-322
  4. Martin, K.: Predatory predictions and the ethics of predictive analytics (2023) 0.01
    0.0050177686 = product of:
      0.010035537 = sum of:
        0.010035537 = product of:
          0.020071074 = sum of:
            0.020071074 = weight(_text_:science in 946) [ClassicSimilarity], result of:
              0.020071074 = score(doc=946,freq=2.0), product of:
                0.13793045 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.052363027 = queryNorm
                0.1455159 = fieldWeight in 946, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=946)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Source
    Journal of the Association for Information Science and Technology. 74(2023) no.5, pp. 531-545
  5. Bagatini, J.A.; Chaves Guimarães, J.A.: Algorithmic discriminations and their ethical impacts on knowledge organization : a thematic domain-analysis (2023) 0.01
    0.0050177686 = product of:
      0.010035537 = sum of:
        0.010035537 = product of:
          0.020071074 = sum of:
            0.020071074 = weight(_text_:science in 1134) [ClassicSimilarity], result of:
              0.020071074 = score(doc=1134,freq=2.0), product of:
                0.13793045 = queryWeight, product of:
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.052363027 = queryNorm
                0.1455159 = fieldWeight in 1134, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  2.6341193 = idf(docFreq=8627, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=1134)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
    
    Abstract
    Personal data play a fundamental role in contemporary socioeconomic dynamics, and one of their most consequential aspects is their potential to facilitate discriminatory situations. This situation affects the knowledge organization field in particular because the field treats personal data as elements (facets) used to categorize persons from an economic and sometimes discriminatory perspective. The research corpus was collected from Scopus and Web of Science through the end of 2021, using the terms "data discrimination", "algorithmic bias", "algorithmic discrimination" and "fair algorithms". The results allowed us to infer that the analyzed knowledge domain predominantly incorporates personal data, whether in their behavioral dimension or within the scope of so-called sensitive data. These data are susceptible to the action of algorithms of different orders, such as relevance, filtering, predictive, social-ranking, content-recommendation and random-classification algorithms. Such algorithms can carry discriminatory biases in their programming related to gender, sexual orientation, race, nationality, religion, age, social class, socioeconomic profile, physical appearance, and political positioning.