Search (2 results, page 1 of 1)

  • theme_ss:"Informationsethik"
  • year_i:[2020 TO 2030}
  1. Martin, K.: Predatory predictions and the ethics of predictive analytics (2023) 0.02
    0.018921455 = product of:
      0.03784291 = sum of:
        0.03784291 = product of:
          0.07568582 = sum of:
            0.07568582 = weight(_text_:i in 946) [ClassicSimilarity], result of:
              0.07568582 = score(doc=946,freq=8.0), product of:
                0.18162222 = queryWeight, product of:
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.04815356 = queryNorm
                0.41672117 = fieldWeight in 946, product of:
                  2.828427 = tf(freq=8.0), with freq of:
                    8.0 = termFreq=8.0
                  3.7717297 = idf(docFreq=2765, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=946)
          0.5 = coord(1/2)
      0.5 = coord(1/2)
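    The breakdown above is Lucene "explain" output for a ClassicSimilarity (TF-IDF) ranking. As a minimal sketch, assuming the standard ClassicSimilarity formulas (tf = sqrt(freq), idf = 1 + ln(maxDocs / (docFreq + 1))) and taking the reported fieldNorm and queryNorm as given, the displayed 0.02 can be reproduced roughly as follows:

      import math

      # Reported values from the explain tree for result 1 (doc 946).
      freq = 8.0                 # occurrences of "i" in the _text_ field
      doc_freq, max_docs = 2765, 44218
      field_norm = 0.0390625     # encoded field-length norm, taken as given
      query_norm = 0.04815356    # reported queryNorm, taken as given

      tf = math.sqrt(freq)                           # 2.828427
      idf = 1 + math.log(max_docs / (doc_freq + 1))  # 3.7717297
      field_weight = tf * idf * field_norm           # 0.41672117
      query_weight = idf * query_norm                # 0.18162222
      weight = query_weight * field_weight           # ~0.07568582

      # The two coord(1/2) factors each halve the contribution.
      score = weight * 0.5 * 0.5                     # ~0.0189, displayed as 0.02
      print(score)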
    
    Abstract
    In this paper, I critically examine ethical issues introduced by predictive analytics. I argue that firms can have a market incentive to construct deceptively inflated true-positive outcomes: individuals are over-categorized as requiring a penalizing treatment, and the treatment itself leads to the mistaken conclusion that the label was correct. I show that differences in power between the firms developing and using predictive analytics and the subjects of those predictions can lead to firms reaping the benefits of predatory predictions while subjects bear the brunt of the costs. While profitable, the use of predatory predictions can deceive stakeholders by inflating the measurement of accuracy, diminish the individuality of subjects, and exert arbitrary power. I then argue that firms have a responsibility to distinguish between the treatment effect and the predictive power of a predictive analytics program, to better internalize the costs of categorizing someone as needing a penalizing treatment, and to justify both the predictions made about subjects and the general use of predictive analytics. Subjecting individuals to predatory predictions solely for a firm's efficiency and benefit is unethical and an arbitrary exertion of power. Firms that develop and deploy such programs can profit from predatory predictions while the costs are borne by the less powerful subjects of the program.
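    As a purely illustrative toy sketch (not from Martin's paper, with hypothetical numbers), the claim about inflated true positives can be made concrete: if the penalizing treatment itself raises the chance of the predicted outcome, a predictor with no real predictive power still appears accurate among the people it flags.

      import random

      # Toy illustration with made-up probabilities: a random "predictor" flags
      # people, the penalizing treatment raises their chance of the bad outcome,
      # and the measured hit rate among flagged people looks far better than the
      # base rate even though the prediction carries no information.
      random.seed(0)
      BASE_RATE = 0.05         # hypothetical chance of the bad outcome untreated
      TREATMENT_EFFECT = 0.40  # hypothetical extra chance caused by the treatment
      N = 100_000

      flagged = hits = 0
      for _ in range(N):
          if random.random() >= 0.2:      # predictor flags 20% of people at random
              continue
          flagged += 1
          if random.random() < BASE_RATE + TREATMENT_EFFECT:
              hits += 1

      # Roughly 45% of flagged people show the outcome, versus a 5% base rate,
      # so the labels look "correct" despite the predictor being pure noise.
      print(hits / flagged)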
  2. Rubel, A.; Castro, C.; Pham, A.: Algorithms and autonomy : the ethics of automated decision systems (2021) 0.01
    0.0055284984 = product of:
      0.011056997 = sum of:
        0.011056997 = product of:
          0.055284984 = sum of:
            0.055284984 = weight(_text_:authors in 671) [ClassicSimilarity], result of:
              0.055284984 = score(doc=671,freq=2.0), product of:
                0.21952313 = queryWeight, product of:
                  4.558814 = idf(docFreq=1258, maxDocs=44218)
                  0.04815356 = queryNorm
                0.25184128 = fieldWeight in 671, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.558814 = idf(docFreq=1258, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=671)
          0.2 = coord(1/5)
      0.5 = coord(1/2)
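    For comparison with the first result, a short sketch of how the idf values in both explain trees follow from the same ClassicSimilarity formula, and why this entry still scores lower despite matching the rarer term (values and formulas as reported above; reading coord(1/5) as one of five query clauses matching is the standard Lucene interpretation):

      import math

      # idf = 1 + ln(maxDocs / (docFreq + 1)) for the two matched terms.
      max_docs = 44218
      idf_i = 1 + math.log(max_docs / (2765 + 1))        # ~3.7717297, "i" (result 1)
      idf_authors = 1 + math.log(max_docs / (1258 + 1))  # ~4.558814, "authors" (result 2)

      # "authors" is rarer, but it occurs only twice (tf = sqrt(2)) and only one
      # of five query clauses matched, so coord(1/5) = 0.2 cuts the score hard.
      field_weight = math.sqrt(2.0) * idf_authors * 0.0390625  # 0.25184128
      query_weight = idf_authors * 0.04815356                  # 0.21952313
      score = query_weight * field_weight * 0.2 * 0.5          # ~0.0055, displayed as 0.01
      print(idf_i, idf_authors, score)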
    
    Abstract
    Algorithms influence every facet of modern life: criminal justice, education, housing, entertainment, elections, social media, news feeds, work... the list goes on. Delegating important decisions to machines, however, gives rise to deep moral concerns about responsibility, transparency, freedom, fairness, and democracy. Algorithms and Autonomy connects these concerns to the core human value of autonomy in the contexts of algorithmic teacher evaluation, risk assessment in criminal sentencing, predictive policing, background checks, news feeds, ride-sharing platforms, social media, and election interference. Using these case studies, the authors provide a better understanding of machine fairness and algorithmic transparency. They explain why interventions in algorithmic systems are necessary to ensure that algorithms are not used to control citizens' participation in politics and undercut democracy. This title is also available as Open Access on Cambridge Core.