Search (3 results, page 1 of 1)

  • author_ss:"Rubel, A."
  1. Rubel, A.; Castro, C.; Pham, A.: Algorithms and autonomy : the ethics of automated decision systems (2021) 0.07
    0.06725572 = product of:
      0.13451144 = sum of:
        0.07048073 = weight(_text_:social in 671) [ClassicSimilarity], result of:
          0.07048073 = score(doc=671,freq=6.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.3815443 = fieldWeight in 671, product of:
              2.4494898 = tf(freq=6.0), with freq of:
                6.0 = termFreq=6.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.0390625 = fieldNorm(doc=671)
        0.06403072 = product of:
          0.12806144 = sum of:
            0.12806144 = weight(_text_:aspects in 671) [ClassicSimilarity], result of:
              0.12806144 = score(doc=671,freq=12.0), product of:
                0.20938325 = queryWeight, product of:
                  4.5198684 = idf(docFreq=1308, maxDocs=44218)
                  0.046325076 = queryNorm
                0.6116126 = fieldWeight in 671, product of:
                  3.4641016 = tf(freq=12.0), with freq of:
                    12.0 = termFreq=12.0
                  4.5198684 = idf(docFreq=1308, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=671)
          0.5 = coord(1/2)
      0.5 = coord(2/4)
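The explain tree above can be reproduced offline. The sketch below assumes Lucene's ClassicSimilarity formulas (tf = sqrt(freq), idf = 1 + ln(maxDocs / (docFreq + 1))); queryNorm and fieldNorm are copied verbatim from the output rather than recomputed, since both depend on index-wide state not shown here.

```python
import math

# Reproduce the ClassicSimilarity score for doc 671 from the explain
# output above. All inputs (freq, docFreq, maxDocs, norms) are taken
# directly from that output; nothing here queries a live index.

def idf(doc_freq, max_docs):
    """ClassicSimilarity idf: 1 + ln(maxDocs / (docFreq + 1))."""
    return 1.0 + math.log(max_docs / (doc_freq + 1))

def tf(freq):
    """ClassicSimilarity tf: sqrt(term frequency in the field)."""
    return math.sqrt(freq)

QUERY_NORM = 0.046325076   # from the explain output
FIELD_NORM = 0.0390625     # 1/sqrt(field length), quantized to one byte

# "social" in doc 671: freq=6, docFreq=2228, maxDocs=44218
idf_social = idf(2228, 44218)                     # ≈ 3.9875789
weight_social = (idf_social * QUERY_NORM) * (tf(6.0) * idf_social * FIELD_NORM)

# "aspects" in doc 671: freq=12, docFreq=1308; its clause group matched
# only 1 of 2 subclauses, hence the inner coord(1/2) = 0.5
idf_aspects = idf(1308, 44218)                    # ≈ 4.5198684
weight_aspects = 0.5 * (idf_aspects * QUERY_NORM) * (tf(12.0) * idf_aspects * FIELD_NORM)

# Document score: sum of the clause weights times coord(2/4) = 0.5,
# because 2 of the 4 top-level query clauses matched this document
doc_score = (weight_social + weight_aspects) * 0.5
print(doc_score)   # ≈ 0.06725572
```

The same arithmetic explains the other two results: result 2 differs only in term frequencies and fieldNorm, and result 3 matches a single clause, so its sum is multiplied by coord(1/4) = 0.25.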
    
    Abstract
    Algorithms influence every facet of modern life: criminal justice, education, housing, entertainment, elections, social media, news feeds, work... the list goes on. Delegating important decisions to machines, however, gives rise to deep moral concerns about responsibility, transparency, freedom, fairness, and democracy. Algorithms and Autonomy connects these concerns to the core human value of autonomy in the contexts of algorithmic teacher evaluation, risk assessment in criminal sentencing, predictive policing, background checks, news feeds, ride-sharing platforms, social media, and election interference. Using these case studies, the authors provide a better understanding of machine fairness and algorithmic transparency. They explain why interventions in algorithmic systems are necessary to ensure that algorithms are not used to control citizens' participation in politics and undercut democracy. This title is also available as Open Access on Cambridge Core.
    Content
    Contents: Introduction -- Autonomy, agency, and responsibility -- What can agents reasonably endorse? -- What we informationally owe each other -- Freedom, agency, and information technology -- Epistemic paternalism and social media -- Agency laundering and information technologies -- Democratic obligations and technological threats to legitimacy -- Conclusions and caveats
    LCSH
    Artificial intelligence / Law and legislation / Moral and ethical aspects
    Decision support systems / Moral and ethical aspects
    Expert systems (Computer science) / Moral and ethical aspects
    Subject
    Artificial intelligence / Law and legislation / Moral and ethical aspects
    Decision support systems / Moral and ethical aspects
    Expert systems (Computer science) / Moral and ethical aspects
  2. Rubel, A.; Biava, R.: A framework for analyzing and comparing privacy states (2014) 0.04
    0.040099498 = product of:
      0.080198996 = sum of:
        0.04883048 = weight(_text_:social in 1340) [ClassicSimilarity], result of:
          0.04883048 = score(doc=1340,freq=2.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.26434162 = fieldWeight in 1340, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046875 = fieldNorm(doc=1340)
        0.031368516 = product of:
          0.06273703 = sum of:
            0.06273703 = weight(_text_:aspects in 1340) [ClassicSimilarity], result of:
              0.06273703 = score(doc=1340,freq=2.0), product of:
                0.20938325 = queryWeight, product of:
                  4.5198684 = idf(docFreq=1308, maxDocs=44218)
                  0.046325076 = queryNorm
                0.29962775 = fieldWeight in 1340, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  4.5198684 = idf(docFreq=1308, maxDocs=44218)
                  0.046875 = fieldNorm(doc=1340)
          0.5 = coord(1/2)
      0.5 = coord(2/4)
    
    Abstract
    This article develops a framework for analyzing and comparing privacy and privacy protections across (inter alia) time, place, and polity and for examining factors that affect privacy and privacy protection. This framework provides a method to describe precisely aspects of privacy and context and a flexible vocabulary and notation for such descriptions and comparisons. Moreover, it links philosophical and conceptual work on privacy to social science and policy work and accommodates different conceptions of the nature and value of privacy. The article begins with an outline of the framework. It then refines the view by describing a hypothetical application. Finally, it applies the framework to a real-world privacy issue: campaign finance disclosure laws in the United States and France. The article concludes with an argument that the framework offers important advantages to privacy scholarship and for privacy policy makers.
  3. Jones, K.M.L.; Rubel, A.; LeClere, E.: A matter of trust : higher education institutions as information fiduciaries in an age of educational data mining and learning analytics (2020) 0.01
    0.010173016 = product of:
      0.040692065 = sum of:
        0.040692065 = weight(_text_:social in 5968) [ClassicSimilarity], result of:
          0.040692065 = score(doc=5968,freq=2.0), product of:
            0.1847249 = queryWeight, product of:
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.046325076 = queryNorm
            0.22028469 = fieldWeight in 5968, product of:
              1.4142135 = tf(freq=2.0), with freq of:
                2.0 = termFreq=2.0
              3.9875789 = idf(docFreq=2228, maxDocs=44218)
              0.0390625 = fieldNorm(doc=5968)
      0.25 = coord(1/4)
    
    Abstract
    Higher education institutions are mining and analyzing student data to effect educational, political, and managerial outcomes. Done under the banner of "learning analytics," this work can, and often does, surface sensitive data and information about, inter alia, a student's demographics, academic performance, offline and online movements, physical fitness, mental wellbeing, and social network. With these data, institutions and third parties are able to describe student life, predict future behaviors, and intervene to address academic or other barriers to student success (however defined). Learning analytics, consequently, raise serious issues concerning student privacy, autonomy, and the appropriate flow of student data. We argue that issues around privacy lead to valid questions about the degree to which students should trust their institution to use learning analytics data and other artifacts (algorithms, predictive scores) with their interests in mind. We argue that higher education institutions are paradigms of information fiduciaries. As such, colleges and universities have a special responsibility to their students. In this article, we use the information fiduciary concept to analyze cases when learning analytics violate an institution's responsibility to its students.