Search (6 results, page 1 of 1)

  • Filter: classification_ss:"71.43 (Technologische Faktoren) <Soziologie>"
  1. Degele, N.: Informiertes Wissen : Eine Wissenssoziologie der computerisierten Gesellschaft (2000) 0.14
    0.1360165 = product of:
      0.20402475 = sum of:
        0.19900289 = weight(_text_:sociology in 3986) [ClassicSimilarity], result of:
          0.19900289 = score(doc=3986,freq=4.0), product of:
            0.30495512 = queryWeight, product of:
              6.9606886 = idf(docFreq=113, maxDocs=44218)
              0.043811057 = queryNorm
            0.6525645 = fieldWeight in 3986, product of:
              2.0 = tf(freq=4.0), with freq of:
                4.0 = termFreq=4.0
              6.9606886 = idf(docFreq=113, maxDocs=44218)
              0.046875 = fieldNorm(doc=3986)
        0.0050218496 = product of:
          0.010043699 = sum of:
            0.010043699 = weight(_text_:of in 3986) [ClassicSimilarity], result of:
              0.010043699 = score(doc=3986,freq=4.0), product of:
                0.06850986 = queryWeight, product of:
                  1.5637573 = idf(docFreq=25162, maxDocs=44218)
                  0.043811057 = queryNorm
                0.14660224 = fieldWeight in 3986, product of:
                  2.0 = tf(freq=4.0), with freq of:
                    4.0 = termFreq=4.0
                  1.5637573 = idf(docFreq=25162, maxDocs=44218)
                  0.046875 = fieldNorm(doc=3986)
          0.5 = coord(1/2)
      0.6666667 = coord(2/3)
    
    LCSH
    Knowledge, Sociology of
    Subject
    Knowledge, Sociology of
  2. Curcio, R.: Das virtuelle Reich : die Kolonialisierung der Phantasie und die soziale Kontrolle (2017) 0.00
    0.0049464945 = product of:
      0.014839483 = sum of:
        0.014839483 = product of:
          0.029678967 = sum of:
            0.029678967 = weight(_text_:22 in 5306) [ClassicSimilarity], result of:
              0.029678967 = score(doc=5306,freq=2.0), product of:
                0.15341885 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043811057 = queryNorm
                0.19345059 = fieldWeight in 5306, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5306)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Date
    18. 9.2018 12:57:22
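    The score breakdowns shown for results 1 and 2 above are Lucene ClassicSimilarity explain traces: each matching clause contributes queryWeight * fieldWeight, where queryWeight = idf * queryNorm, fieldWeight = tf * idf * fieldNorm, tf = sqrt(termFreq) and idf = 1 + ln(maxDocs / (docFreq + 1)); coord(m/n) then scales a sum by the fraction of query clauses that matched. A minimal Python sketch (illustrative only, not part of the catalogue output; function and variable names are invented here) reproduces the reported values from the freq, docFreq, maxDocs, queryNorm, fieldNorm and coord figures in the traces:

      import math

      def clause_score(freq, doc_freq, max_docs, query_norm, field_norm):
          # One term clause under Lucene's ClassicSimilarity (TF-IDF):
          #   tf = sqrt(freq), idf = 1 + ln(maxDocs / (docFreq + 1))
          #   score = (idf * queryNorm) * (tf * idf * fieldNorm)
          tf = math.sqrt(freq)
          idf = 1.0 + math.log(max_docs / (doc_freq + 1))
          return (idf * query_norm) * (tf * idf * field_norm)

      QUERY_NORM = 0.043811057  # queryNorm reported in the traces above

      # Result 1 (doc 3986): "sociology" and "of" matched, coord(2/3) overall,
      # with the "of" sub-clause additionally scaled by coord(1/2).
      sociology = clause_score(4.0, 113, 44218, QUERY_NORM, 0.046875)    # ~0.19900289
      of_term   = clause_score(4.0, 25162, 44218, QUERY_NORM, 0.046875)  # ~0.010043699
      score_1 = (sociology + of_term * 0.5) * (2.0 / 3.0)                # ~0.1360165

      # Result 2 (doc 5306): only "22" matched, coord(1/2) inside, coord(1/3) outside.
      term_22 = clause_score(2.0, 3622, 44218, QUERY_NORM, 0.0390625)    # ~0.029678967
      score_2 = term_22 * 0.5 * (1.0 / 3.0)                              # ~0.0049464945

      print(score_1, score_2)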
  3. Kuhlen, R.: Die Konsequenzen von Informationsassistenten : was bedeutet informationelle Autonomie oder wie kann Vertrauen in elektronische Dienste in offenen Informationsmärkten gesichert werden? (1999) 0.00
    0.0049464945 = product of:
      0.014839483 = sum of:
        0.014839483 = product of:
          0.029678967 = sum of:
            0.029678967 = weight(_text_:22 in 5376) [ClassicSimilarity], result of:
              0.029678967 = score(doc=5376,freq=2.0), product of:
                0.15341885 = queryWeight, product of:
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.043811057 = queryNorm
                0.19345059 = fieldWeight in 5376, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  3.5018296 = idf(docFreq=3622, maxDocs=44218)
                  0.0390625 = fieldNorm(doc=5376)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Date
    28. 8.2019 19:21:22
  4. Aral, S.: The hype machine : how social media disrupts our elections, our economy, and our health - and how we must adapt (2020) 0.00
    0.0027335489 = product of:
      0.008200646 = sum of:
        0.008200646 = product of:
          0.016401293 = sum of:
            0.016401293 = weight(_text_:of in 550) [ClassicSimilarity], result of:
              0.016401293 = score(doc=550,freq=24.0), product of:
                0.06850986 = queryWeight, product of:
                  1.5637573 = idf(docFreq=25162, maxDocs=44218)
                  0.043811057 = queryNorm
                0.23940048 = fieldWeight in 550, product of:
                  4.8989797 = tf(freq=24.0), with freq of:
                    24.0 = termFreq=24.0
                  1.5637573 = idf(docFreq=25162, maxDocs=44218)
                  0.03125 = fieldNorm(doc=550)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Abstract
    Social media connected the world--and gave rise to fake news and increasing polarization. Now a leading researcher at MIT draws on 20 years of research to show how these trends threaten our political, economic, and emotional health in this eye-opening exploration of the dark side of technological progress. Today we have the ability, unprecedented in human history, to amplify our interactions with each other through social media. It is paramount, MIT social media expert Sinan Aral says, that we recognize the outsized impact social media has on our culture, our democracy, and our lives in order to steer today's social technology toward good, while avoiding the ways it can pull us apart. Otherwise, we could fall victim to what Aral calls "The Hype Machine." As a senior researcher of the longest-running study of fake news ever conducted, Aral found that lies spread online farther and faster than the truth--a harrowing conclusion that was featured on the cover of Science magazine. Among the questions Aral explores following twenty years of field research: Did Russian interference change the 2016 election? And how is it affecting the vote in 2020? Why does fake news travel faster than the truth online? How do social ratings and automated sharing determine which products succeed and fail? How does social media affect our kids? First, Aral links alarming data and statistics to three accelerating social media shifts: hyper-socialization, personalized mass persuasion, and the tyranny of trends. Next, he grapples with the consequences of the Hype Machine for elections, businesses, dating, and health. Finally, he maps out strategies for navigating the Hype Machine, offering his singular guidance for managing social media to fulfill its promise going forward. Rarely has a book so directly wrestled with the secret forces that drive the news cycle every day.
    Content
    Contents: Pandemics, Promise, and Peril -- The New Social Age -- The End of Reality -- The Hype Machine -- Your Brain on Social Media -- A Network's Gravity is Proportional to Its Mass -- Personalized Mass Persuasion -- Hypersocialization -- Strategies for a Hypersocialized World -- The Attention Economy and the Tyranny of Trends -- The Wisdom and Madness of Crowds -- Social Media's Promise Is Also Its Peril -- Building a Better Hype Machine.
  5. O'Neil, C.: Angriff der Algorithmen : wie sie Wahlen manipulieren, Berufschancen zerstören und unsere Gesundheit gefährden (2017) 0.00
    0.0015439361 = product of:
      0.004631808 = sum of:
        0.004631808 = product of:
          0.009263616 = sum of:
            0.009263616 = weight(_text_:of in 4060) [ClassicSimilarity], result of:
              0.009263616 = score(doc=4060,freq=10.0), product of:
                0.06850986 = queryWeight, product of:
                  1.5637573 = idf(docFreq=25162, maxDocs=44218)
                  0.043811057 = queryNorm
                0.1352158 = fieldWeight in 4060, product of:
                  3.1622777 = tf(freq=10.0), with freq of:
                    10.0 = termFreq=10.0
                  1.5637573 = idf(docFreq=25162, maxDocs=44218)
                  0.02734375 = fieldNorm(doc=4060)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Abstract
    A former Wall Street quant sounds an alarm on the mathematical models that pervade modern life - and threaten to rip apart our social fabric. We live in the age of the algorithm. Increasingly, the decisions that affect our lives - where we go to school, whether we get a loan, how much we pay for insurance - are being made not by humans, but by mathematical models. In theory, this should lead to greater fairness: everyone is judged according to the same rules, and bias is eliminated. And yet, as Cathy O'Neil reveals in this urgent and necessary book, the opposite is true. The models being used today are opaque, unregulated, and incontestable, even when they're wrong. Most troubling, they reinforce discrimination. Tracing the arc of a person's life, O'Neil exposes the black box models that shape our future, both as individuals and as a society. These "weapons of math destruction" score teachers and students, sort CVs, grant or deny loans, evaluate workers, target voters, and monitor our health. O'Neil calls on modellers to take more responsibility for their algorithms and on policy makers to regulate their use. But in the end, it's up to us to become more savvy about the models that govern our lives. This important book empowers us to ask the tough questions, uncover the truth, and demand change.
    Content
    Comments: 'Fascinating and deeply disturbing' - Yuval Noah Harari, Guardian Books of the Year. 'A manual for the 21st-century citizen... accessible, refreshingly critical, relevant and urgent' - Federica Cocco, Financial Times.
    Footnote
    Original title: Weapons of math destruction: how Big Data increases inequality and threatens democracy. See also the review article: Krüger, J.: Wie der Mensch die Kontrolle über den Algorithmus behalten kann. [19.01.2018]. In: https://netzpolitik.org/2018/algorithmen-regulierung-im-kontext-aktueller-gesetzgebung/.
  6. Tegmark, M.: Leben 3.0 : Mensch sein im Zeitalter Künstlicher Intelligenz (2017) 0.00
    6.9046917E-4 = product of:
      0.0020714074 = sum of:
        0.0020714074 = product of:
          0.004142815 = sum of:
            0.004142815 = weight(_text_:of in 4558) [ClassicSimilarity], result of:
              0.004142815 = score(doc=4558,freq=2.0), product of:
                0.06850986 = queryWeight, product of:
                  1.5637573 = idf(docFreq=25162, maxDocs=44218)
                  0.043811057 = queryNorm
                0.060470343 = fieldWeight in 4558, product of:
                  1.4142135 = tf(freq=2.0), with freq of:
                    2.0 = termFreq=2.0
                  1.5637573 = idf(docFreq=25162, maxDocs=44218)
                  0.02734375 = fieldNorm(doc=4558)
          0.5 = coord(1/2)
      0.33333334 = coord(1/3)
    
    Abstract
    Artificial intelligence is our inevitable future. Will it plunge us into ruin, or will it contribute to the further development of Homo sapiens? The Massachusetts Institute of Technology, a veritable Nobel Prize factory, is the most important technological think tank in the USA. There, Professor Max Tegmark works with the world's leading developers of artificial intelligence, who grant him exclusive insights into their labs. The conclusions he draws from this are breathtaking and deeply disturbing at the same time. Is the era of humans coming to an end? Drawing on the latest research, physics professor Max Tegmark shows what awaits humanity. A selection of possible scenarios: - Conqueror: artificial intelligence seizes power and disposes of humanity with methods we do not even understand. - The enslaved god: humans take control of a superintelligent artificial intelligence and use it to produce advanced technologies. - Reversal: technological progress is radically suppressed and we return to a pre-technological society in the style of the Amish. - Self-destruction: superintelligence is never reached because humanity destroys itself first, through nuclear war or otherwise. - Egalitarian utopia: there is neither superintelligence nor property; humans and cybernetic organisms coexist peacefully. Max Tegmark offers intelligent, well-founded future scenarios based on his exclusive insights into current research on artificial intelligence.
