
The technological singularity / Murray Shanahan.

By: Murray Shanahan
Material type: Text
Language: English
Series: The MIT Press essential knowledge series
Publisher: Cambridge, Massachusetts : The MIT Press, 2015
Description: xxiii, 244 pages ; 18 cm
Content type:
  • text
Media type:
  • unmediated
Carrier type:
  • volume
ISBN:
  • 9780262527804 (pbk. : alk. paper)
  • 0262527804 (pbk. : alk. paper)
Subject(s):
DDC classification:
  • 006.3
LOC classification:
  • Q 335 S528t 2015
Summary: The idea that human history is approaching a "singularity" -- that ordinary humans will someday be overtaken by artificially intelligent machines or cognitively enhanced biological intelligence, or both -- has moved from the realm of science fiction to serious debate. Some singularity theorists predict that if the field of artificial intelligence (AI) continues to develop at its current dizzying rate, the singularity could come about in the middle of the present century. Murray Shanahan offers an introduction to the idea of the singularity and considers the ramifications of such a potentially seismic event. Shanahan's aim is not to make predictions but rather to investigate a range of scenarios. Whether we believe that the singularity is near or far, likely or impossible, apocalypse or utopia, the very idea raises crucial philosophical and pragmatic questions, forcing us to think seriously about what we want as a species. Shanahan describes technological advances in AI, both biologically inspired and engineered from scratch. Once human-level AI -- theoretically possible, but difficult to accomplish -- has been achieved, he explains, the transition to superintelligent AI could be very rapid. Shanahan considers what the existence of superintelligent machines could mean for such matters as personhood, responsibility, rights, and identity. Some superhuman AI agents might be created to benefit humankind; some might go rogue. (Is Siri the template, or HAL?) The singularity presents both an existential threat to humanity and an existential opportunity for humanity to transcend its limitations. Shanahan makes it clear that we need to imagine both possibilities if we want to bring about the better outcome.
Holdings
Item type: Book
Current library: Biblioteca Juan Bosch
Home library: Biblioteca Juan Bosch
Collection: Humanities
Shelving location: Humanities (4th floor)
Call number: Q 335 S528t 2015
Copy number: 1
Status: Available
Barcode: 00000178342

Includes bibliographical references and index.

