Superintelligence: Paths, Dangers, Strategies


Price:   $51.95






Full Product Details

Author:   Nick Bostrom (Professor in the Faculty of Philosophy & Oxford Martin School and Director, Future of Humanity Institute, University of Oxford)
Publisher:   Oxford University Press
Imprint:   Oxford University Press
Dimensions:   Width: 16.10cm , Height: 2.70cm , Length: 24.00cm
Weight:   0.679kg
ISBN-13:   9780199678112
ISBN-10:   0199678111
Pages:   352
Publication Date:   03 July 2014
Audience:   Professional and Scholarly, Professional & Vocational
Format:   Hardback
Publisher's Status:   Active
Availability:   To order
Stock availability from the supplier is unknown; we will order this item for you and ship it to you once we receive it.


Reviews

"I highly recommend this book." - Bill Gates
"Nick Bostrom makes a persuasive case that the future impact of AI is perhaps the most important issue the human race has ever faced. Instead of passively drifting, we need to steer a course. Superintelligence charts the submerged rocks of the future with unprecedented detail. It marks the beginning of a new era." - Stuart Russell, Professor of Computer Science, University of California, Berkeley
"Those disposed to dismiss an 'AI takeover' as science fiction may think again after reading this original and well-argued book." - Martin Rees, Past President, Royal Society
"This superb analysis by one of the world's clearest thinkers tackles one of humanity's greatest challenges: if future superhuman artificial intelligence becomes the biggest event in human history, then how can we ensure that it doesn't become the last?" - Max Tegmark, Professor of Physics, MIT
"Terribly important ... groundbreaking ... extraordinary sagacity and clarity, enabling him to combine his wide-ranging knowledge over an impressively broad spectrum of disciplines - engineering, natural sciences, medicine, social sciences and philosophy - into a comprehensible whole ... If this book gets the reception that it deserves, it may turn out the most important alarm bell since Rachel Carson's Silent Spring from 1962, or ever." - Olle Haggstrom, Professor of Mathematical Statistics
"Valuable. The implications of introducing a second intelligent species onto Earth are far-reaching enough to deserve hard thinking." - The Economist
"There is no doubting the force of [Bostrom's] arguments ... the problem is a research challenge worthy of the next generation's best mathematical talent. Human civilisation is at stake." - Clive Cookson, Financial Times
"His book Superintelligence: Paths, Dangers, Strategies became an improbable bestseller in 2014." - Alex Massie, The Times (Scotland)
"Worth reading ... We need to be super careful with AI. Potentially more dangerous than nukes." - Elon Musk, Founder of SpaceX and Tesla
"A damn hard read." - Sunday Telegraph
"Rewarding." - Sunday Telegraph
"I recommend Superintelligence by Nick Bostrom as an excellent book on this topic." - Jolyon Brown, Linux Format
"Every intelligent person should read it." - Nils Nilsson, Artificial Intelligence Pioneer, Stanford University
"Interesting from an economics and business perspective, but also more widely." - City A.M.
"A fascinating and rational analysis of a topic that is easily sensationalised, and thought-provoking reading." - Dominic Lenton, Engineering and Technology
"A magnificent conception ... it ought to be required reading on all philosophy undergraduate courses, by anyone attempting to build AIs ... and by physicists who think there is no point to philosophy." - Brian Clegg, Popular Science

Author Information

Nick Bostrom is Professor in the Faculty of Philosophy at Oxford University and founding Director of the Future of Humanity Institute and of the Programme on the Impacts of Future Technology within the Oxford Martin School. He is the author of some 200 publications, including Anthropic Bias (Routledge, 2002), Global Catastrophic Risks (ed., OUP, 2008), and Human Enhancement (ed., OUP, 2009). He previously taught at Yale, and he was a Postdoctoral Fellow of the British Academy. Bostrom has a background in physics, computational neuroscience, and mathematical logic as well as philosophy.
