“Artificial intelligence has always inspired outlandish visions—that AI is going to destroy us, save us, or at the very least radically transform us. Erik Larson exposes the vast gap between the actual science underlying AI and the dramatic claims being made for it. This is a timely, important, and even essential book.” —John Horgan, author of The End of Science

Many futurists insist that AI will soon achieve human levels of intelligence. From there, it will quickly eclipse the most gifted human mind. The Myth of Artificial Intelligence argues that such claims are just that: myths. We are not on the path to developing truly intelligent machines. We don’t even know where that path might be.

Erik Larson charts a journey through the landscape of AI, from Alan Turing’s early work to today’s dominant models of machine learning. Since the beginning, AI researchers and enthusiasts have equated the reasoning approaches of AI with those of human intelligence. But this is a profound mistake. Even cutting-edge AI looks nothing like human intelligence. Modern AI is based on inductive reasoning: computers make statistical correlations to determine which answer is likely to be right, allowing software to, say, detect a particular face in an image. But human reasoning is entirely different. Humans do not correlate data sets; we make conjectures sensitive to context—the best guess, given our observations and what we already know about the world. We haven’t a clue how to program this kind of reasoning, known as abduction. Yet it is the heart of common sense.

Larson argues that all this AI hype is bad science and bad for science. A culture of invention thrives on exploring unknowns, not overselling existing methods. Inductive AI will continue to improve at narrow tasks, but if we are to make real progress, we must abandon futuristic talk and learn to better appreciate the only true intelligence we know—our own.