Contemporary Authors

Project and content management for Contemporary Authors volumes

Fernbach, Philip

WORK TITLE: The Knowledge Illusion
WORK NOTES: with Steven Sloman
PSEUDONYM(S):
BIRTHDATE:
WEBSITE:
CITY: Boulder
STATE: CO
COUNTRY:
NATIONALITY:

http://www.colorado.edu/business/philip-fernbach * https://sites.google.com/site/philfernbachswebpage/ * https://0d201a26-a-62cb3a1a-s-sites.googlegroups.com/site/philfernbachswebpage/attachments/fernbach_cv.pdf

RESEARCHER NOTES:

PERSONAL EDUCATION:

Williams College, Williamstown, MA, B.A. (magna cum laude), 2001; Brown University, Providence, RI, Ph.D., 2010.

ADDRESS

  • Office - University of Colorado at Boulder, Leeds School of Business, 419 UCB, Koelbel Hall Room 489, Boulder, CO 80309-0419

CAREER

Writer, marketing expert, business consultant, and educator. Dove Consulting, Boston, MA, research analyst, 2002-03; Auctive Incorporated, Boston, MA, analyst, 2003-05; Brown University, Department of Cognitive, Linguistic and Psychological Sciences, postdoctoral research associate and lecturer, 2010-11; University of Colorado at Boulder, Leeds School of Business, Center for Research on Consumer Financial Decision-Making, research scholar, 2011-12; University of Colorado Boulder, Leeds School of Business, assistant professor of marketing, 2012—, Institute of Cognitive Science, associate member, 2013—. Presenter at academic conferences and meetings.

AVOCATIONS:

Hiking, music, guitar playing, ice hockey.

AWARDS:
Brown University Graduate Fellowship, 2005-06; Galner Dissertation Fellowship, Brown University, 2009-10; American Psychological Association Dissertation Research Award, 2009, for paper “Understanding Predictive and Diagnostic Reasoning;” postdoctoral research grant, Unilever Corporation, 2010-11; Sterling-Rice Research Grant, 2012, 2013; Marketing Science Institute award, 2012; Templeton Foundation and Thrive Center for Human Development research grant, 2013-15; Varieties of Understanding Project research grant (with Bart de Langhe and J. D. Trout), 2014-16.

WRITINGS

  • (With Steven Sloman) The Knowledge Illusion: Why We Never Think Alone, Riverhead Books (New York, NY), 2017

Contributor to books, including The Probabilistic Mind: Prospects for Rational Models of Cognition, edited by N. Chater and M. Oaksford, Oxford University Press (Oxford, England), 2008; Moral Judgment and Decision-Making: The Psychology of Learning and Motivation, vol. 50, edited by D. Bartels, C. W. Bauman, L. J. Skitka, and D. Medin, Elsevier (San Diego, CA), 2009; and The Oxford Handbook of Causal Reasoning, edited by M. Waldmann, Oxford University Press (Oxford, England), 2016.

Contributor to journals and periodicals, including Management Science, Psychological Science, Argument & Computation, Mind & Language, Journal of Experimental Psychology: Learning, Memory, & Cognition, Cognitive Development, Behavioral and Brain Sciences, Organizational Behavior and Human Decision Processes, Journal of Consumer Research, Harvard Business Review, New York Times, and Journal of Marketing Behavior.

SIDELIGHTS

Educator and marketing expert Philip Fernbach started his business career as an analyst and consultant with two strategy consulting firms in Boston. He later became an assistant professor of marketing at the Leeds School of Business at the University of Colorado Boulder. Fernbach’s research focuses on marketing and consumer behavior, particularly causal reasoning, financial decision making, moral judgment, and probability judgment. In simplest terms, “I study how people think and make decisions,” stated Fernbach on his website. Fernbach holds a B.A. in philosophy from Williams College and a Ph.D. in cognitive science from Brown University.

In The Knowledge Illusion: Why We Never Think Alone, Fernbach and coauthor Steven Sloman offer an in-depth exploration of the human tendency to assume one knows more than he or she actually does, and of how that assumption affects both individuals and society at large.

People use many devices every day—some mechanical, some electronic; some simple, some complex. More often than not, people believe they know how the devices they use work. However, Fernbach and Sloman suggest that in most cases our confidence in what we know far exceeds what we actually know. The authors present the example of people asked to explain the workings of a common clothing zipper. Before tackling the task, participants believed they knew how a zipper works; after all, they had used zippers almost every day and expected to explain them with little difficulty. Once they undertook the explanation, however, participants soon realized they could not fully account for how zippers function. Almost all of us operate under this “knowledge illusion” except in the few areas where we have specialized knowledge; attempts to explain anything outside that realm are likely to meet with failure.

Fernbach and Sloman explain this in terms of evolutionary psychology. “The illusion exists, [the authors] argue, because humans evolved as part of a hive mind, and are so intuitively adept at co-operation that the lines between minds become blurred,” explained a reviewer in the Economist. At base, most of us don’t need to know how a zipper works. We don’t need to know how a computer operates, or how an automobile runs, or how a nuclear power plant produces electricity. Someone else knows, and we can rely on what that person knows whenever our own knowledge is lacking. Most things that we use every day are designed, built, and operated by more than one person in collaboration and cooperation with many others. Each person knows some part of the process in detail, and when all of those individual parts are put together, a computer or a car or a nuclear reactor is the result. The authors thus shed light on “the paradox that while individual humans may be fairly ignorant of how the world works, we’re collectively capable of brilliance,” summarized Psychology Today contributor Gary Drevitch.

Fernbach and Sloman also caution that the knowledge illusion can be harmful. For example, some people hold very strong opinions on political or social issues, yet those opinions may be misinformed, based on no real knowledge of the issues at hand or solely on the views of a particular social or political group. This kind of thinking leads to poor policy decisions, support for political and legislative initiatives whose implications are not really understood, and the possibility of detrimental actions being taken. Fernbach and Sloman provide numerous suggestions for “minimizing the damage” that policies based on erroneous or misinformed opinions can cause, noted Mary Ann Hughes in a Library Journal review.

“In the context of partisan bubbles and fake news, the authors bring a necessary shot of humility,” observed the Economist reviewer. In an “increasingly polarized culture” where strong opinions and certainty can be based on little actual knowledge, a “book advocating intellectual humility and recognition of the limits of understanding feels both revolutionary and necessary,” commented a Publishers Weekly writer.

BIOCRIT

PERIODICALS

  • Economist, April 8, 2017, “Mind Meld; Cognitive Science,” review of The Knowledge Illusion: Why We Never Think Alone, p. 75.

  • Kirkus Reviews, January 15, 2017, review of The Knowledge Illusion.

  • Library Journal, March 1, 2017, Mary Ann Hughes, review of The Knowledge Illusion, p. 94.

  • New York Times Book Review, April 18, 2017, Yuval Harari, “People Have Limited Knowledge. What’s the Remedy? Nobody Knows,” review of The Knowledge Illusion.

  • Psychology Today, March-April, 2017, Gary Drevitch, “Massively Intelligent,” review of The Knowledge Illusion, p. 44.

  • Publishers Weekly, November 14, 2016, review of The Knowledge Illusion, p. 41.

ONLINE

  • Phil Fernbach Web Page, http://sites.google.com/site/philfernbachswebpage/ (September 14, 2017).

  • University of Colorado Boulder Leeds School of Business Website, http://www.colorado.edu/business/ (September 14, 2017), faculty profile.

  • The Knowledge Illusion: Why We Never Think Alone, Riverhead Books (New York, NY), 2017
1. The knowledge illusion : why we never think alone
   LCCN: 2016036297
   Type of material: Book
   Personal name: Sloman, Steven A., author.
   Main title: The knowledge illusion : why we never think alone / Steven Sloman and Philip Fernbach.
   Published/Produced: New York : Riverhead Books, 2017.
   Description: 296 pages ; 24 cm
   ISBN: 9780399184352
   Call number: B105.T54 S56 2017
  • http://www.colorado.edu/business/philip-fernbach

    Philip Fernbach

    Assistant Professor
    Center for Research on Consumer Financial Decision Making • Marketing
    philip.fernbach@colorado.edu
    303.492.1311
    website
    Curriculum Vitae
    The Knowledge Illusion By Steven Sloman and Philip Fernbach
    Koelbel 489
    Biography
    Phil Fernbach is an assistant professor of marketing in the Leeds School of Business. He holds a Ph.D. from Brown University in cognitive science and a B.A. from Williams College where he studied philosophy. His research interests span many areas of consumer behavior including causal reasoning, probability judgment, financial decision-making, and moral judgment. His research has been published in outlets such as the Journal of Experimental Psychology: General, the Journal of Consumer Research, Management Science, and Psychological Science, and has been profiled in media outlets like The New York Times, the Wall Street Journal and Harvard Business Review. Prior to pursuing his Ph.D., he worked with consumer goods companies as a strategy consultant for two boutique firms in Boston.
    Dr. Fernbach's book, "The Knowledge Illusion: Why We Never Think Alone," will be published by Riverhead Books in March, 2017. Click the image below for more details.

    CV: https://0d201a26-a-62cb3a1a-s-sites.googlegroups.com/site/philfernbachswebpage/attachments/fernbach_cv.pdf?attachauth=ANoY7cqbW-60-TrHu9_wIuP2awCQO_o-3nreRgK-jpYcPFY-SSfYQtU8TTbptG4koAYsOsCecNO37jNy2b1d7lVf_ZnQoSeIUak6p8ysLpT1JhxVmwa1DS_jhFDHOK4mChnZe-sNHjuZIiT1toEixwi2NWx0kWEOtB_WFxEqpv9bf003Gg1oIGolKZIiFIB2eA1VPXworei1ezYBXMfASrpWVorrK1KZm9_Ow2NVaOczzwZv_M3bQZ9bz0lHlYswaACrohjvw7cv&attredirects=0

  • Philip Fernbach Website - https://sites.google.com/site/philfernbachswebpage/home

    Biography

    I am an assistant professor of marketing in the Leeds School of Business at the University of Colorado, Boulder. I study how people think and make decisions. Much of my work is inspired by causal model theory, the idea that people's judgments and decisions are based on knowledge of the world, knowledge that is represented in terms of causal structure. I received my Ph.D. in cognitive science from the Department of Cognitive, Linguistic and Psychological Sciences at Brown University in 2010. Prior to pursuing my Ph.D. I worked with consumer goods companies as a strategy consultant for two boutique firms in Boston. Before that, I did my undergraduate studies at Williams College in the mountains of Western Massachusetts where I studied philosophy. When I'm not busy working on research I spend most of my time playing with my son and daughter, taking scenic walks with my wife, flatpicking my Martin HD28 at local bluegrass jams and playing ice hockey.

    Steve Sloman and I have a book coming out in March, 2017, published by Riverhead Books. Click the image below for more details. 

        Phil Fernbach
        Assistant Professor of Marketing
        Leeds School of Business
        University of Colorado, Boulder

Sloman, Steven & Philip Fernbach. The Knowledge Illusion: Why We Never Think Alone

Mary Ann Hughes
142.4 (Mar. 1, 2017): p94.
Copyright: COPYRIGHT 2017 Library Journals, LLC. A wholly owned subsidiary of Media Source, Inc. No redistribution permitted.
http://www.libraryjournal.com/
Sloman, Steven & Philip Fernbach. The Knowledge Illusion: Why We Never Think Alone. Riverhead. Mar. 2017. 304p. illus. notes. index. ISBN 9780399184352. $28; ebk. ISBN 9780399184345. PSYCH
We wander around in a fog of unknowing, argue Sloman (cognitive, linguistic, & psychological sciences, Brown Univ.; editor in chief, Cognition) and Fernbach (marketing, Leeds Sch. of Business, Univ. of Colorado). We depend on a web of experts and the technology they've created to keep our world going. Even Paleolithic societies had specialists--shamans, flint knappers, etc. The downside is that we tend to think that we know more than we do. Most people, for example, say that they understand how a toilet works or why a certain social policy should be enacted. But when asked to describe plumbing or explain why they advocate a policy, they are unable to do so. This is called the "Illusion of Explanatory Depth." It can lead to flooded bathrooms and wars. Sloman and Fernbach offer suggestions for minimizing the damage that this can cause, but, interestingly enough, this book illustrates the problem of specialization. The authors apparently aren't aware of some of the classic work done on values change by social psychologists. VERDICT General readers who like the work of Malcolm Gladwell will enjoy this book.--Mary Ann Hughes, Shelton, WA
Source Citation   (MLA 8th Edition)
Hughes, Mary Ann. "Sloman, Steven & Philip Fernbach. The Knowledge Illusion: Why We Never Think Alone." Library Journal, 1 Mar. 2017, p. 94+. General OneFile, go.galegroup.com/ps/i.do?p=ITOF&sw=w&u=schlager&v=2.1&id=GALE%7CA483702177&it=r&asid=da18f5304b6da8ba93a61bfbb02f902b. Accessed 11 Aug. 2017.

Gale Document Number: GALE|A483702177

Mind meld; Cognitive science

423.9035 (Apr. 8, 2017): p75(US).
Copyright: COPYRIGHT 2017 Economist Intelligence Unit N.A. Incorporated
http://store.eiu.com/
The Knowledge Illusion: Why We Never Think Alone. By Steven Sloman and Philip Fernbach. Riverhead; 296 pages. Macmillan
DO YOU know how a toilet works? What about a bicycle, or a zipper? Most people can provide half answers at best. They struggle to explain basic inventions, let alone more complex and abstract ones. Yet somehow, in spite of people's ignorance, they created and navigate the modern world. A new book, "The Knowledge Illusion" sets out to tackle this apparent paradox: how can human thinking be so powerful, yet so shallow?
Steven Sloman and Philip Fernbach, two cognitive scientists, draw on evolutionary theory and psychology. They argue that the mind has evolved to do the bare minimum that improves the fitness of its host. Because humans are a social species and evolved in the context of collaboration, wherever possible, abilities have been outsourced. As a result, people are individually rather limited thinkers and store little information in their own heads. Much knowledge is instead spread through the community--whose members do not often realise that this is the case.
The authors call this the illusion of understanding, and they demonstrate it with a simple experiment. Subjects are asked to rate their understanding of something, then to write a detailed account of it, and finally to rate their understanding again. The self-assessments almost invariably drop. The authors see this effect everywhere, from toilets and bicycles to complex policy issues. The illusion exists, they argue, because humans evolved as part of a hive mind, and are so intuitively adept at co-operation that the lines between minds become blurred. Economists and psychologists talk about the "curse of knowledge": people who know something have a hard time imagining someone else who does not. The illusion of knowledge works the other way round: people think they know something because others know it.
The hive mind, with its seamless interdependence and expertise-sharing, once helped humans hunt mammoths and now sends them into space. But in politics it causes problems. Using a toilet without understanding it is harmless, but changing the health-care system without understanding it is not. Yet people often have strong opinions about issues they understand little about. And on social media, surrounded by like-minded friends and followers, opinions are reinforced and become more extreme. It is hard to reason with someone under the illusion that their beliefs are thought through, and simply presenting facts is unlikely to change beliefs when those beliefs are rooted in the values and groupthink of a community.
The authors tentatively suggest that making people confront the illusion of understanding will temper their opinions, but this could have the opposite effect--people respond badly to feeling foolish. Messrs Sloman and Fernbach show how deep the problem runs, but are short on ideas to fix it.
"The Knowledge Illusion" is at once both obvious and profound: the limitations of the mind are no surprise, but the problem is that people so rarely think about them. However, while the illusion certainly exists, its significance is overstated. The authors are Ptolemaic in their efforts to make it central to human psychology, when really the answer to their first question--how can human thought be so powerful, yet so shallow?--is the hive mind. Human ignorance is more fundamental and more consequential than the illusion of understanding. But still, the book profits from its timing. In the context of partisan bubbles and fake news, the authors bring a necessary shot of humility: be sceptical of your own knowledge, and the wisdom of your crowd.
The Knowledge Illusion: Why We Never Think Alone.
By Steven Sloman and Philip Fernbach.
Source Citation   (MLA 8th Edition)
"Mind meld; Cognitive science." The Economist, 8 Apr. 2017, p. 75(US). General OneFile, go.galegroup.com/ps/i.do?p=ITOF&sw=w&u=schlager&v=2.1&id=GALE%7CA493932701&it=r&asid=2fac046563d9ec341c617d1fc45fa46c. Accessed 11 Aug. 2017.

Gale Document Number: GALE|A493932701

Massively intelligent

Gary Drevitch
50.2 (March-April 2017): p44.
Copyright: COPYRIGHT 2017 Sussex Publishers, Inc.
https://www.psychologytoday.com/
THE KNOWLEDGE ILLUSION Why We Never Think Alone STEVEN SLOMAN and PHILIP FERNBACH
Hardly anyone can accurately explain how a zipper works. And yet we've cracked the atom and explored deep space. A new book explains how we manage it.
IN A CLASSIC Monty Python sketch, panelists on the talk show "How to Do It" promise to teach viewers how to split an atom, construct a box girder bridge, and irrigate the Sahara "to make vast new areas of land cultivatable." But first, how to play the flute: John Cleese picks up the instrument, points to it, and declares, "You blow there, and you move your fingers up and down here."
As Steven Sloman and Philip Fernbach relate with less silliness in The Knowledge Illusion, most of us have no deeper understanding of how everyday devices like a toilet, a zipper, or a coffeemaker actually work, to say nothing of watches, cell phones, or space probes. And yet until we're actually asked, by researchers like the authors, to describe how a toilet functions--and they helpfully provide an elegant description--most of us assume that we do understand. This gap between what we know and what we think we know is known as the illusion of explanatory depth.
"Our point is not that people are ignorant," Sloman and Fembach write. "It's that people are more ignorant than they think they are." In fact, we know just enough to get by.

Sloman, a psychologist at Brown and the editor of the journal Cognition, and Fernbach, a cognitive scientist at the University of Colorado, explore the paradox that while individual humans may be fairly ignorant of how the world works, we're collectively capable of brilliance. Their breezy guide to the mechanisms of human intelligence breaks us down, then builds us up, only to break us down again: We're ignorant, but that's okay, because accessible, actionable knowledge is everywhere--but our easy access to information makes us recklessly overconfident.
A GLOBAL BRAIN BANK
The human mind is an outstanding problem solver but a less impressive storage device. We can hold, according to some estimates, about 1 gigabyte of memory, maybe as much as 10. But our minds are not computers. They rely not entirely on memory and deliberation, as a machine must, but on pattern recognition and insight. Besides, there's just too much to know. Most of our knowledge instead resides outside of our heads--in our bodies, in the environment, and most crucially, in other people.
That's not a weakness, it's a strength. "You're seeing only a tiny bit of the world at a time, but you know the rest is there," the authors write. Our knowledge of how the world normally functions allows us to learn all we need to know about an environment by just looking around and allows us to draw correct conclusions about the space we're in without seeing it all at once. In other words, the world is part of your memory. Did you already forget the punch line to the Monty Python sketch? That's fine: You know it's right there at the top of the page, easy to access.
We have succeeded as a species because of how well communities of brains work together. Every significant project ever attempted--pyramid, cathedral, or skyscraper--would be inconceivable if it had to rely on a single mind. Evolutionarily speaking, this is the social brain hypothesis. As social groups grew in size and complexity, we developed new cognitive capabilities to support our communities through specialization, distributed expertise, and division of cognitive labor. In short, we share intentionality, a trait easily observed in the group play of children and the global collaboration of scientific research.
What's true of a particle accelerator, a massively complex product of distributed expertise, also holds for our smallest groups: Spouses are more prone to forget details in areas of their partner's clear expertise--the right wine or how to file taxes--and vice versa. We can focus our memory on what we have to, or want to, secure that our partner has his or her territory equally well covered.
If we can't make use of other people's knowledge, we can't succeed. We can barely function. But if we can't recognize that most of our knowledge lives outside our brains, we face a different problem, and at least as much as it is about cognition, The Knowledge Illusion is about hubris.
To varying extents, we all live under the illusion of explanatory depth. "We fail to draw an accurate line between what is inside and outside our heads," the authors write. "So we frequently don't know what we don't know." The illusion is in some ways a byproduct of maturity. As preschoolers, we take an object in our hands and ask why and how it works until we're satisfied, or until our parents throw up their hands or distract us. Eventually, we stop asking. We come to tolerate our inability to understand our complex tools by deciding to stop recognizing it. And then we go further. "We think the knowledge we have about how things work sits inside our skulls when in fact we're drawing a lot of it from the environment and from other people."
Ignorance, the authors remind us, is our natural state and nothing to be ashamed of, as long as it's tempered with humility. When it is not, however, we can fall prey to the Dunning-Kruger effect, the phenomenon in which those who perform the worst on a task overrate their skills the most. As has been shown in studies of doctors, workers, students, and perhaps most notably, drivers, those with the highest level of skill tend to underrate their abilities--they recognize how much they don't know and how much better they could perform. Those who perform worst, however, tend to lack a sense of what skills they're missing, and instead remain willfully ignorant of their potential. "When the only way to evaluate how much you know is through your own knowledge, you'll never get an honest assessment."

The Dunning-Kruger effect is a particular risk in the political arena. Leaders "have the responsibility to learn about their own ignorance and effectively take advantage of others' knowledge and skills," the authors suggest, and voters have an obligation as well: "A mature electorate is one that makes the effort to appreciate a leader who recognizes that the world is complex and hard to understand."
HOW FACTS CAN DEFEAT BELIEFS
It's not only knowledge that is spread across communities, it's beliefs and values as well--and they're not always fact-based. The flip side of collective intelligence is groupthink, when members of a group provide each other with support for a shared belief that may have no factual basis. The illusion of explanatory depth helps to explain how we can passionately hold strong opinions with little factual support. But discarding an incorrect belief agreed upon by one's community--say, about climate change--is a daunting cognitive challenge.
Research into the illusion offers a way out--not through debate but through humility-evoking doubt. Studies have found that the best way to shift others' opinions is to employ the same strategy that helps them understand that they don't really know how a toilet works: Ask them to explain it. When people are prompted to do so, they are forced to acknowledge their lack of knowledge. The illusion is broken, and after such encounters, people report less attachment to extreme opinions. Similarly, when climate-change skeptics are shown videos or articles about the science behind the process--the mechanisms, not the blame--their cognitive wall begins to crack. In the end, the authors report, "no one wants to be wrong."
Source Citation   (MLA 8th Edition)
Drevitch, Gary. "Massively intelligent." Psychology Today, Mar.-Apr. 2017, p. 44+. General OneFile, go.galegroup.com/ps/i.do?p=ITOF&sw=w&u=schlager&v=2.1&id=GALE%7CA483929623&it=r&asid=978da046c4d098e37ed0e9af8410f064. Accessed 11 Aug. 2017.

Gale Document Number: GALE|A483929623

Sloman, Steven: THE KNOWLEDGE ILLUSION

(Jan. 15, 2017):
Copyright: COPYRIGHT 2017 Kirkus Media LLC
http://www.kirkusreviews.com/
Sloman, Steven THE KNOWLEDGE ILLUSION Riverhead (Adult Nonfiction) $28.00 3, 14 ISBN: 978-0-399-18435-2
A tour of the many honeycombs of the hive mind, courtesy of cognitive scientists Sloman (Brown Univ.) and Fernbach (Univ. of Colorado). You know more than I do, and you know next to nothing yourself. That's not just a Socratic proposition, but also a finding of recent generations of neuroscientific researchers, who, as Cognition editor Sloman notes, are given to addressing a large question: "How is thinking possible?" One answer is that much of our thinking relies on the thinking of others--and, increasingly, on machine others. As the authors note, flying a plane is a collaboration among pilots, designers, engineers, flight controllers, and automated systems, the collective mastery or even understanding of all of which is beyond the capacity of all but a very few humans. One thought experiment the authors propose is to produce from your mind everything you can say about how zippers work, a sobering exercise that quickly reveals the superficiality of much of what we carry inside our heads. We think we know, and then we don't. Therein lies a small key to wisdom, and this leads to a larger purpose, which is that traditional assessments of intelligence and performance are off-point: what matters is what the individual mind contributes to the collectivity. If that sounds vaguely collectivist, so be it. All the same, the authors maintain, "intelligence is no longer a person's ability to reason and solve problems; it's how much the person contributes to a group's reasoning and problem-solving process." This contribution, they add, may not just lie in creativity, but also in doing the grunt work necessary to move a project along. After all, even with better, more effectively distributed thinking, "ignorance is inevitable." Some of the book seems self-evident, some seems to be mere padding, and little of it moves with the sparkling aha intelligence of Daniel Dennett. Still, it's sturdy enough, with interesting insights, especially for team building.
Source Citation   (MLA 8th Edition)
"Sloman, Steven: THE KNOWLEDGE ILLUSION." Kirkus Reviews, 15 Jan. 2017. General OneFile, go.galegroup.com/ps/i.do?p=ITOF&sw=w&u=schlager&v=2.1&id=GALE%7CA477242249&it=r&asid=ce1a6e2c144950259e54207a2e912cf6. Accessed 11 Aug. 2017.

Gale Document Number: GALE|A477242249

The Knowledge Illusion: Why We Never Think Alone

263.46 (Nov. 14, 2016): p41.
Copyright: COPYRIGHT 2016 PWxyz, LLC
http://www.publishersweekly.com/
The Knowledge Illusion: Why We Never Think Alone
Steven Sloman and Philip Fernbach. Riverhead, $28 (304p) ISBN 978-0-399-18435-2
Sloman, a professor of cognitive, linguistic, and psychological sciences, and Fernbach, a cognitive scientist and professor of marketing, attempt nothing less than a takedown of widely held beliefs about intelligence and knowledge, namely the role of an individual's brain as the main center for knowledge. Using a mixture of stories and science from an array of disciplines, the authors present a compelling and entertaining examination of the gap between knowledge one thinks one has and the amount of knowledge actually held in the brain, seeking to "explain how human thinking can be so shallow and so powerful at the same time." The book starts with revelatory scholarly insights into the relationship between knowledge and the brain, finding that humans "are largely unaware of how little we understand." Sloman and Fernbach then take the reader through numerous real-life applications of their findings, such as the implications for non-experts' understanding of science, politics, and personal finances. In an increasingly polarized culture where certainty reigns supreme, a book advocating intellectual humility and recognition of the limits of understanding feels both revolutionary and necessary. The fact that it's a fun and engaging page-turner is a bonus benefit for the reader. Agent: Christy Fletcher, Fletcher and Co. (Mar.)
Source Citation   (MLA 8th Edition)
"The Knowledge Illusion: Why We Never Think Alone." Publishers Weekly, 14 Nov. 2016, p. 41. General OneFile, go.galegroup.com/ps/i.do?p=ITOF&sw=w&u=schlager&v=2.1&id=GALE%7CA473459010&it=r&asid=d3ca437f8d95bd15ce1444aa3bde8a2a. Accessed 11 Aug. 2017.

Gale Document Number: GALE|A473459010

  • New York Times Book Review
    https://www.nytimes.com/2017/04/18/books/review/knowledge-illusion-steven-sloman-philip-fernbach.html

    Word count: 1345

    People Have Limited Knowledge. What’s the Remedy? Nobody Knows
    By YUVAL HARARI, APRIL 18, 2017
    THE KNOWLEDGE ILLUSION
    Why We Never Think Alone
    By Steven Sloman and Philip Fernbach
    Illustrated. 296 pp. Riverhead Books. $28.
    In “The Knowledge Illusion,” the cognitive scientists Steven Sloman and Philip Fernbach hammer another nail into the coffin of the rational individual. From the 17th century to the 20th century, Western thought depicted individual human beings as independent rational agents, and consequently made these mythical creatures the basis of modern society. Democracy is founded on the idea that the voter knows best, free market capitalism believes the customer is always right, and modern education tries to teach students to think for themselves.
    Over the last few decades, the ideal of the rational individual has been attacked from all sides. Postcolonial and feminist thinkers challenged it as a chauvinistic Western fantasy, glorifying the autonomy and power of white men. Behavioral economists and evolutionary psychologists have demonstrated that most human decisions are based on emotional reactions and heuristic shortcuts rather than rational analysis, and that while our emotions and heuristics were perhaps suitable for dealing with the African savanna in the Stone Age, they are woefully inadequate for dealing with the urban jungle of the silicon age.
    Sloman and Fernbach take this argument further, positing that not just rationality but the very idea of individual thinking is a myth. Humans rarely think for themselves. Rather, we think in groups. Just as it takes a tribe to raise a child, it also takes a tribe to invent a tool, solve a conflict or cure a disease. No individual knows everything it takes to build a cathedral, an atom bomb or an aircraft. What gave Homo sapiens an edge over all other animals and turned us into the masters of the planet was not our individual rationality, but our unparalleled ability to think together in large groups.
    As Sloman and Fernbach demonstrate in some of the most interesting and unsettling parts of the book, individual humans know embarrassingly little about the world, and as history progressed, they came to know less and less. A hunter-gatherer in the Stone Age knew how to produce her own clothes, how to start a fire from scratch, how to hunt rabbits and how to escape lions. We today think we know far more, but as individuals we actually know far less. We rely on the expertise of others for almost all our needs. In one humbling experiment, people were asked to evaluate how well they understood how a zipper works. Most people confidently replied that they understood it very well — after all, they use zippers all the time. They were then asked to explain how a zipper works, describing in as much detail as possible all the steps involved in the zipper’s operation. Most had no idea. This is the knowledge illusion. We think we know a lot, even though individually we know very little, because we treat knowledge in the minds of others as if it were our own.
    This is not necessarily bad, though. Our reliance on groupthink has made us masters of the world, and the knowledge illusion enables us to go through life without being caught in an impossible effort to understand everything ourselves. From an evolutionary perspective, trusting in the knowledge of others has worked extremely well for humans.
    Yet like many other human traits that made sense in past ages but cause trouble in the modern age, the knowledge illusion has its downside. The world is becoming ever more complex, and people fail to realize just how ignorant they are of what’s going on. Consequently some who know next to nothing about meteorology or biology nevertheless conduct fierce debates about climate change and genetically modified crops, while others hold extremely strong views about what should be done in Iraq or Ukraine without being able to locate them on a map. People rarely appreciate their ignorance, because they lock themselves inside an echo chamber of like-minded friends and self-confirming newsfeeds, where their beliefs are constantly reinforced and seldom challenged.
    According to Sloman (a professor at Brown and editor of the journal Cognition) and Fernbach (a professor at the University of Colorado’s Leeds School of Business), providing people with more and better information is unlikely to improve matters. Scientists hope to dispel antiscience prejudices by better science education, and pundits hope to sway public opinion on issues like Obamacare or global warming by presenting the public with accurate facts and expert reports. Such hopes are grounded in a misunderstanding of how humans actually think. Most of our views are shaped by communal groupthink rather than individual rationality, and we cling to these views because of group loyalty. Bombarding people with facts and exposing their individual ignorance is likely to backfire. Most people don’t like too many facts, and they certainly don’t like to feel stupid. If you think that you can convince Donald Trump of the truth of global warming by presenting him with the relevant facts — think again.
    Indeed, scientists who believe that facts can change public opinion may themselves be the victims of scientific groupthink. The scientific community believes in the efficacy of facts, hence those loyal to that community continue to believe they can win public debates by marshaling the right facts, despite much empirical evidence to the contrary. Similarly, the traditional belief in individual rationality may itself be the product of groupthink rather than of empirical evidence. In one of the climactic moments of Monty Python’s “Life of Brian,” a huge crowd of starry-eyed followers mistakes Brian for the Messiah. Caught in a corner, Brian tells his disciples: “You don’t need to follow me, you don’t need to follow anybody! You’ve got to think for yourselves! You’re all individuals!” The enthusiastic crowd then chants in unison: “Yes! We’re all individuals!” Monty Python was parodying the counterculture orthodoxy of the 1960s, but the point may be true of the belief in rational individualism in other ages too.
    In the coming decades, the world will probably become far more complex than it is today. Individual humans will consequently know even less about the technological gadgets, the economic currents and the political dynamics that shape the world. How could we then vest authority in voters and customers who are so ignorant and susceptible to manipulation? If Sloman and Fernbach are correct, providing future voters and customers with more and better facts would hardly solve the problem. So what’s the alternative? Sloman and Fernbach don’t have a solution. They suggest a few remedies like offering people simple rules of thumb (“Save 15 percent of your income,” say), educating people on a just-in-time basis (teaching them how to handle unemployment immediately when they are laid off) and encouraging people to be more realistic about their ignorance. This will hardly be enough, of course. True to their own advice, Sloman and Fernbach are well aware of the limits of their own understanding, and they know they don’t know the answer. In all likelihood, nobody knows.
    Yuval Harari is the author of “Sapiens” and “Homo Deus.”