Cognitive science is the study of human intelligence, the search for the magic ingredients that allow people to perceive, think, and act. Steven Sloman focuses on the role of community and society in creating and reinforcing knowledge, or the illusion of knowledge. Sloman and Fernbach describe this effect, which they call the “illusion of explanatory depth”: people believe they know far more than they actually do. It is best exemplified by how little we understand everyday devices such as toilets, zippers, and cylinder locks.
This book has three central themes: ignorance, the illusion of understanding, and the community of knowledge. We have no illusion that the lessons we can draw from our discussion are simple. Those lessons are decidedly not to reduce ignorance, live happily within your community, and dispel all illusions. On the contrary, ignorance is inevitable, happiness is often in the eye of the beholder, and illusions have their place. The point of this book is not that people are ignorant. It's that people are more ignorant than they think they are. We all suffer, to a greater or lesser extent, from an illusion of understanding, an illusion that we understand how things work when in fact our understanding is meagre.
We live in a complex, sophisticated world. Our lives are facilitated by a community of people with expertise in specific domains. As individuals, we can only scratch the surface of the world’s true complexity, so we rely heavily on others. This reliance on a complex, sophisticated system has left us largely ignorant of how little we ourselves know and understand.
People generally have a habit of overestimating their understanding of how things work. The illusion of explanatory depth arises from our dependence on others combined with this overestimation of our own understanding.
Before trying to explain something, people feel they have a reasonable level of understanding; after explaining, they don’t. Storing details is often unnecessary to act effectively: a broad picture is generally all we need. In fact, holding on to too much detail can be counterproductive.
It’s remarkable how easy it is to disabuse people of their illusion; you merely have to ask them for an explanation... We have also found that people experience the illusion not only with everyday objects but with just about everything: people overestimate their understanding of political issues like tax policy and foreign relations, of hot-button scientific topics like GMOs and climate change, and even of their own finances.
Most of the time we rely on abstract knowledge that is vague and unanalysed. We’ve all seen the exceptions: people who cherish detail and love to talk about it at great length, sometimes in fascinating ways. And we all have domains in which we are experts.
Donald Rumsfeld was the U.S. Secretary of Defense under both Presidents Gerald Ford and George W. Bush. One of his claims to fame was distinguishing different kinds of not knowing:
There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don’t know. But there are also unknown unknowns. There are things we don’t know we don’t know.
Known unknowns can be handled. It might be hard, but at least it is clear what to prepare for.
So people know less than everything (surprise, surprise). In fact, we know a lot less. We know just enough to get by. Because our knowledge is limited, our understanding of how things change is correspondingly limited.
The Two Causal Reasoners Inside Us:
Intuition and deliberation are two different approaches to a given task, challenge, or issue. Intuitions are personal; they reside in our heads. Deliberation involves conscious reflection.
This distinction between two different kinds of thought can be found throughout classical and modern philosophy, psychology, and cognitive science. Daniel Kahneman celebrated the distinction in his book Thinking, Fast and Slow. This is a distinction thousands of years old; it goes by a variety of names in cognitive science. For example, the two systems of reasoning have been referred to as associative versus rule-based thinking or simply as System 1 versus System 2. We’ll refer to it as the distinction between intuition and deliberation. Intuition leads to one conclusion, but deliberation makes us hesitate.
The knowledge illusion occurs because we live in a community of knowledge and we fail to distinguish the knowledge that is in our heads from the knowledge outside of it. We think the knowledge we have about how things work sits inside our skulls when in fact we’re drawing a lot of it from the environment and from other people.
The knowledge illusion is the flip side of what economists call the curse of knowledge. When we know about something, we find it hard to imagine that someone else doesn’t know it. If we tap out a tune, we’re sometimes shocked that others don’t recognise it. It seems so obvious; after all, we can hear it in our heads. Because we live inside a hive mind, relying heavily on others and the environment to store our knowledge, most of what is in our heads is quite superficial. We can get away with that superficiality most of the time because other people don’t expect us to know more; after all, their knowledge is superficial too. We get by because a division of cognitive labour exists that divides responsibility for different aspects of knowledge across a community.