Expert trap – What is it? (Part 1 of 3)
How hindsight, hierarchy, and confirmation biases break knowledge and make it hard to access
🚨I will write an update to this essay based on the feedback from this Manifold Market 🚨
This essay has three parts. In part one, I include notes on epistemic status and give a summary of the topic, but mainly I describe what the expert trap is.
Part two is about context. In “Why is expert trap happening?” I dive deeper into the biases and dynamics behind it. Then in “Expert trap in the wild” I try to point out where it appears in reality.
Part three is about “Ways out”. I list my main ideas of how to counteract the expert trap. I end with conclusions and with a short Q&A.
How to read it? This part may contain a lot of knowledge you are already familiar with. All chapters make sense on their own, so feel free to treat this article like a Q&A page and skip parts of it.
Intro
Version: 0.3. This is still a very early version of this article. Yes, this thing has versions and will improve with time.
Feedback: I would love to understand what I am missing or learn about any counter-arguments. Feel free to write a comment, dm me, or share anonymous feedback here: sysiak.com/feedback.
Writing style: Writing does not come naturally to me, and English is not my native language. But I care about being simple and precise. Read more about my writing approach and values here: sysiak.com/about
Epistemic status: I think it’s a great practice to begin with an epistemic status: state your certainty and effort, link to sources, and point to the main counterarguments. In this case, however, feel free to skip it and come back to it at the end. I am making quite a large claim here, and I am writing with quite a lot of uncertainty, so it may be interesting to first evaluate the claims on your own. At the end of the series, I will remind you to come back to the epistemic status, and I will post a short Q&A that will hopefully clarify some gaps.
Epistemic status (optional)
Certainty: If I were to pick one confidence level for the main claim, it would be Likely (70%). There are also parts of the described dynamics that I think are Highly likely (90%) or Almost certain (>97%). These are, for example, the explanations of my-side bias, confirmation bias, and hindsight bias, which I think are established and largely non-controversial. But I also have quite a lot of uncertainty. There are many claims that sit around Maybe (50%), hovering between 35% and 75%. I think the most uncertain part is hierarchy bias and the ideas from “The Elephant in the Brain”. This is a sweeping reinterpretation of how human motivations work. Personally, I think it’s Likely (~70%) that hierarchy bias largely explains our behavior, but I understand others will find it a lot less probable.
Effort: I have been exposed to these findings for five years now. Since then I have had time to digest them and read thoroughly around the topic.
Evidence: The evidence mostly comes from these sources: “The Elephant in the Brain”, “Stumbling on Happiness”, and Daniel Kahneman’s work. Please let me know if you know of any information that invalidates any of the research mentioned.
Antithesis: What would need to be true to invalidate the main claims of this essay? Here are three main axes of criticism: 1) I may be missing some major factor (besides hierarchy and my-side bias) behind why the Expert trap is happening. 2) The hierarchy bias findings are not established, so there is a chance they are incorrect or have gaps. 3) All systems lose efficiency as they grow more complex. Perhaps it is universal and unavoidable that the more complex knowledge gets, the more corrupted and inaccessible it becomes, and the current norms around learning, sharing, and encoding knowledge are in fact rather efficient.
Summary
Summary in one paragraph: The main claim of this essay is that knowledge often gets locked in specific areas and levels of expertise. I identify two primary factors contributing to this phenomenon: hierarchy bias and my-side bias. I also present methods for how to counteract it.
Summary longer: This article explores the intuition that the way our civilization encodes knowledge is often faulty and inefficient, and makes that knowledge difficult to use.
This essay explains my take on a cognitive bias that I call the Expert trap: those who are better informed have trouble passing knowledge on to those who are less informed, or are unable to do so at all. Some call this bias “The curse of knowledge”. I use “expert trap” because it’s shorter and more closely associated with the root of the problem. I also see the Expert trap as a larger phenomenon than what one typically associates with “The curse of knowledge”: I see it as driven by several other biases. These are hindsight bias – once you know the answer to a question, you will think that you would have guessed it; hierarchy bias – people's hidden motive for acquiring knowledge may be less about getting things right than about elevating themselves in a hierarchy; and confirmation bias – once you have formed an opinion, you will tend to select and believe information that strengthens it and discount information that challenges it. At the root of all these biases is my-side bias – what is mine is better; whichever definition is mine will be questioned less.
I think the Expert trap has large and overlooked consequences. I will propose a hypothesis suggesting that the learning and sharing of knowledge within our civilization is largely inefficient, with the Expert trap serving as the primary explanation. I will describe this using examples of the educational system and our approach to learning in general. Finally, I will explain methods that may be helpful in counteracting it.
What is the Expert trap?
Healthy knowledge
First, let’s define the opposite. What is the healthy state of knowledge – knowledge that is efficient, robust, and useful? The metaphor I like is conductivity. High-quality knowledge is highly conductive.
It brings one smoothly from not knowing to knowing.
It enables one to easily access different levels of complexity: if one wants to learn just a little bit, one knows how to do it. A simple explanation should be a good mapping, representation, and stepping stone to a more complex one.
It should also be roughly correct at different levels of complexity: if one chooses to stay at a lower level and applies this knowledge to their area of expertise, they are going to get approximately accurate results.
I think knowledge created by our civilization is often of low conductivity, and one of the main drivers of this is the Expert trap dynamic.
Hindsight bias
The phrase “the curse of knowledge” was first used in 1989 by Camerer, Loewenstein, and Weber. They saw it as closely related to hindsight bias – knowing the outcome makes people falsely confident that they would have predicted it.
“Study participants could not accurately reconstruct their previous, less knowledgeable states of mind, which directly relates to the curse of knowledge. This poor reconstruction was theorized by Fischhoff to be because the participant was ‘anchored in the hindsightful state of mind created by receipt of knowledge’.” (Fischhoff, Baruch (2003). “Hindsight is not equal to foresight”.)
It is as if our brains are wishfully reconstructing the knowledge to fit the outcome. If a person knows the outcome, they may be less inquisitive about its root causes, less motivated to look at it from first principles. They may be looking less carefully at each part of the process and therefore bending inputs so they match the outcome.
Historically, hindsight bias was the first clue to understanding the curse of knowledge (or what I call the Expert trap dynamic). My hypothesis is that it is likely an extension of my-side bias, which others also refer to as motivated reasoning (I write more on this later). That is, we may be motivated to be critical about knowledge only as long as that strengthens our positive self-image. When I know the answer to a question, I am not motivated to really dig into its root causes: I already guessed it correctly, and the reward – “I am smart” – has already been delivered.
Tapping experiment
In a 1990 Stanford experiment, subjects who were asked to finger-tap a popular tune of their choosing were hugely overconfident about how many listeners would recognize it: they estimated that 50% of people would get it, whereas in reality only 1.33% did. This may be the best metaphor for the Expert trap I have stumbled upon. It clearly renders what’s going on in the mind of somebody who “knows”: such a person projects that knowing onto their audience and is unable to see what they are really communicating.
Knowledge silos
Knowledge is often trapped in different expertise silos, in a state ranging from hard to use to outright unusable. I see two main mechanisms here.
When people learn, the knowledge gets trapped at each new level of understanding. Once a person has acquired knowledge, they are often unable to explain it to people who don’t yet understand it. What’s fascinating is that they seemingly wouldn’t be able to explain it even to a past version of themselves. Somehow the context gets lost. Perhaps, as a person aspires to understand further, they lose track of the “Aha!” moments that brought them to where they are.
But knowledge also gets trapped across different disciplines. How easily can physicists talk to chemists about the same processes? I suspect that, because of different terminology and mental models, experts in adjacent areas often have a hard time communicating. I will write more extensively on how this may work in “Why is expert trap happening?”
Expert trapped in Morgan Library
But let’s land in the real world. I have this silly example that I think renders well the dynamic of the Expert trap.
For two weeks I was living in Manhattan, New York, a ten-minute walk from the Morgan Library. I read that there was an interesting exhibition there. I opened Google Maps and saw a photo of it. It just looked like an old library. I had seen things like this before and decided not to go.
After a while, a friend mentioned the Morgan Library again. “Recommendations from two different sources? The exhibition is still going. Let’s go.” I went there and was blown away. I explored it very carefully, digested it, and sent a photo to my partner. I expected her to respond with excitement, but it seemed to be “meh” to her. I looked at the photo and realized I had taken a photo very similar to the ones on Google Maps.
I think that, like in the tapping experiment, I projected my knowing onto the photo. There were more things that made this place fascinating, and the photo didn’t communicate them. Perhaps it’s that this private mansion, with the interior feel of a rural villa, sits in the middle of Manhattan, one of the most densely populated places in the United States. Also the juxtaposition of wealth: this was the office of J.P. Morgan, a founder of Chase, the biggest bank in the US. Here is this dirty street, and here, behind a wall, an exhibit of some of the most insane wealth in the world. A picture on the wall? You get closer; it’s a Hans Memling. Some small altar? You zoom in, and it was executed a thousand years ago in the Byzantine Empire and frames fragments of the cross of Jesus Christ.
I had taken a photo of this place, but mentally I had overwritten it with new meanings. We may be victims of the Expert trap on many more layers, and much more often, than we assume.
Illusory knowledge
When learning, I think we very often fool ourselves about our level of comprehension. We think we understand something, but what we have actually done is familiarize ourselves with the area and memorize terminology. Memorized names often function as covers that conveniently obstruct areas that are still fuzzy. Things start to feel familiar, but we are not much deeper in our understanding. This seems like a pretty strong, redefining statement, so let me qualify it. I see this as a gradient: when a person learns something, they will acquire both illusory and true knowledge. I am claiming, however, that the proportion of illusory knowledge is much higher than conventionally assumed. I also don’t think people do this intentionally; most of it happens subconsciously.
So when a person has a breadth of complex terminology connected to some knowledge area, it is easy to mistake it for knowledge that is robust, precise, practical, flexible, and applicable to many contexts. Very few people have the habit of learning things comprehensively – that is, when asked to explain something, being able to approach it from many different perspectives: explain it to a kid, a high-schooler, or an expert. Learning this way involves taking concepts and applying them to a wide variety of areas, classes, and contexts; testing them against edge cases and counterfactuals; and thinking about them in the most practical way. How does this abstract concept intersect with the real world? If it is true, how will it change what I see in the real world?
Arriving closer to true knowledge seems more like a curvy path – a system of paths traveled in many directions. It may be more the type of thinking that comes from play and curiosity. Illusory knowledge, on the other hand, can often be found in thinking that is instrumental – a means to something else – that tries, in a more straightforward way, to get to conclusions.
I sense this illusory knowledge is abundant. I think it is a primary way we encode knowledge. Our learning methods and our educational systems may be full of it; it may be largely present at every level of education, from primary school to higher education.
This could be one of the main reasons why most schools are experienced as boring, and it might also explain why I can remember almost no useful knowledge from my primary and high school education.