They lied to us about Bayes' theorem
How to use Bayes' theorem to update your beliefs systematically
Bayesian thinking
Let's start with a small exercise. Consider an imaginary person, Steve, whose occupation we need to predict from a given list of possibilities: librarian, farmer or salesman. To further help us with the task, here's a description of Steve:
Steve is very shy and withdrawn, invariably helpful, but with little interest in people, or in the world of reality. A meek and tidy soul, he has a need for order and structure, and a passion for detail.
People usually jump to the conclusion that Steve is most probably a librarian by applying the representativeness heuristic - judging by how closely the description matches that of a typical librarian. But such heuristics have major flaws, one of which is insensitivity to prior probabilities. Studies have shown that, when presented with the above information, people tend to ignore the fact that there are far more salesmen and farmers in the general population than librarians. The ideal response comes from updating our beliefs when presented with new information (Steve's description), rather than basing them entirely on the description. Here's the Nobel Prize-winning study that brought this idea to light.
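A quick calculation makes the base-rate point concrete. This is a minimal sketch with assumed numbers (the population ratio and how well the description "fits" each occupation are illustrative, not figures from the study):

```python
# Illustrative only: the base rates and likelihoods below are assumed
# numbers, not data from the Kahneman-Tversky study.
def posterior(prior, likelihood, evidence_prob):
    """Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E)."""
    return likelihood * prior / evidence_prob

# Assume 20 farmers for every librarian in the population,
# and that the description fits 40% of librarians but only 10% of farmers.
p_librarian = 1 / 21
p_farmer = 20 / 21
p_desc_given_librarian = 0.4
p_desc_given_farmer = 0.1

# Total probability of meeting someone who fits the description.
p_desc = (p_desc_given_librarian * p_librarian
          + p_desc_given_farmer * p_farmer)

p_librarian_given_desc = posterior(p_librarian, p_desc_given_librarian, p_desc)
print(round(p_librarian_given_desc, 2))  # prints 0.17
```

Even with a description that fits librarians four times better, the sheer number of farmers means Steve is still far more likely to be a farmer - which is exactly what the representativeness heuristic misses.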
This was a small example but the bigger picture is about beliefs in general. Beliefs shape who we are, how we develop competencies and how we behave in a given environment. How do we then go about managing our beliefs?
Enter Bayes' theorem
You can probably recall Bayes' theorem from your high school textbook, but what CBSE did not tell us was how to look at Bayes' theorem as a framework for systematically updating our existing beliefs. To put it simply, Bayes' theorem tells us how to calculate the probability of a hypothesis being true, given some prior knowledge of conditions related to the event. Mathematically, it is represented as:

P(H|E) = P(E|H) × P(H) / P(E)
P(H) is called the prior, P(E|H) is the likelihood, and the answer P(H|E) is called the posterior.

Bayes would sit with his back to a table and ask his assistant to throw a ball onto it at random. He would then try to guess where it landed by asking his assistant to throw more balls and report whether each landed to the left or right of the original. This let him systematically update his idea of where the original ball might have landed, with increasing accuracy.

Notice how if the prior is 0 (0% certainty) or 1 (100% certainty), the posterior doesn't change at all. This is the mathematical equivalent of not being able to update your beliefs with new evidence because you have already made up your mind. As a rule of thumb, Bayes' theorem can be used in situations where you have some hypothesis, you see some new evidence, and you want to find the probability that the hypothesis holds given that the evidence is true.
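The ball-guessing story can be sketched as a simulation. This is a minimal illustration, not Bayes' actual procedure: the table is discretised into a grid of candidate positions, the true landing spot and number of throws are made-up values, and each left/right report updates the belief over the grid:

```python
import random

random.seed(0)

# Discretise the table into candidate positions 0.00 .. 1.00.
grid = [i / 100 for i in range(101)]
true_pos = 0.3  # the hidden landing spot of the first ball

# Flat prior: before any throws, the ball could be anywhere.
prior = [1 / len(grid)] * len(grid)

for _ in range(50):
    new_ball = random.random()
    landed_left = new_ball < true_pos
    # Likelihood of the report for each candidate position p:
    # a uniformly thrown ball lands left of p with probability p.
    likelihood = [p if landed_left else 1 - p for p in grid]
    unnormalised = [lk * pr for lk, pr in zip(likelihood, prior)]
    total = sum(unnormalised)
    prior = [u / total for u in unnormalised]  # posterior becomes next prior

best_guess = grid[prior.index(max(prior))]
print(best_guess)  # the belief concentrates near true_pos
```

Each iteration is one application of the formula: multiply the prior by the likelihood of the new observation and renormalise. Fifty left/right reports are enough to pin the guess down to within a few grid cells of the true position.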
The thing about competencies
In his book Competence at Work, Spencer defines competencies as underlying characteristics of people that dictate "ways of behaving or thinking, generalising across situations, and enduring for a reasonably long period of time". He goes on to outline five types of competency characteristics / drivers: motives, traits, self-concept, knowledge and skill. Of these, motives, traits and self-concept are far more difficult to develop than knowledge and skills. This is why fancy training programs or online courses often fail: it is very difficult to bring about an impact at the core personality level through these means.
While Bayes' theorem tells us how to update our beliefs based on 'priors', it does not tell us how to set those priors. This can be really dangerous if you are trying to build new competencies. If you are already 100% certain about your current self-concept or motives, then no amount of new evidence or training can help you change your beliefs, and hence your outcomes (say, better performance). You ultimately end up in a self-fulfilling prophecy.
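The degenerate-prior trap falls straight out of the formula. A minimal sketch with one update step and assumed evidence strengths (ten-to-one in favour of the hypothesis):

```python
# One Bayesian update step: P(H|E) = P(E|H) P(H) / P(E),
# with P(E) expanded by the law of total probability.
def update(prior, p_e_given_h, p_e_given_not_h):
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Evidence ten times likelier if the hypothesis is true.
for prior in (0.0, 0.5, 1.0):
    print(prior, "->", round(update(prior, 0.9, 0.09), 3))
# prints:
# 0.0 -> 0.0
# 0.5 -> 0.909
# 1.0 -> 1.0
```

An open mind at 0.5 moves to roughly 0.91, but the priors of exactly 0 and 1 do not budge: once certainty is absolute, the evidence term multiplies against zero (or cancels entirely) and the belief is frozen.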
It might seem like we do not have a natural intuition for how the formula works, but Derek argues the opposite: that we might in fact be too good at internalising Bayes' theorem:
"I think in life we can get used to particular circumstances, we can get used to results, maybe getting rejected or failing at something or getting paid a low wage, and we can internalise that ... and we keep updating our beliefs to a point of near certainty that we think that is basically the way nature is, it's the way the world is and there's nothing that we can do to change it"
Fighting egocentric bias
Most people think they do most of the work - that's egocentric bias. When researchers asked authors of multi-author papers to estimate, in percentage terms, their own contribution, the estimates added up to 140% on average. This is not necessarily because people want to appear to contribute more. Individually, we vividly remember all the small steps we took towards our share of the work, but we don't have the same visibility into others'. Our perception of others' work comes largely through their account of it, and vice versa. It can be useful to remember this in high-leverage situations like performance reviews, because these conversations are great avenues to adjust our priors and update our beliefs. In a world where the word 'feedback' is becoming too generic to be of use, framing feedback as adjustments to our 'priors' can bring structure to updating our core beliefs in a meaningful way.