
Bayesian Probability 101: The Mathematics of Rational Choice

A medical test for a disease is 99% accurate. This means: if you have the disease, the test will say "Positive" 99% of the time; if you don't have the disease, the test will say "Negative" 99% of the time. Your result comes back positive. Given that only 1 in 1,000 people actually has the disease, should you panic?

How do you find a sunken submarine that was built to stay hidden?

In May 1968, the United States Navy faced a nightmare scenario. The USS Scorpion (SSN-589), a nuclear-powered attack submarine carrying 99 crewmen, had vanished. It was last reported somewhere southwest of the Azores in the vast, crushing depths of the Atlantic Ocean.

The ocean is incomprehensibly large. A submarine hull is designed to be stealthy and invisible, so locating one on the ocean floor is like finding a needle in a haystack that covers half the planet, in pitch darkness. The Navy began a standard search pattern, effectively drawing a grid over the ocean and searching square by square. But the search area was massive and resources were finite. Weeks passed. Panic set in. The search was failing.

Enter Dr. John Craven, the Chief Scientist of the Navy’s Special Projects Division. Craven proposed a mathematical method that, at the time, was viewed with skepticism by “serious” naval strategists. He proposed using Bayesian search theory.


Instead of treating every square mile of the ocean as equally likely to hold the wreckage, Craven accepted that nobody knew where the submarine was. But, crucially, he argued that they knew something about where it might be, depending on which failure scenario had actually occurred.

Craven assembled a team of submarine experts, navigators, and hydrographers. He didn’t ask them to pinpoint the sub. He asked them to bet. He proposed scenarios: “What if the battery exploded?” “What if the torpedoes fired accidentally?” “What if the hull collapsed at depth?”

For each scenario, the experts assigned a probability. If the battery exploded, the sub would have been on a specific heading. If the hull collapsed, it would have drifted in a specific current. They created a probability map of the ocean floor, divided into thousands of grid cells. Some cells had a high probability (if the explosion theory was right), and some had a low probability.

Here is where the magic of Bayesian probability came in.

When a ship searched a specific grid cell (call it Cell X) and found nothing, conventional wisdom treated it as a failure. Craven saw it as data.

If you search Cell X (which you thought had a 10% chance of holding the sub) and don't find it, the probability that the submarine is in Cell X drops to near zero. But the total probability has to add up to 100%. If the chance of it being in Cell X drops, the chances of it being in Cell Y and Cell Z automatically go up.

Every failed search refined the map. They weren’t just looking, they were updating their beliefs based on new information.
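This update on a failed search can be sketched in a few lines of Python. The cell names, prior probabilities, and detection probability below are invented for illustration, not the Navy's actual figures:

```python
def update_after_failed_search(priors, searched_cell, detection_prob):
    """Return new cell probabilities after searching one cell and finding nothing.

    priors: dict mapping cell name -> prior probability the wreck is there.
    detection_prob: chance that searching the correct cell actually spots it.
    """
    p = priors[searched_cell]
    # P(miss) = P(not in this cell) + P(in this cell but the search missed it)
    p_miss = (1 - p) + p * (1 - detection_prob)
    posteriors = {}
    # Bayes: P(there | missed) = P(missed | there) * P(there) / P(missed)
    posteriors[searched_cell] = p * (1 - detection_prob) / p_miss
    # Every other cell's probability rises by the same renormalization
    for cell, prior in priors.items():
        if cell != searched_cell:
            posteriors[cell] = prior / p_miss
    return posteriors

priors = {"X": 0.10, "Y": 0.60, "Z": 0.30}
posterior = update_after_failed_search(priors, "X", detection_prob=0.9)
print(posterior)  # Cell X shrinks toward zero; Y and Z both grow
```

Note that Cell X does not drop all the way to zero: the search itself is imperfect, so there is still a small chance the wreck is there and was simply missed.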

Five months after the disappearance, the search ship Mizar lowered a camera sled into the dark water. They targeted a specific coordinate that Craven’s map had highlighted as the highest probability zone after all the updates.

They found the USS Scorpion within 270 yards of the mathematically predicted center.

Dr. Craven hadn’t found the sub by knowing where it was. He found it by rigorously calculating where it most likely was, given what he knew and what he had failed to find elsewhere. This is the essence of Bayesian probability.

How scared should we be of a positive medical test?

To see how this applies to everyday life, let's take a classic example that demonstrates why human brains are bad at probability and why we need Bayes to make rational choices.

Imagine you go for a routine checkup. The doctor screens you for a rare, frightening disease called Disease X, which affects 1 in 1,000 people in the general population.

The test is very good. It is 99% accurate. This means:

  • If you have the disease, the test will say “Positive” 99% of the time.
  • If you don’t have the disease, the test will say “Negative” 99% of the time.

You take the test. The result comes back: Positive for disease X.

Panic sets in. Your gut reaction, and the reaction of most people (including many doctors), is to assume that there is a 99% chance you have the disease. You start writing your will.

But if you think like a Bayesian, you shouldn’t panic yet.


The Reality: If you test positive for this disease, your actual chance of being sick is not 99%. It is roughly 9%.

How is that possible?

This is where “Priors” (Prior Probability) come in. We ignored the most important number: the base rate (1 in 1,000).

Let’s visualize a group of 1,000 random people.

  1. The Sick: According to the statistics (1 in 1,000), only 1 person in this group actually has the disease. They take the test, and it correctly says “Positive.” (Total true positives: 1).
  2. The Healthy: The other 999 people are healthy. However, the test is only 99% accurate. That means it is 1% wrong. 1% of 999 is roughly 10 people. These 10 healthy people will get a “False Positive.” (Total false positives: 10).

Now, look at the pool of people holding a “Positive” result paper. There is 1 sick person. There are 10 healthy people. There are 11 people total who tested positive.

You are one of those 11. What are the odds you are the sick one? 1 in 11.

By ignoring the prior probability (that the disease is rare), your rational choice mechanism broke down. Bayesian thinking restores it. It tells you that extraordinary claims (you have a rare disease) require extraordinary evidence, and a single test with a 1% error rate is not yet sufficient evidence to overcome the rarity of the disease.
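The counting argument above is exactly Bayes' rule. Here it is as a short Python sketch using the article's own numbers (the variable names are ours):

```python
# Disease X example: how likely are you actually sick, given a positive test?
prevalence = 1 / 1000      # prior P(sick): 1 in 1,000
sensitivity = 0.99         # P(positive | sick)
specificity = 0.99         # P(negative | healthy)

# Total probability of a positive result: true positives + false positives
p_positive = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)

# Bayes' rule: P(sick | positive) = P(positive | sick) * P(sick) / P(positive)
p_sick_given_positive = sensitivity * prevalence / p_positive
print(f"{p_sick_given_positive:.1%}")  # prints "9.0%"
```

The exact answer is 1/11.1, or about 9%, matching the rough count of 1 true positive among 11 positive results.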

[What if you get a false negative? Consider the same 1,000 random people.

The Sick (1 person): There is 1 sick person in the group. A 99% accurate test catches the disease 99% of the time and misses it 1% of the time. Since there is only 1 sick person, the expected number of false negatives is 1% of 1 person, a tiny fraction: 0.01.

The Healthy (999 people): The test correctly identifies them as negative 99% of the time. 999 × 0.99 ≈ 989 people. These are the "True Negatives."

Now, look at the pool of people holding a "Negative" result paper. Healthy people with a negative result: ~989. Sick people with a negative result: ~0.01 (the 1% chance the single sick person was missed). Total negative results: 989 + 0.01 = 989.01.

If you hold a negative result, what are the odds you are actually healthy? (989 / 989.01) × 100 ≈ 99.999%.]

Understanding the Machine: How Bayesian Probability Works

Most of us are taught "Frequentist" probability in school: the probability of coin flips and dice rolls. Flip a fair coin enough times and it will come up heads 50% of the time. Frequentism treats probability as a property of the object itself.

Bayesian probability is different. It treats probability as a measure of belief.

It acknowledges that we rarely have perfect information. We have a “degree of belief” that something is true, and we encounter new evidence that should shift that belief.

The Bayesian Formula (Simplified)

The mathematical formula for Bayes' Theorem looks like this:

P(A|B) = P(B|A) × P(A) / P(B)
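As a minimal sketch, the formula translates into code line for line:

```python
def bayes(p_b_given_a, p_a, p_b):
    """Posterior P(A|B) from likelihood P(B|A), prior P(A), and evidence P(B)."""
    return p_b_given_a * p_a / p_b

# Disease example: likelihood 0.99, prior 0.001, P(positive) = 0.01098
print(bayes(0.99, 0.001, 0.01098))  # roughly 0.09
```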

We can break it down into four simple concepts that act as a machine for updating your mind.

1. The Prior (What you thought before)

This is your starting point. In the submarine example, the Prior was the initial guess of where the sub was based on the explosion scenarios. In the medical example, the Prior was the 1-in-1,000 rarity of the disease.

  • Rational Lesson: You never start from zero. You always start with context. If you hear hoofbeats, think horses, not zebras unless you are in the Serengeti (where your Prior for zebras is higher).

2. The Likelihood (The strength of the new evidence)

This asks: “If my theory were true, how likely is it that I would see this evidence?”

  • If you have the disease, how likely is a positive test? (99%).
  • If the submarine is in Cell X, how likely is it that we missed it?
  • Rational Lesson: Not all evidence is created equal. A blurry photo of a UFO is “evidence,” but the Likelihood of seeing a blurry blob given that it’s just a plane is very high, so the evidence is weak.

3. The Posterior (What you think now)

This is the result. It is your new belief after you mathematically smash the Prior and the Likelihood together.

  • Your Posterior probability becomes your new Prior for the next time you get evidence.

4. The Update Loop

This is the heartbeat of Bayesian thinking. You don’t just calculate once; you iterate.

  • Step 1: Belief (Prior) -> Step 2: New Data -> Step 3: Update Belief (Posterior).
  • Step 4: The Posterior becomes the new Prior -> Step 5: More Data…
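The loop above can be sketched in Python. Each pass feeds the posterior back in as the new prior; the numbers reuse the disease example, and the scenario of three independent positive tests is our own assumption for illustration:

```python
def update(prior, p_evidence_given_true, p_evidence_given_false):
    """One Bayesian update step: return P(hypothesis | evidence)."""
    p_evidence = (p_evidence_given_true * prior
                  + p_evidence_given_false * (1 - prior))
    return p_evidence_given_true * prior / p_evidence

belief = 0.001                    # prior: 1-in-1,000 prevalence
for test_number in (1, 2, 3):     # three independent positive test results
    belief = update(belief, 0.99, 0.01)   # posterior becomes the new prior
    print(f"after positive test {test_number}: {belief:.2%}")
```

One positive test only lifts the belief to about 9%, but a second independent positive pushes it past 90%, and a third past 99%: evidence compounds as the loop iterates.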

Why This Matters for Rational Choice

We live in an age of information overload. We are constantly bombarded with news, studies, anecdotes, and data points. Most people process this information through confirmation bias: we accept evidence that supports what we already think and ignore evidence that doesn't.

Bayesian thinking is the mathematical opposite of confirmation bias.

1. It forces you to quantify your uncertainty.

A Bayesian never says “I am 100% sure.” Being 100% sure is a mathematical trap. If your Prior is 100% (or 0%), no amount of evidence can ever change your mind (because anything multiplied by zero is zero). A rational person might say, “I am 90% sure this political policy will work.” This leaves 10% room for being wrong. If the policy fails, the Bayesian updates that 90% downward, rather than making excuses to keep it at 100%.
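A quick sketch makes the trap concrete: with a prior of exactly 1 (or 0), even overwhelming counter-evidence leaves the belief unchanged, because the prior multiplies every term:

```python
def update(prior, p_evidence_given_true, p_evidence_given_false):
    """One Bayesian update step: return P(hypothesis | evidence)."""
    p_evidence = (p_evidence_given_true * prior
                  + p_evidence_given_false * (1 - prior))
    return p_evidence_given_true * prior / p_evidence

# Evidence that is 99x more likely if the hypothesis is FALSE:
print(update(1.0, 0.01, 0.99))  # prints 1.0: certainty never moves
print(update(0.9, 0.01, 0.99))  # drops sharply, to about 0.083
```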

2. It helps you weigh the “Base Rate.”

This is the medical example again. When you see a shocking news headline ("Shark Attack in Florida"), your brain screams DANGER. A Bayesian brain checks the base rate: millions of people swam today, and one was attacked. The prior probability of being attacked remains microscopically low, despite the scary new evidence.

3. It creates a mechanism for changing your mind.

In our culture, changing your mind is often seen as a sign of weakness or hypocrisy. In Bayesian terms, changing your mind is the only rational response to new evidence. If you don’t update your Posterior when new data arrives, you are mathematically malfunctioning. The USS Scorpion was found because the search team was willing to say, “We looked in the most likely spot and didn’t find it; therefore, the most likely spot is no longer the most likely spot.”

Conclusion: The Bayesian Way of Life

The world is not a coin flip. It is a messy, chaotic place where we never have all the facts. We are all captains of ships searching for submarines in the dark.

Most people navigate by gut feeling, fearing the worst (the false positive) or ignoring the context (the base rate). They cling to their initial beliefs regardless of what they see.

Bayesian probability offers a better way. It suggests that truth isn’t a destination you arrive at and stop; it is a direction you move toward. It teaches us to be humble about what we know (checking our Priors), to be critical of what we see (analyzing the Likelihood), and to be brave enough to change our minds when the math tells us we were wrong.

To be rational is not to be emotionless or robotic. To be rational is simply to ensure that your internal map of reality matches the external territory as closely as possible. And the only way to keep that map accurate is to update it, one piece of evidence at a time.


📚 Recommended Reads

  • Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets by Nassim Nicholas Taleb (🔗 Amazon | Amazon India)
  • Rationality: What It Is, Why It Seems Scarce, Why It Matters by Steven Pinker (🔗 Amazon | Amazon India)
