What is rationality?
On one hand, we are sending rovers to Mars (and planning to send human beings) through collective reasoning; on the other, the same species is stumbling over its own thoughts in daily life.
As an example, many people have a greater fear of flying in a commercial airliner than of driving a car, despite overwhelming statistical evidence that, mile for mile, driving is many times more dangerous. This is a classic failure of rational risk assessment: the dramatic, vivid, and highly publicized nature of a plane crash makes it more “available” to our memory (psychologists call this the availability heuristic), causing us to overestimate its likelihood compared to the mundane, everyday risk of a car accident.
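To see how lopsided the numbers actually are, here is a quick back-of-the-envelope sketch in Python. The fatality rates below are rough, order-of-magnitude placeholders rather than exact statistics, so treat the output as purely illustrative and swap in current figures from an official source if you want precision.

```python
# Back-of-the-envelope comparison of per-mile fatality risk.
# NOTE: the rates below are rough, illustrative order-of-magnitude figures,
# not exact statistics -- replace them with current official numbers.

driving_deaths_per_100m_miles = 1.3    # roughly the order of magnitude for road travel
flying_deaths_per_100m_miles = 0.005   # commercial aviation is dramatically lower

trip_miles = 500  # a medium-distance trip

p_driving = driving_deaths_per_100m_miles * trip_miles / 100_000_000
p_flying = flying_deaths_per_100m_miles * trip_miles / 100_000_000

print(f"Approximate risk of driving {trip_miles} miles: {p_driving:.2e}")
print(f"Approximate risk of flying  {trip_miles} miles: {p_flying:.2e}")
print(f"Under these assumptions, driving is ~{p_driving / p_flying:.0f}x riskier per mile")
```

Even with generous fudge factors, the per-mile gap stays enormous; our fear simply doesn't track it.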
We are capable of breathtaking logic and profound irrationality, often in the same day.

Why are people rational and/or irrational?
Let’s start with rationality.
In its most fundamental sense, rationality is the quality of being guided by reasons. Being rational means you’re good at using logic and thinking things through to make choices and decide what you believe. You base your actions and convictions on a careful look at the evidence.
Back in the day (we’re talking Stone Age), thinking clearly had obvious perks. You needed to figure out which berries were safe to eat, how to sneak up on a deer, and whether that weird guy in your tribe was a friend or a potential backstabber. For this kind of stuff, natural selection rewarded brains that could make smart, evidence-based decisions. Our ability to plan, calculate risks, and weigh options is all part of the evolutionary toolkit for survival. Rational thinking helped us build tools, form alliances, and eventually invent calculus and memes.
But here’s the plot twist: irrationality isn’t some mental glitch. It’s not a bug; it’s a feature. Our brains evolved in a world full of uncertainty, where it was often better to be fast than accurate. That rustle in the bushes? Maybe it’s just the wind… or maybe it’s a leopard. Better to overreact and look silly than underreact and get eaten. That’s what scientists call error management theory, and it explains why we sometimes jump to conclusions or fall for false alarms.
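The logic of error management is easy to put into numbers. Here is a toy sketch with completely made-up costs and probabilities, showing why "always flee the rustle" can be the smarter policy even when the leopard is unlikely:

```python
# Toy expected-cost calculation behind error management theory.
# Every number here is made up purely for illustration.

p_leopard = 0.01        # assumed chance the rustle really is a leopard
cost_false_alarm = 1    # fleeing the wind: wasted energy, mild embarrassment
cost_miss = 10_000      # ignoring a real leopard: catastrophic

# Policy 1: always flee the rustle (you always pay the small cost of running).
expected_cost_flee = cost_false_alarm

# Policy 2: always ignore the rustle (you pay the huge cost only if it's a leopard).
expected_cost_ignore = p_leopard * cost_miss

print(f"Always flee:   expected cost = {expected_cost_flee}")
print(f"Always ignore: expected cost = {expected_cost_ignore}")
# Even at a 1% leopard probability, jumping at the wind is the cheaper habit.
```

When the costs of the two mistakes are that asymmetric, a jumpy brain is a perfectly sensible design.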
And then there’s the social stuff. Believing what your group believes, even if it’s nonsense, can keep you safe and accepted. In small tribal groups, being the odd one out wasn’t just awkward; it was dangerous. So our brains got really good at conformity, even if it meant sidelining logic. That’s why people follow superstitions, join cults, or vote based on vibes rather than manifestos.
In short, evolution shaped us to be just rational enough to survive and thrive. But when faced with modern problems, like investing in the stock market or interpreting vaccine data, our ancient mental shortcuts don’t always cut it. We’re still running Stone Age software in a Silicon Age world.
How is Rationality related to Scientific Thinking?
Imagine you’re trying to figure out whether eating curd at night really causes a cold, because your grandma swears by it. Rationality is the part of your brain that goes, “Hmm, interesting claim. Let’s test it instead of just accepting it.” And that is the first baby step toward scientific thinking.
Rationality helps you slow down, question your gut reactions, and ask: “What’s really going on here?” Now take that same energy, give it a bit of structure (forming hypotheses, running experiments, double-checking results), and boom, you’ve got scientific thinking.
In a way, scientific thinking is just rationality with a lab coat on.
But here’s the twist: you don’t need to be a scientist to think scientifically. You just need to:
Ask good questions (like “Does this work?” instead of “My cousin said it worked, so…”),
Look for evidence (actual data, not Dinesh-uncle’s forwarded WhatsApp story), and
Be open to being wrong (ouch, but essential!).
Rationality gives you the mindset. Scientific thinking gives you the method.
Together, they’re like chai and ginger, way better as a combo. Rationality helps you dodge scams, bad decisions, and shady health advice. Scientific thinking takes it further and helps you understand why things work (or don’t).
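To make that “test it instead of accepting it” step concrete, here is a minimal sketch of what a first, informal check of the curd-and-cold claim could look like. The counts are invented for illustration; a real test would need proper data collection, a bigger sample, and controls for confounders like season and sleep.

```python
# Minimal sketch of informally testing the "curd at night causes colds" claim.
# The counts below are invented purely for illustration.

curd_nights, colds_after_curd = 60, 6      # nights you ate curd, colds that followed
plain_nights, colds_after_plain = 60, 5    # nights you didn't, colds that followed

rate_curd = colds_after_curd / curd_nights
rate_plain = colds_after_plain / plain_nights

print(f"Cold rate after curd nights:  {rate_curd:.1%}")
print(f"Cold rate after plain nights: {rate_plain:.1%}")

# Scientific thinking then asks: is this gap bigger than chance alone would produce?
# With numbers this close, there is no real evidence that curd matters; a proper
# study would also randomize and control for other factors before concluding anything.
```

The point isn't the arithmetic; it's the habit of comparing against a baseline instead of trusting one vivid anecdote.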
Why should rationality come with Empathy?
Okay, picture this: you’re super logical, super smart, and you just know your friend is making a terrible financial decision. So you hit them with charts, stats, and a five-minute TED Talk on compound interest. But instead of being impressed, they get defensive and stop talking to you. What went wrong?
Rationality without empathy is like Google Maps with no traffic updates: it knows the best route on paper, but it doesn’t care if the road’s flooded or if your passenger is carsick.
Let’s break it down. Rationality helps you think clearly, question assumptions, and make decisions based on facts, not feelings or fears. That’s great. But humans are full of feelings and fears. We’re not robots; we’re messy, emotional, wonderfully irrational creatures. So if you want your logic to actually connect, it has to pass through the bridge of empathy.
Empathy is what lets you understand where someone’s coming from, even if they’re wrong. It doesn’t mean you agree with them; it means you see them as a fellow human, not a math problem to be solved.
Think of rationality as the engine of a car, and empathy as the steering wheel. You need both to get anywhere meaningful. Otherwise, you’re just powering through people, not helping them.
And honestly, some of the most harmful things in history weren’t done by people lacking intelligence; they were done by people who were coldly rational but totally disconnected from human suffering. That’s the danger zone.
When rationality walks hand-in-hand with empathy, you get something beautiful: solutions that work and also care. Science that serves people. Arguments that persuade instead of polarize. Advice that heals instead of humiliates.
So yeah, be rational. But also, be kind. Because what’s the point of winning an argument if you lose the person?
The Ultimate Game
One last thing: rationality is sometimes complex. A powerful illustration of rationality’s complexities is the Ultimatum Game, a staple of behavioral economics. The game is simple: two players are given a sum of money, say $100. Player A, the “proposer,” offers a split of the money to Player B, the “responder.” Player B can either accept or reject the offer. If Player B accepts, they both get the money according to the split. If Player B rejects, neither player gets anything.
From a purely rational, self-interested perspective, Player A should offer the smallest possible amount, say $1, and Player B should accept it. Why? Because getting $1 is better than getting nothing. However, in experiments conducted around the world, this almost never happens. People frequently reject offers they perceive as unfair, such as a 90/10 split.
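Here is a tiny simulation of that gap between the “purely self-interested” prediction and what people actually do. The rejection rule (each responder turns down offers below a personal fairness threshold drawn from an assumed range) is a simplifying illustration, not a model taken from the experimental literature.

```python
import random

# Toy Ultimatum Game simulation. The "fairness threshold" (each responder
# rejects any offer under a personal cutoff drawn from an assumed range)
# is an illustrative assumption, not an empirical model.

POT = 100  # dollars to split

def play(offer, threshold):
    """Return (proposer_payoff, responder_payoff) for one round."""
    if offer >= threshold:
        return POT - offer, offer
    return 0, 0  # rejected: nobody gets anything

random.seed(1)
for offer in (1, 10, 30, 40, 50):
    rounds = [play(offer, random.uniform(20, 40)) for _ in range(10_000)]
    accept_rate = sum(1 for _, r in rounds if r > 0) / len(rounds)
    avg_proposer = sum(p for p, _ in rounds) / len(rounds)
    print(f"Offer ${offer:>2}: accepted {accept_rate:6.1%}, "
          f"proposer averages ${avg_proposer:.2f}")
```

Under these assumptions, the $1 “rational” offer earns the proposer nothing at all, while fairer splits do better: once rejections are possible, even a coldly self-interested proposer has a reason to offer something closer to fair.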
This shows that human rationality isn’t solely driven by maximizing personal gain. People also consider concepts like fairness and social norms. Rejecting a seemingly unfair offer, even at personal cost, is a rational act within a broader framework that values social equity and punishes perceived injustice. This example, frequently cited in books like Nudge by Richard Thaler and Cass Sunstein, and The Righteous Mind by Jonathan Haidt, demonstrates that a full understanding of rationality must include psychological, emotional, and social factors.
Further Reading
📚 Research
- Güth, Werner, and Martin G. Kocher. “More than thirty years of ultimatum bargaining experiments: Motives, variations, and a survey of the recent literature.” Journal of Economic Behavior & Organization 108 (2014): 396-409.
- Oosterbeek, Hessel, Randolph Sloof, and Gijs van de Kuilen. “Cultural differences in ultimatum game experiments: Evidence from a meta-analysis.” Experimental Economics 7, no. 2 (2004): 171-188.
📚 Recommended Reads
- Nudge (🔗 Amazon | Amazon India)
- The Righteous Mind: Why Good People Are Divided by Politics and Religion (🔗 Amazon | Amazon India)





