Why Logic Is Overrated: How Leaders Worth Following Can Learn to Hack the Irrational Mind

Premise:

In the last post, we saw how leaders can benefit greatly from the field of anthropology. In this post, I will share how leaders who learn from behavioral economics have a better shot at becoming leaders worth following.

I will reiterate a point from that post: the environment we lead in is ever changing and complex. To address that complexity, our understanding must evolve too, and we need to bring different mindsets, skillsets, and toolsets to the difficult challenges that come at us from all sides.

So, who are behavioral economists, and what can we learn from them?

Behavioral economists are scientists who study how we, as non-rational beings, actually behave when faced with economic decisions.

They understand that human behavior is complex, but they have also identified predictable ways in which we behave irrationally. The truth is, our minds are guided by a host of hidden psychological forces, cognitive shortcuts, and emotional responses that often lead us astray.

As leaders, when we learn these predictable patterns, we can plan around them, and even leverage them, to move toward our goals instead of away from them.

The Myth of the Rational Mind

Anyone who has ever tried to stick to a diet knows the frustrating gap between intention and action. We know exactly what to do — burn more calories than we consume — but we still find ourselves distracted by the candy aisle or reaching for comfort food after a tough day.

Traditional economics would suggest that losing weight should be easy for any rational person. The reality, of course, is that our choices are rarely that simple.

Let's look at five of the most surprising and impactful principles from behavioral economics. These insights, many pioneered by Nobel laureates, don't just explain our quirks; they reveal the fundamental operating system of the human mind and will change how you see your own decisions and the world around you.

As leaders, when we start thinking like behavioral economists, we can run experiments to identify and leverage these predictable irrationalities.

1. Your Brain Cares More About Framing Than Facts

The way a choice is presented can matter more than the choice itself. This powerful principle is known as the Framing Effect, and Nobel laureate Richard Thaler stumbled onto a memorable demonstration of it early in his teaching career.

After one exam, his students were furious. The class average was 72 out of 100, and even after a generous curve, they complained bitterly about the test’s difficulty.

So, for the next exam, Thaler tried something radical: he made the test out of 137 total points. This test was slightly harder, and students, on average, answered only 70% of the questions correctly. But when they saw their scores—numerical grades in the 90s—they were ecstatic.

Their performance was marginally worse, but because it was framed as a higher number, their perception of success was completely transformed.
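
The arithmetic behind the anecdote is easy to check. The short sketch below uses only the numbers mentioned above; it simply makes the two frames explicit:

```python
# Framing sketch: the same underlying performance, two presentations.
# All numbers come from the Thaler anecdote above.
exam_1 = {"points": 72, "total": 100}                 # average score, first exam
exam_2 = {"points": round(0.70 * 137), "total": 137}  # 70% correct on a 137-point exam

for exam in (exam_1, exam_2):
    pct = 100 * exam["points"] / exam["total"]
    print(f'{exam["points"]}/{exam["total"]}  ->  {pct:.0f}% of available points')
```

The second exam reads as "96 points", a number in the 90s, even though the underlying percentage is slightly lower.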

“This scoring system has no effect on the grade you get in the course, but it seems to make you happier.”

  • Richard Thaler

This principle is a quiet powerhouse in our daily lives. A risky medication is more appealing when it “saves 90 out of 100 people” than when it “kills 10 out of 100.”

The facts are identical, but the frame changes everything. This insight is used everywhere from marketing (think three-tier pricing with one option flagged as "most popular") to public policy, shaping our choices in ways we rarely notice.

As leaders, once we learn about the Framing Effect, we understand why it pays to think deeply about how we frame choices: to our employees, to our own leaders when we want something approved, and to our customers and partners. How we frame a choice often shapes the decision as much as the choice itself.

2. You Hate Losing Twice as Much as You Love Winning

One of the cornerstones of behavioral economics is Loss Aversion, the principle that the psychological pain of a loss is about twice as powerful as the pleasure of an equivalent gain. We are wired to avoid losses, and this aversion drives a huge range of our decisions.

“More than he loved to win, he hated to lose.”

  • Jimmy Connors, Tennis Champion

This intense hatred of losing has profound real-world consequences. In finance, it leads to the “disposition effect,” where investors hold on to losing stocks for far too long to avoid the pain of “realizing” the loss, while simultaneously selling winning stocks too early.

Understanding loss aversion is critical because it is the engine behind other powerful biases, like the endowment effect — the reason we overvalue things we already own, because parting with them feels like a loss.
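
Loss aversion has a standard formalization: the prospect-theory value function. The sketch below is purely illustrative; the parameter values are the commonly cited estimates from Tversky and Kahneman's 1992 work, not something from this post:

```python
# Illustrative prospect-theory value function.
# ALPHA and LAMBDA are the commonly cited 1992 estimates (assumed here).
ALPHA = 0.88   # diminishing sensitivity to both gains and losses
LAMBDA = 2.25  # losses loom roughly twice as large as gains

def value(x: float) -> float:
    """Subjective value of a gain (x > 0) or loss (x < 0)."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

# A $100 gain and a $100 loss are objectively symmetric,
# but the felt loss is more than twice as intense:
print(value(100))   # ~57.5
print(value(-100))  # ~-129.5
```

The asymmetry in the curve is the whole point: the downside of any trade is weighted far more heavily than an identical upside.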

When we truly understand loss aversion and combine it with framing, we can present options so that the choice we consider best is also the one people naturally gravitate toward. This is subtle influence.

The key is to be clear that we use this not to manipulate people for personal gain, but to help them make the right choice for themselves and for the team. The threat of losing something is far more likely to spur action than the promise of gaining something.

3. An Immediate Reward Is More Tempting Than a Better One Later

Given the choice between $10 right now or $15 next month, most people will take the $10 without hesitation. This is Temporal Discounting, also known as Present Bias: our powerful, often irrational, preference for immediate gratification over a larger reward that requires waiting.

There is no better modern example of this principle’s power than Amazon Prime. The promise of two-day shipping taps directly into our desire for swift gratification. How many of us would reconsider a purchase if that immediate reward were unavailable? The effect is staggering: because of its ability to satisfy our present bias, Prime members spend nearly double what other Amazon customers do.

This bias is the root of many of our biggest challenges. It’s why we procrastinate on important projects and why saving for a distant goal like retirement is so difficult. The immediate reward of spending money today feels far more compelling than the abstract, larger benefit of a secure future. Our brains are built for the now, and the future often has to fight for our attention.
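
One common way to model present bias is hyperbolic discounting, where a reward's felt value falls off with delay as V = A / (1 + kD). The sketch below is illustrative; the impatience parameter k = 0.02 per day is an assumed value, chosen so that the $10-now-versus-$15-next-month preference above comes out as described:

```python
# Hyperbolic discounting sketch: V = A / (1 + k * D)
# k is a per-day impatience parameter; 0.02 is an assumed, illustrative value.
def present_value(amount: float, delay_days: float, k: float = 0.02) -> float:
    """Felt value today of `amount` received after `delay_days` days."""
    return amount / (1 + k * delay_days)

now = present_value(10, 0)     # $10 today  -> felt value 10.0
later = present_value(15, 30)  # $15 in a month -> felt value ~9.4
print(now, later, now > later)
```

Under this (assumed) k, the smaller-sooner reward wins, exactly the pattern the $10/$15 example describes.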

We can leverage this understanding by planning our rewards and recognition as early in the process as possible. Deciding pay in the annual appraisal cycle, and rewards in general, is far less tempting than receiving a reward as soon as something substantial is accomplished. Timely rewards motivate employees to attempt big things and take risks (if that is what we want as leaders).

4. You Judge an Entire Experience by Its Peak and Its End

Our memory of an experience isn’t a faithful recording of every moment. Instead, we rely on a mental shortcut called the Peak-End Rule. Our brain forms a lasting impression based almost entirely on two things: the most emotionally intense moment (the peak) and the final moment (the end). The duration of the experience hardly matters at all.

Nobel Laureate Daniel Kahneman demonstrated this with a classic experiment involving ice water.

  • In one trial, participants immersed a hand in 14°C water for 60 seconds.
  • In a second trial, the same participants endured 60 seconds in 14°C water, followed by an additional 30 seconds in slightly warmer water.

Surprisingly, participants remembered the second trial, which kept them in cold water for a longer total time (90 seconds vs. 60 seconds), as less painful, and most chose to repeat it. Why? Because the experience had a better “end.” The less-painful final moments reshaped their entire memory of the event.
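
A toy model makes the peak-end rule concrete: remembered pain tracks the average of the worst moment and the final moment, ignoring duration. The pain samples below are invented for illustration, not data from the study:

```python
# Peak-end sketch: memory ~ average of the peak and the end, not the total.
# Pain levels are invented illustrative values on a 0-10 scale.
def remembered_pain(samples: list[float]) -> float:
    """Peak-end estimate of how painful an episode is remembered to be."""
    return (max(samples) + samples[-1]) / 2

short_trial = [8, 8, 8]                    # shorter trial: all cold
long_trial = [8, 8, 8, 8, 8, 8, 5, 5, 5]   # longer trial: ends slightly warmer
print(remembered_pain(short_trial))  # 8.0
print(remembered_pain(long_trial))   # 6.5
```

Even though the longer trial contains strictly more total discomfort, its gentler ending drags the remembered score down.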

The implications are profound. Businesses can’t perfect every second of a customer’s journey, but they can focus on creating a memorable peak and ensuring the experience finishes on a high note. A fantastic dessert can save a mediocre meal, and a simple, friendly checkout process can leave a lasting positive impression, all because our memories don’t tell the whole story.

As leaders, we can plan these peak and end moments across the employee lifecycle: hiring, onboarding, learning, delivering, and offboarding.

All we need to do is to pick one thing in each process and make it emotionally memorable for the employees. And make sure that we end it in a memorable way.

The same thing could be planned for customers and partners as well. This way, we ensure that people remember their engagement with us for a long time.

5. You’re Powerfully Influenced by What Society Approves Of

We are all instinctively inclined to conform, but not all social influence is created equal. Behavioral science distinguishes between two types of Social Norms:

  • Descriptive norms: What people actually do.
  • Injunctive norms: What society approves of people doing.

It turns out that appealing to our shared ideals is far more powerful than describing our flawed reality.

Researchers discovered this in an experiment at Petrified Forest National Park, which was struggling with the theft of petrified wood.

They tested two signs:

  1. The first sign used a descriptive norm: “Some people steal wood from the park, but please don’t do it.” The theft rate was 7.92 percent.
  2. The second sign used an injunctive norm: “Please don’t remove wood from the forest so we can all enjoy its natural state.” The theft rate plummeted to 1.67 percent.

The first sign accidentally normalized the bad behavior, while the second appealed to a shared value—preserving nature for everyone. This insight shows why simply telling people what others do can backfire. This is why brands that build their vision on shared ideals—injunctive norms—are far more persuasive.

This is also why it is crucial for us as leaders to be clear about what kind of behavior is acceptable, what will be tolerated, and what is unacceptable. We will always get the behavior that we tolerate, and bad behavior spreads faster than good behavior. Let's be cognizant of the social norms we support within our teams.

Conclusion: Building a Life on Human Truth

These principles from behavioral economics show that our decisions are guided by predictable psychological patterns that often defy pure logic.

We are not the perfectly rational beings of traditional economic theory. Instead, we are humans, shaped by framing, driven by the fear of loss, tempted by the present, and influenced by our peers and our memories.

Understanding these forces is the first step toward making better choices for ourselves and designing more human-centric systems for others.

The true power of these principles is unleashed when we mix and match them. For example, when we combine framing with loss aversion and add social norms, the choice becomes fairly straightforward for the person making it.

This is called Choice Architecture. If we constantly expect people, whether customers, partners, employees, or vendors, to make choices, then understanding and mastering choice architecture can yield significant benefits.

We become leaders worth following when we learn these principles and use them not for our personal benefits but for the benefit of the team that we lead and the larger good.

PS: Here are some of the other biases that the field of behavioral economics has identified, for your reference.

Biases Related to Value, Loss, and Ownership

• Loss Aversion: We are not neutral about gains and losses; we feel the pain of a loss about twice as intensely as the pleasure of an equivalent gain. This asymmetry makes us risk-averse, often rejecting favorable gambles to avoid the possibility of losing.

• The Sunk Cost Fallacy: This is the tendency to continue an endeavor once an investment in money, effort, or time has been made, even when it is no longer rational to do so. A classic example is driving through a blizzard to see a basketball game simply because you paid for the tickets, whereas you would stay home if the tickets had been free. It is driven by a desire to avoid admitting that the money is already gone.

• The Disposition Effect: In investing, people tend to sell winning stocks too early (to score a success) and hold onto losing stocks too long (to avoid admitting failure). This behavior is financially damaging but psychologically comforting.

• Mental Accounting: We treat money differently depending on its source or intended use, rather than viewing our wealth as a single pool. For example, people will keep money in a low-interest savings account (a “vacation” bucket) while simultaneously holding high-interest credit card debt.

• Status Quo Bias and Sticky Defaults (close relatives of the endowment effect): People often stick with default options (like a standard retirement plan or organ donation status) because making an active change requires effort and decision-making. Defaults are “sticky” because moving away from them feels like a loss or a risk.

Biases Related to Judgment and Perception

• Hindsight Bias: After an event occurs (like a financial crash or an election result), we immediately create a narrative that makes the event seem inevitable and predictable. This illusion fosters a false confidence that we understand the world and ignores the role of chance.

• Confirmation Bias: Once we form a belief or an impression, we filter new information to support it. For example, if a hotel guest has a bad check-in experience, they may actively look for other faults (like a “crap” ice sculpture) to confirm their negative narrative.

• Focusing Illusion: “Nothing in life is as important as you think it is while you are thinking about it”. We tend to exaggerate the impact of a specific change (like buying a luxury car or moving to a sunnier climate) on our overall happiness because we focus narrowly on that one aspect.

• WYSIATI (What You See Is All There Is): We judge situations based only on the information immediately in front of us, ignoring what we do not know. System 1 constructs the best possible story from available evidence, regardless of its quality or quantity.

• The Peak-End Rule: We do not evaluate an experience by its total duration. Instead, our memory is disproportionately shaped by the most intense moment (the peak) and how the experience ended.

Biases in Planning and Probability

• The Planning Fallacy: We tend to underestimate the time, costs, and risks of future tasks. Even experts who know the statistical failure rates of similar projects often fail to apply those statistics to their own projects, assuming their specific case will be different.

• Overconfidence: Confidence is often a measure of the coherence of the story we tell ourselves, not the accuracy of our evidence. We often trust our intuition in environments that are too chaotic to predict (like the stock market or long-term geopolitics).

• Narrow Framing: We view decisions in isolation rather than as part of a portfolio. For example, rejecting a single gamble with a positive expected value because of fear of loss, even though accepting a series of such gambles would virtually guarantee a win.

• Availability Heuristic: We judge the probability of an event by how easily examples come to mind. For instance, people wrongly believe homicides are more common than suicides because homicides are more frequently reported in the news and thus more “available” to memory.

• Attribute Substitution: When faced with a difficult question (e.g., “How happy are you with your life?”), we unknowingly substitute it with an easier one (e.g., “How many dates did I have last month?”). This leads to judgments that answer the wrong question.

• Certainty Effect: We pay a disproportionately high premium for 100% certainty (zero risk) compared to a statistically equivalent reduction in risk (e.g., moving from 5% risk to 1% risk).

Biases in Systems and Markets

• The Winner’s Curse: In auctions or competitive bidding (like for oil fields), the winner is usually the person who was the most optimistic (and likely overestimated the value). Consequently, the winner often overpays and loses money.

• Quantification Bias: The tendency to prioritize metrics that are easy to measure (cost, speed, efficiency) over psychological or human factors that are harder to quantify (trust, mood, experience). This can lead to decisions that look efficient on a spreadsheet but destroy value in the real world.

• The Doorman Fallacy: A specific type of quantification bias where a role is defined too narrowly (e.g., a doorman just “opens the door”), leading to automation that eliminates hidden value (security, recognition, signaling status) provided by the human.