Incentives: Show Me the Reward and I'll Show You the Behavior

Introduction
Incentives are one of the most reliable mental models for understanding behavior. If you want to know why people act the way they do, look at what gets rewarded, what gets punished, what becomes easy, and what becomes costly.
The basic idea is simple: incentives shape behavior. People may talk about values, goals, culture, mission, and principles, but their actions usually move toward the reward structure around them.
This does not mean people are selfish machines. It means humans adapt. A salesperson paid only on short-term revenue will tend to push for short-term revenue. A student rewarded only for test scores will tend to optimize for test scores. A social platform that rewards outrage with attention will tend to produce more outrage. A company that promotes people for visible busyness will get more visible busyness.
Incentives matter because they reveal the real operating system behind a decision. They show you why good intentions often produce disappointing outcomes and why bad systems can make reasonable people behave badly.
The useful question is not only "What does this person want?" The deeper question is "What does this environment reward?"
What Are Incentives?
Incentives are rewards, penalties, pressures, or conditions that make certain behaviors more likely.
Some incentives are obvious. Money is an incentive. A bonus, commission, promotion, fine, grade, tax, prize, or deadline can clearly influence behavior.
Other incentives are quieter. Status is an incentive. Convenience is an incentive. Approval is an incentive. Avoiding embarrassment is an incentive. Belonging to a group is an incentive. Keeping your job, protecting your identity, reducing effort, or avoiding conflict can all shape what people do.
In practical terms, incentives answer three questions:
- What behavior is rewarded?
- What behavior is punished?
- What behavior is ignored?
That third question matters more than people expect. If a system says quality matters but only rewards speed, speed wins. If a company says collaboration matters but promotes individual heroes, individual heroics win. If a person says health matters but makes junk food easy and exercise inconvenient, convenience wins.
Incentives are not always formal. A team may have no written rule against disagreement, but if people who challenge leadership get excluded from important conversations, the incentive is clear. A household may say everyone should help, but if one person always fixes the mess without complaint, the system rewards the others for waiting.
The mental model helps you see behavior as a response to structure, not just personality.
Why Incentives Matter
Incentives matter because they often beat intentions.
Most people prefer to think of behavior as the result of character. Someone works hard because they are disciplined. Someone cuts corners because they are lazy. Someone speaks up because they are brave. Someone stays silent because they are weak.
Character matters, but incentives explain a lot of what character alone misses. Put the same person in a different reward system and their behavior may change quickly.
A thoughtful employee may become defensive if every mistake is punished publicly. A careful doctor may rush through appointments if the system rewards patient volume above diagnostic quality. A responsible manager may choose short-term numbers over long-term health if promotion depends on the quarter. A creator may become shallow if the algorithm rewards speed, conflict, and repetition.
This is why incentives are central to business, education, politics, management, economics, product design, and personal habits. Any system that rewards behavior will get more of that behavior.
The problem is that incentives often create side effects. You design a reward for one outcome, but people optimize around the reward itself. Sometimes that improves the real outcome. Sometimes it distorts it.
This connects closely to Goodhart's Law. Once a measure becomes a target, people adapt to the measure. Incentives are the force that makes the adaptation happen.
How Incentives Work
Incentives work by changing the cost and reward of behavior. They do not need to control people directly. They only need to make one path more attractive than another.
1. Incentives direct attention
People pay attention to what affects their rewards.
If a company rewards sales calls, salespeople count calls. If a school rewards exam scores, students and teachers focus on exam material. If a social platform rewards engagement, creators study what gets clicks, comments, and shares.
This can be useful. Clear incentives can focus effort on important goals. But they can also narrow attention until people ignore everything outside the reward.
The sales team may ignore customer fit. The student may forget the subject after the test. The creator may sacrifice depth for reaction. The number goes up, but the real objective may not.
2. Incentives reduce friction for some behaviors
Behavior follows ease as much as desire.
If healthy food is visible, affordable, and quick to prepare, eating better becomes easier. If expense reports are painful, people delay them. If a company makes it simple to report security issues, more issues get reported. If the reporting process is confusing or punishing, people stay quiet.
An incentive is not only a reward at the end. It can be the whole path. The easier path becomes the more likely path.
This is why good design often works better than motivation. You can tell yourself to read more, but putting a book on your pillow changes the immediate incentive. You can tell a team to document decisions, but making documentation part of the workflow changes the friction.
3. Incentives create social pressure
People respond to social rewards and social penalties.
If a workplace praises people who stay late, staying late becomes a status signal. If a group mocks careful thinking as overthinking, speed becomes socially rewarded. If a community respects long-term craftsmanship, patient work becomes easier to sustain.
Social incentives are powerful because humans care about belonging. Many decisions are not just about money or rules. They are about what will make you respected, included, trusted, admired, or left alone.
This connects to social proof. When people see what others are rewarded for, they learn what behavior belongs in the system.
4. Incentives compound over time
Small rewards can create large behavioral patterns when repeated.
If a company repeatedly rewards urgent visible work over quiet preventive work, the culture will slowly drift toward urgency. If a person repeatedly rewards themselves with distraction after every difficult moment, distraction becomes the default escape. If a platform repeatedly rewards extreme content, the content ecosystem becomes more extreme.
Incentives compound because each rewarded behavior teaches the system what to do next. The lesson may be stated or unspoken, but the pattern builds.
This is why incentive design deserves care. A small misalignment today can become a large cultural, personal, or strategic problem later.
A Simple Example: The Sales Commission
Imagine a software company gives its sales team a commission for every new contract signed. The goal is reasonable: the company wants more revenue.
At first, the incentive works. Salespeople make more calls, follow up faster, and close more deals.
Then side effects appear. Some salespeople start promising features the product does not have. Others target customers who are easy to close but poor fits. The support team gets overloaded. Customer churn rises. Product managers spend time explaining why promised features cannot appear immediately. Revenue looks strong for a quarter, but trust declines.
The problem is not that salespeople are bad. The problem is that the incentive rewarded closing, not durable customer success.
A better incentive might include revenue quality, retention, customer fit, implementation success, and feedback from account managers. That does not make the system perfect, but it moves the reward closer to the real objective.
The model gives you a practical lesson: before blaming people, inspect the reward structure.
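The commission example can be sketched as a toy model. All numbers here are hypothetical, chosen only to illustrate the mechanism: each customer type has a close rate and a retention rate, and the incentive design decides how much of the commission depends on the customer actually staying.

```python
# Toy model of the sales-commission example. The customer profiles and
# rates are invented for illustration, not real data.
customer_types = {
    "easy_close_poor_fit": {"close_rate": 0.5, "retention": 0.2},
    "good_fit":            {"close_rate": 0.3, "retention": 0.9},
}

def expected_reward(profile, commission=1.0, retention_weight=0.0):
    """Expected payout per pitch under a given incentive design.

    retention_weight = 0.0 models "pay on signing only"; higher values
    tie that share of the commission to the customer staying.
    """
    paid_share = (1 - retention_weight) + retention_weight * profile["retention"]
    return profile["close_rate"] * commission * paid_share

def best_target(retention_weight):
    """Which customer type a reward-maximizing seller would chase."""
    return max(customer_types,
               key=lambda name: expected_reward(customer_types[name],
                                                retention_weight=retention_weight))

print(best_target(retention_weight=0.0))  # pay on closing only -> easy_close_poor_fit
print(best_target(retention_weight=0.8))  # pay mostly on retention -> good_fit
```

Under these made-up numbers, paying purely on signings makes the easy-to-close, poor-fit customer the rational target; shifting most of the commission onto retention flips the answer. The salesperson's character never changes, only the reward structure does.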
Real-World Examples of Incentives
Incentives become easier to see when you look across different systems.
Business
Companies often say they want long-term thinking, but reward short-term performance. If bonuses, promotions, and praise depend mostly on quarterly numbers, people will protect quarterly numbers.
This can lead to underinvestment, customer neglect, accounting games, or rushed decisions. The stated value may be patience, but the practical incentive is speed.
The same pattern appears in hiring. If recruiters are measured only on time-to-fill, they may push candidates through quickly. If they are measured only on cost, quality may fall. If hiring managers are rewarded for building strong teams over time, the behavior changes.
Education
Grades are incentives. They can motivate effort, clarify standards, and help students track progress. But they can also shift attention from learning to score maximization.
A student who asks "Will this be on the test?" is responding to the incentive system. That question may sound narrow, but it is rational in an environment where grades carry more consequences than curiosity.
If the goal is durable understanding, the system needs incentives for writing, discussion, application, revision, and genuine mastery, not only correct answers under exam conditions.
Public policy
Policy incentives often create behavior at scale.
Taxes, subsidies, fines, permits, welfare rules, zoning laws, and liability standards all shape choices. A subsidy can encourage investment. A tax can reduce an activity. A fine can discourage harmful behavior. But every policy also creates adaptation.
If a city fines companies for late deliveries but makes legal unloading zones scarce, drivers may double-park anyway. If a regulation rewards checking boxes over solving problems, organizations will become skilled at checking boxes.
Policy design is difficult because people do not simply obey rules. They respond to the whole system of costs, rewards, loopholes, enforcement, and social norms.
Personal habits
Your own behavior also follows incentives.
If your phone is next to your bed, scrolling has a low-friction reward. If exercise requires a long setup, skipping it becomes easier. If you keep snacks visible and vegetables hidden, your kitchen has made a choice before you have.
Personal discipline helps, but environment design is often stronger. You can change incentives by making good behavior easier, bad behavior harder, and progress more visible.
For example, if you want to write every morning, open the document before you sleep, put your phone in another room, and decide the first sentence in advance. You have changed the immediate reward structure. Writing now has less friction than avoidance.
Online platforms
Digital platforms are incentive machines. Likes, shares, views, comments, follower counts, streaks, rankings, and notifications all shape behavior.
Creators learn what the platform rewards. Users learn what gets attention. Companies learn which metrics investors care about. Over time, the platform produces more of the behavior its incentives amplify.
If thoughtful work is rewarded, thoughtful work grows. If outrage is rewarded, outrage grows. If addictive loops are rewarded, addictive loops improve.
The platform may claim to value community, knowledge, or creativity. The real question is what the interface and algorithm reward every day.
Good Incentives vs. Bad Incentives
Incentives are not good or bad by themselves. Their quality depends on what they produce.
A good incentive aligns rewarded behavior with the real objective. A bad incentive rewards behavior that looks successful while damaging the larger system.
Here is a simple comparison:
| Question | Good Incentive | Bad Incentive |
|---|---|---|
| What does it reward? | The real outcome or a close proxy | A narrow metric that is easy to game |
| What happens over time? | The system becomes healthier | The system looks better while getting worse |
| How do people adapt? | They improve the underlying behavior | They optimize around the reward |
| What does it ignore? | Few critical tradeoffs | Quality, trust, learning, or long-term cost |
This is why incentive design requires second-order thinking. You need to ask not only "Will this reward increase the desired behavior?" but also "What else will people do to obtain the reward?"
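The "optimize around the reward" failure mode can be shown in a minimal sketch. The payoff numbers are assumptions for illustration: an agent splits one unit of effort between real work and gaming a proxy metric, where gaming happens to score better per unit of effort.

```python
# Minimal Goodhart-style sketch with hypothetical payoffs: the proxy
# metric pays more per unit of gaming effort than per unit of real work.
def measured_score(real_effort, gaming_effort):
    return real_effort + 1.5 * gaming_effort  # proxy overvalues gaming

def true_value(real_effort, gaming_effort):
    return real_effort  # only real work moves the actual objective

# All ways to split one unit of effort, in steps of 0.1.
splits = [(r / 10, 1 - r / 10) for r in range(11)]

# A reward-maximizing agent picks the split with the best measured score.
best = max(splits, key=lambda s: measured_score(*s))
print(best)               # (0.0, 1.0): all effort goes into gaming
print(true_value(*best))  # 0.0: the metric is maxed, the objective is not
```

The agent is behaving rationally given the rules; the defect is in the proxy. Second-order thinking means checking, before deploying the reward, what the measured-score-maximizing strategy actually is.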
Common Mistakes When Thinking About Incentives
The incentives mental model is simple, but it is easy to misuse.
Mistake 1: Assuming incentives are only financial
Money matters, but it is not the only motivator. Status, autonomy, convenience, identity, fairness, fear, belonging, and reputation can be just as powerful.
A volunteer community may have no salaries, but it still has incentives. People may contribute for recognition, purpose, friendship, learning, or influence.
Mistake 2: Listening to stated goals instead of observed rewards
Organizations often describe the behavior they want. The reward system reveals the behavior they actually encourage.
If a company says "quality matters" but celebrates only speed, speed is the incentive. If a person says "family matters" but always rewards work with the best hours of the day, work has the stronger incentive.
Look at what happens after the behavior, not only at what people say before it.
Mistake 3: Ignoring hidden penalties
Sometimes the official incentive is positive, but the hidden penalty points the other way.
A manager may say, "Please bring me bad news early." But if every bearer of bad news gets blamed, people will delay bad news. The stated incentive rewards honesty. The hidden penalty punishes it.
This is common in teams. If people are punished for mistakes, they will hide mistakes. If people are punished for disagreement, they will perform agreement. If people are punished for asking basic questions, they will pretend to understand.
Mistake 4: Optimizing one incentive in isolation
Most systems contain many incentives at once. A bonus may reward one thing, while status rewards another and convenience rewards a third.
For example, a company may pay people for long-term performance but praise those who respond instantly to every message. The formal incentive says patience. The social incentive says interruption.
When incentives conflict, behavior often follows the incentive that is most immediate, visible, and emotionally charged.
Mistake 5: Blaming individuals for system behavior
Sometimes people are responsible for bad choices. But if many people inside the same system behave similarly, the system deserves inspection.
Repeated behavior is evidence. If every team cuts corners near deadlines, maybe the planning system rewards unrealistic commitments. If every customer support agent rushes calls, maybe the metric punishes depth. If every student forgets material after the exam, maybe the system rewards passing more than understanding.
Before you conclude "people are the problem," ask what the system teaches them to do.
How to Apply the Incentives Mental Model
You can use incentives as a practical diagnostic tool.
Start with observed behavior
Do not begin with what people claim to value. Begin with what they repeatedly do.
Ask:
- What behavior keeps showing up?
- Who benefits from this behavior?
- What costs does it avoid?
- What reward does it create?
- What would happen if someone did the opposite?
Behavior is usually more honest than language.
Separate the official incentive from the real incentive
The official incentive is what the system says it rewards. The real incentive is what people learn through experience.
For example, a company may officially reward innovation. But if failed experiments damage careers, the real incentive is caution. A team may officially reward transparency. But if uncomfortable information creates punishment, the real incentive is silence.
The gap between official and real incentives is where many cultural problems live.
Look for second-order effects
Every incentive creates adaptation. Ask what people will do after they understand the rules.
If you reward speed, what happens to quality? If you reward volume, what happens to judgment? If you reward visible output, what happens to quiet maintenance? If you reward agreement, what happens to truth?
This is not cynicism. It is design hygiene.
Reward the behavior you actually want
If you want long-term thinking, reward long-term results. If you want honesty, protect people who tell uncomfortable truths. If you want learning, reward revision and curiosity, not only correct answers. If you want healthy habits, make healthy actions easy and visible.
The closer the reward is to the real outcome, the stronger the system becomes.
Review incentives regularly
Incentives can drift. A metric that once worked can become gamed. A reward that once motivated useful effort can become a status game. A rule that once prevented harm can become a box-checking ritual.
Review the system after people have adapted to it. The question is not "Did this incentive make sense when we created it?" The question is "What behavior is it producing now?"
A Practical Incentive Checklist
Use this checklist when you want to understand a decision, team, policy, product, or habit:
- What behavior is being rewarded?
- What behavior is being punished?
- What behavior is being ignored?
- Are the rewards financial, social, emotional, or practical?
- What hidden penalties might stop the desired behavior?
- What would a smart person do if they wanted the reward with the least effort?
- What long-term cost might appear if this incentive works too well?
- Does the incentive support the real objective or only a convenient proxy?
The most useful incentives question is often the simplest: "What would this system make a reasonable person do?"
Final Thoughts
Incentives explain why behavior often follows rewards more than speeches, values, or intentions. They help you see the structure behind repeated outcomes. If you want to understand a system, look at what it rewards. If you want to improve a system, change what it rewards.
If you want a deeper framework for using mental models in everyday decisions, 100 Mental Models expands on these ideas in a broader and more practical way.
Key Takeaways
- Incentives shape behavior by making some actions more rewarding, easier, safer, or more socially acceptable than others.
- Bad incentives often create bad outcomes even when the people inside the system have good intentions.
- You can use the incentives mental model by asking what behavior is rewarded, what is punished, and what hidden tradeoffs the system creates.
Quick Q&A
What are incentives in simple terms?
Incentives are rewards, penalties, pressures, or signals that make people more likely to behave in certain ways.
How do you use incentives to make better decisions?
Look past stated goals and ask what the system actually rewards, because people usually adapt to the incentives around them.