Defining Probability
Why we overestimate rare events
Probability Fallacies
There are many common fallacies we humans fall for every day. I’ve always wanted to examine some of these scenarios in more detail, to better understand not only how probability works, but how people think about probability, and where we so often go wrong.
To start off, when I give a figure like “the chance of getting in a car wreck this year is 1 in 17,” almost nobody stops to ask what that number actually means. It says that if you take the total number of wrecks in a year and divide it by the number of drivers, you get 1 in 17. The easy mistake is to conclude that your personal likelihood of getting in a wreck is therefore 1 in 17. That doesn’t follow. If you are more alert on the road than your fellow human, have better eyesight, drive more carefully, and always pay attention, your odds move from 1 in 17 to something more like 1 in several hundred. If you are older, happen to be of a particular gender, or aren’t very aware of your surroundings, that likelihood is more like 1 in 5.
These figures need to reflect the individual. Drivers who are generally more unaware or who use their phone while driving are more likely to crash. The type of car you drive matters too—vehicles with advanced safety features like automatic braking cut your risk significantly. Averages hide these truths. Take a teen driver texting in a beat-up sedan versus a middle-aged commuter in a Tesla with autopilot. Their risks aren’t remotely the same, yet we lean on that 1 in 17 figure like it’s gospel. This is where we start screwing up probability.
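A quick sketch makes the point. The group shares and per-group crash rates below are invented for illustration, but they show how a single population average can sit nowhere near any individual group’s actual risk:

```python
# Hypothetical illustration: a "1 in 17" population average hiding very
# different per-group risks. All shares and rates below are made up.
groups = {
    # name: (share of drivers, annual crash probability)
    "distracted teen":        (0.10, 0.200),  # ~1 in 5
    "average commuter":       (0.80, 0.047),  # ~1 in 21
    "careful, safe vehicle":  (0.10, 0.004),  # ~1 in 250
}

# The population average is just the share-weighted mean of group risks.
overall = sum(share * p for share, p in groups.values())
print(f"population average: 1 in {1 / overall:.0f}")
for name, (share, p) in groups.items():
    print(f"{name}: 1 in {1 / p:.0f}")
```

The weighted average lands near 1 in 17, yet no single group actually faces 1-in-17 odds. That’s the gap between a headline statistic and your risk.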
Another trap is the gambler’s fallacy. People think past random events change future odds. Flip a coin five times, get heads each time, and suddenly everyone’s betting tails is “due.” Wrong. The chance of tails is still 50/50, every single flip. The math doesn’t care about your streak. People make this mistake constantly—lottery players chasing “hot” numbers, or stock traders thinking a rally has to crash because it’s gone on too long. Our brains are wired to see patterns where none exist, and it costs us. A coin tossed 10 times has a 1 in 1024 chance of landing heads every time, but the next flip? Still 50/50.
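You can watch independence in action with a simulation. The sketch below flips a fair coin many times, finds every streak of five heads, and checks what the next flip did. Exact figures depend on the random seed, but the post-streak heads rate hovers right around 50%:

```python
import random

random.seed(0)
flips = [random.random() < 0.5 for _ in range(200_000)]  # True = heads

# After every streak of 5 heads, record what the NEXT flip was.
next_after_streak = [
    flips[i + 5]
    for i in range(len(flips) - 5)
    if all(flips[i:i + 5])
]

print(f"streaks of 5 heads found: {len(next_after_streak)}")
print(f"heads rate on the flip after a streak: "
      f"{sum(next_after_streak) / len(next_after_streak):.3f}")

# And ten heads in a row really is 1 in 1024: (1/2)^10.
print(0.5 ** 10 == 1 / 1024)
```

No matter how you slice the data, the coin never remembers its streak.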
This fallacy is everywhere, and we all fall for it. Gamblers double down after losses, thinking they’re “due” a win. Weather watchers assume no rain tomorrow because it poured today. In sports betting, people back a team to lose after a winning streak, ignoring their actual skill. The fix is simple but hard: accept randomness. Independent events don’t care about history. If you can’t grasp that, you’re bleeding money or sanity.
Then there’s the base rate fallacy. We ignore big-picture stats for shiny, specific details. Take the classic cab problem: in a town where 85% of cabs are green and 15% are blue, a cab causes an accident at night, and a witness says it was blue. Tested under similar conditions, the witness is right 80% of the time. Most people bet heavily on the blue cab being guilty, forgetting that only 15% of cabs are blue to begin with. Run the numbers and the cab is still more likely green—the witness’s 80% accuracy can’t overcome the 85% base rate. This happens in medicine—someone tests positive for a rare disease, and panic sets in, even though false positives are common when the disease is rare. In hiring, we lean on stereotypes over actual data. Always check the base rate first, or you’re building decisions on sand.
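Both of those examples come down to one application of Bayes’ theorem. Here’s a minimal sketch; the cab numbers are the classic 85/15 version of the problem, and the disease figures (1-in-1000 prevalence, 99% sensitivity, 5% false positive rate) are assumed values for illustration:

```python
def posterior(prior, true_pos, false_pos):
    """Bayes' theorem: P(hypothesis | positive evidence)."""
    num = true_pos * prior
    return num / (num + false_pos * (1 - prior))

# Cab problem: 15% of cabs are blue, witness is right 80% of the time
# (so wrongly calls a green cab "blue" 20% of the time).
p_blue = posterior(prior=0.15, true_pos=0.80, false_pos=0.20)
print(f"P(cab was blue | witness says blue) = {p_blue:.2f}")  # ~0.41

# Rare disease: 1-in-1000 prevalence, 99% sensitive test, 5% false positives.
p_sick = posterior(prior=0.001, true_pos=0.99, false_pos=0.05)
print(f"P(disease | positive test) = {p_sick:.2f}")  # ~0.02
```

A 41% chance the cab was blue, and a roughly 2% chance the positive test means disease. The vivid evidence matters, but the base rate dominates when the event is rare.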
This base rate problem shows up everywhere. Airport security profiles people based on vague suspicions, ignoring how rare actual threats are. Shark attacks make headlines, so we avoid the ocean, despite a 1 in 3.7 million chance of getting bitten. (Fuck you Jaws franchise) The news makes things feel likely because they’re vivid, not because they’re common. Ask yourself: what’s the overall prevalence? If you skip that step, you’re not thinking—you’re reacting.
These fallacies come from our brains trying to simplify a complex world. We evolved for survival, not statistics, so we lean on stories over numbers. Noticing these common themes in your own thinking is the first step toward correcting for them.