You made money last month. Were you skilled, or did you get lucky? Profit alone cannot answer that question. Your calibration score can.
What Calibration Actually Means
A perfectly calibrated forecaster is one whose stated probabilities match reality: when they say something has a 70% chance of happening, it happens 70% of the time. When they say 30%, it happens 30% of the time. Across hundreds of predictions, their stated probabilities match actual frequencies.
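You can check your own calibration empirically by bucketing predictions by stated probability and comparing each bucket's average to its observed hit rate. Here is a minimal sketch in Python, assuming you store predictions as (probability, outcome) pairs; the function name and ten-bucket scheme are illustrative choices, not a standard:

```python
from collections import defaultdict

def calibration_table(predictions, n_buckets=10):
    """Compare stated probabilities to observed frequencies.
    `predictions` is a list of (prob, outcome) pairs, outcome 1 or 0."""
    buckets = defaultdict(list)
    for prob, outcome in predictions:
        # Map e.g. 0.70-0.79 into bucket 7; clamp prob == 1.0 into the top bucket.
        key = min(int(prob * n_buckets), n_buckets - 1)
        buckets[key].append((prob, outcome))
    rows = []
    for key in sorted(buckets):
        pairs = buckets[key]
        avg_stated = sum(p for p, _ in pairs) / len(pairs)
        observed = sum(o for _, o in pairs) / len(pairs)
        rows.append((avg_stated, observed, len(pairs)))
    return rows
```

If the observed frequency in each row tracks the average stated probability, you are well calibrated; rows where observed sits well below stated are where overconfidence lives.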
Most people are overconfident: they say 80% when the true probability is 65%. Some people are underconfident: they hedge towards 50% because they are afraid of being wrong. Both failures cost money in prediction markets.
The Brier Score
The Brier score is the standard calibration metric. It is the mean squared difference between your stated probability and the actual outcome (1 if the event happened, 0 if it did not). A perfect forecaster scores 0. A forecaster who says 50% for everything scores exactly 0.25, so a score above 0.25 means your probabilities carry less information than an unconditional coin flip. The lower your Brier score, the better your forecasts.
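In code the definition is a single expression. A minimal sketch, keeping the (probability, outcome) pairs from the example above:

```python
def brier_score(predictions):
    """Mean squared difference between stated probabilities and outcomes.
    `predictions` is a list of (prob, outcome) pairs, outcome 1 or 0."""
    return sum((p - o) ** 2 for p, o in predictions) / len(predictions)

# Saying 70% on an event that happens costs (0.7 - 1)^2 = 0.09;
# saying 50% on anything costs (0.5 - 1)^2 = (0.5 - 0)^2 = 0.25.
```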
Track your Brier score separately for each domain you trade. Domain-specific calibration tells you where your skill is real and where you are fooling yourself.
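Continuing the sketch above, one way to split the score by domain, with a domain tag added to each record (the tuple layout is an assumption of this example):

```python
from collections import defaultdict

def brier_by_domain(records):
    """Per-domain Brier scores.
    `records` is a list of (domain, prob, outcome) tuples."""
    by_domain = defaultdict(list)
    for domain, prob, outcome in records:
        by_domain[domain].append((prob, outcome))
    return {
        domain: sum((p - o) ** 2 for p, o in pairs) / len(pairs)
        for domain, pairs in by_domain.items()
    }
```

A hypothetical trader who scores 0.12 on politics but 0.24 on sports is, in effect, flipping coins on sports and should size those positions accordingly.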
How to Improve Calibration
- Keep a prediction log: write down your probability estimate before entering any position (see the sketch after this list)
- Review resolved markets monthly and compare stated probabilities to outcomes
- Look for systematic biases: do you overestimate underdog probabilities? Underestimate incumbents?
- Use reference class forecasting to anchor your initial estimate before adjusting for specifics
- Deliberately make predictions in low-stakes domains just to practice the calibration habit
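The prediction log in the first item does not need tooling; a CSV you append to before every trade is enough. A minimal sketch (the filename and column layout are assumptions, not a standard format):

```python
import csv
from datetime import date

LOG_PATH = "prediction_log.csv"  # illustrative filename

def log_prediction(market, probability, note=""):
    """Append one prediction before entering a position."""
    with open(LOG_PATH, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), market, probability, note])
```

At month's end, fill in outcomes for resolved markets and feed the resulting (probability, outcome) pairs to the brier_score and calibration_table sketches above.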