The hottest Calibration Substack posts right now

And their main takeaways
Mindful Modeler · 359 implied HN points · 06 Jun 23
  1. Predictions from machine learning models carry uncertainty, commonly categorized as aleatoric (inherent randomness in the data) or epistemic (the model's lack of knowledge).
  2. Defining and distinguishing aleatoric from epistemic uncertainty is difficult, because whether a factor counts as deterministic or random depends on how the problem is modeled.
  3. Conformal prediction methods capture both aleatoric and epistemic uncertainty, producing prediction intervals that reflect the model's uncertainty (see the sketch after this list).
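
As a hedged illustration of that last point, here is a minimal sketch of split conformal prediction for regression, assuming scikit-learn and NumPy are available; the model, dataset, and 90% coverage level are illustrative choices, not taken from the post:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Hypothetical data; any regression dataset works here.
X, y = make_regression(n_samples=2000, n_features=10, noise=10.0, random_state=0)
X_train, X_cal, y_train, y_cal = train_test_split(X, y, test_size=0.5, random_state=0)

model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# Nonconformity scores on the held-out calibration set: absolute residuals.
scores = np.abs(y_cal - model.predict(X_cal))

# Conformal quantile for 90% coverage, with the finite-sample correction.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction interval for a new point: point prediction +/- q.
# (Reusing a calibration point here purely for illustration.)
pred = model.predict(X_cal[:1])[0]
print(f"90% interval: [{pred - q:.2f}, {pred + q:.2f}]")
```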
Mindful Modeler · 239 implied HN points · 11 Oct 22
  1. Machine learning models often cannot express uncertainty, leading to overconfident and potentially inaccurate predictions.
  2. Conformal prediction is a useful method for quantifying uncertainty in predictive models, offering speed, model-agnosticism, and statistical coverage guarantees.
  3. Implementing conformal prediction requires a heuristic uncertainty score, which is calibrated on held-out data so that the stated uncertainty levels are reliable (a classification sketch follows this list).
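
A minimal sketch of the same recipe for classification, assuming the classifier's predicted probability serves as the heuristic score; the dataset and classifier are illustrative stand-ins, not from the post:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_cal, y_train, y_cal = train_test_split(X, y, test_size=0.5, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Heuristic nonconformity score: 1 - predicted probability of the true class.
cal_probs = clf.predict_proba(X_cal)
scores = 1.0 - cal_probs[np.arange(len(y_cal)), y_cal]

# Calibrate a threshold so prediction sets cover the true class ~90% of the time.
alpha = 0.1
n = len(scores)
qhat = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction set for a new point: every class whose score clears the threshold.
new_probs = clf.predict_proba(X_cal[:1])[0]
prediction_set = np.where(1.0 - new_probs <= qhat)[0]
print("Prediction set:", prediction_set)
```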
Mike’s Blog · 39 implied HN points · 07 Nov 23
  1. Calibration plots show how well calibrated forecasts are by plotting predicted probabilities against observed outcome frequencies.
  2. Sports betting markets are remarkably well calibrated, based on analysis of MLB, NFL, NBA, NHL, and NCAA data from 2016-2020.
  3. Implied probabilities from betting odds are normalized so that the probabilities within each game sum to 1 (removing the bookmaker's margin) before comparing prediction accuracy (see the sketch after this list).
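
A minimal sketch of both steps, assuming NumPy, Matplotlib, and scikit-learn; the odds and outcomes below are synthetic stand-ins, not the post's data:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.calibration import calibration_curve

rng = np.random.default_rng(0)

# Hypothetical decimal odds for the two sides of 5000 games. Raw implied
# probabilities (1/odds) include the bookmaker's margin and sum to > 1 per
# game, so normalize each game to sum to 1.
odds = rng.uniform(1.3, 4.0, size=(5000, 2))
raw = 1.0 / odds
implied = raw / raw.sum(axis=1, keepdims=True)

# Simulate outcomes as if the normalized probabilities were well calibrated.
home_prob = implied[:, 0]
home_won = rng.random(5000) < home_prob

# Calibration plot: binned predicted probability vs. actual win rate.
prob_true, prob_pred = calibration_curve(home_won, home_prob, n_bins=10)
plt.plot([0, 1], [0, 1], "k--", label="perfect calibration")
plt.plot(prob_pred, prob_true, "o-", label="implied probabilities")
plt.xlabel("predicted probability")
plt.ylabel("actual win rate")
plt.legend()
plt.show()
```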