The hottest Regression Substack posts right now

And their main takeaways
Category: Top Science Topics
arg min · 297 implied HN points · 04 Oct 24
  1. Using modularity, we can tackle many inverse problems by recasting them as convex optimization problems, letting us combine simple building blocks to solve complex ones.
  2. Linear models are a good approximation in many settings, and relying on them gives us tractable solutions to our inverse problems. We should keep in mind, though, that they don't always represent reality perfectly.
  3. Regression techniques like ordinary least squares and LASSO let us handle noise and sparse data effectively; tuning the regularization parameter balances goodness of fit against sparsity and model complexity (see the sketch below).
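
A minimal Python sketch of that trade-off, using scikit-learn and a synthetic sparse signal (both assumptions for illustration, not taken from the post):

    # Sketch: recovering a sparse signal with OLS vs. LASSO on a toy inverse problem.
    import numpy as np
    from sklearn.linear_model import LinearRegression, Lasso

    rng = np.random.default_rng(0)
    n, d = 100, 50
    A = rng.normal(size=(n, d))                 # forward/measurement operator
    x_true = np.zeros(d)
    x_true[:5] = rng.normal(size=5)             # only 5 nonzero coefficients -> sparse ground truth
    y = A @ x_true + 0.1 * rng.normal(size=n)   # noisy observations

    ols = LinearRegression(fit_intercept=False).fit(A, y)
    lasso = Lasso(alpha=0.1, fit_intercept=False).fit(A, y)  # alpha tunes the sparsity/accuracy balance

    print("nonzero OLS coefficients:  ", np.sum(np.abs(ols.coef_) > 1e-6))
    print("nonzero LASSO coefficients:", np.sum(np.abs(lasso.coef_) > 1e-6))

Increasing alpha drives more LASSO coefficients exactly to zero, which is the lever the takeaway refers to.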
Mindful Modeler · 279 implied HN points · 03 Jan 23
  1. In regression, conformal prediction can turn point predictions into prediction intervals with guarantees of future observation coverage.
  2. Two common ways to build such intervals are to start from point predictions or from the non-conformal intervals produced by quantile regression.
  3. Conformalized mean regression and conformalized quantile regression are the two corresponding techniques for generating prediction intervals in regression models (a sketch of the former follows below).
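
A minimal sketch of split-conformal mean regression in Python; the synthetic data and the choice of a random forest as the base model are assumptions for illustration, not from the post:

    # Sketch: split conformal prediction around a point regressor.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, size=(1000, 1))
    y = np.sin(X[:, 0]) + 0.3 * rng.normal(size=1000)

    X_train, X_cal, y_train, y_cal = train_test_split(X, y, test_size=0.5, random_state=0)
    model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

    # Calibration: absolute residuals on held-out data serve as conformity scores.
    scores = np.abs(y_cal - model.predict(X_cal))
    alpha = 0.1  # target 90% coverage
    q = np.quantile(scores, np.ceil((len(scores) + 1) * (1 - alpha)) / len(scores))

    # Prediction interval for a new point: point prediction +/- calibrated quantile.
    x_new = np.array([[0.5]])
    pred = model.predict(x_new)[0]
    print(f"interval: [{pred - q:.2f}, {pred + q:.2f}]")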
Mindful Modeler · 139 implied HN points · 25 Apr 23
  1. Log odds are additive while odds are multiplicative, so interpretation methods that express a prediction as a linear sum of effects work more naturally on the log-odds scale.
  2. Transitions near the edges of the probability scale, like 0.001 to 0.01, can matter more than transitions in the middle, like 0.5 to 0.6, and log odds make such edge changes visible.
  3. Probabilities remain the more intuitive scale for decision-making and cost calculations, and are far more familiar to most audiences than log odds (see the sketch below).
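
To make the two scales concrete, a small Python sketch with hypothetical numbers (not taken from the post):

    # Sketch: additive steps in log odds are multiplicative in odds,
    # and the same step moves the probability by different amounts at different points.
    import numpy as np

    def prob_to_logodds(p):
        return np.log(p / (1 - p))

    def logodds_to_prob(z):
        return 1 / (1 + np.exp(-z))

    effect = 1.0                                              # adding 1 to the log odds ...
    print(np.exp(effect))                                     # ... multiplies the odds by e^1 ~ 2.72

    print(logodds_to_prob(prob_to_logodds(0.5) + effect))     # 0.5   -> ~0.73
    print(logodds_to_prob(prob_to_logodds(0.001) + effect))   # 0.001 -> ~0.0027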
The Palindrome · 0 implied HN points · 05 Mar 24
  1. Real datasets often have multiple features, going beyond a single variable. Understanding how to handle multiple variables is crucial in machine learning.
  2. Linear regression can be generalized to handle multiple variables by using a regression coefficient vector and a bias term.
  3. The parameters of a multivariable linear regression model, a weight vector w and a bias b, define a d-dimensional hyperplane (the graph of ŷ = wᵀx + b), giving a straightforward map from feature vectors to target values (a minimal fitting sketch follows below).
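
A minimal NumPy sketch of fitting the weight vector and bias by least squares, on synthetic data assumed here for illustration:

    # Sketch: multivariable linear regression y ~ w.x + b via least squares.
    import numpy as np

    rng = np.random.default_rng(2)
    n, d = 200, 3
    X = rng.normal(size=(n, d))
    w_true, b_true = np.array([2.0, -1.0, 0.5]), 4.0
    y = X @ w_true + b_true + 0.1 * rng.normal(size=n)

    # Append a column of ones so the bias is estimated together with the weights.
    X1 = np.column_stack([X, np.ones(n)])
    theta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    w_hat, b_hat = theta[:d], theta[d]

    y_new = w_hat @ np.array([1.0, 0.0, -2.0]) + b_hat  # map a new feature vector to a prediction
    print(w_hat, b_hat, y_new)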