How to reduce cost of ranking by knowledge distillation
33 implied HN points • 06 Jan 24 • 🕹 Technology • AI • Machine Learning • Cost Optimization

Training an early ranker to mimic the final ranker can improve top-line metrics and reduce costs. Knowledge distillation involves training a student model, the early ranker, to learn from a teacher model, the final ranker. Implementing knowledge distillation through shared or auxiliary tasks can increase alignment between the early and final rankers.
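As a minimal sketch of that teacher-student idea, the PyTorch snippet below trains a small early ranker to blend a hard click-prediction loss with a loss that mimics a frozen final ranker's scores. The model sizes, the pointwise setup, and the `alpha` loss mix are illustrative assumptions, not the post's actual implementation:

```python
import torch
import torch.nn as nn

# Hypothetical setup: the "teacher" is a large final ranker, the "student"
# a lightweight early ranker; both map item features to a relevance score.
teacher = nn.Sequential(nn.Linear(64, 256), nn.ReLU(), nn.Linear(256, 1))
student = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 1))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()  # loss against ground-truth clicks
mse = nn.MSELoss()            # distillation loss against teacher scores

def distillation_step(features, clicks, alpha=0.5):
    """One training step: blend the hard-label loss with a distillation
    loss that pulls the student's scores toward the teacher's."""
    with torch.no_grad():              # teacher is frozen during distillation
        teacher_scores = teacher(features)
    student_scores = student(features)
    hard_loss = bce(student_scores, clicks)          # learn from labels
    soft_loss = mse(student_scores, teacher_scores)  # mimic the final ranker
    loss = alpha * hard_loss + (1 - alpha) * soft_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy batch: 128 items with 64-dim features and binary click labels.
features = torch.randn(128, 64)
clicks = torch.randint(0, 2, (128, 1)).float()
print(distillation_step(features, clicks))
```

The post's shared/auxiliary-task variant would add the distillation term as an extra head alongside the student's existing objectives rather than as a standalone loss.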
Reducing selection bias / popularity bias in ranking
26 implied HN points • 20 Jan 24 • 🕹 Technology • AI • Machine Learning • Data Analysis • Algorithms • Coding

Reducing selection bias and popularity bias is important for recommender systems. One advocated approach factorizes user interaction signals to account for biases introduced by power users and power items. Proposals for causal/debiased ranking combine factorization, mutual information, and mixtures of logits to improve the ranking model.
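Of the ingredients named above, the mixture of logits is the easiest to show in isolation. The sketch below is a hypothetical mixture-of-logits scoring head, where a (user, item) pair is scored as a gated combination of several dot-product components; the dimensions, gating input, and component form are all assumptions, and the post's actual formulation may differ:

```python
import torch
import torch.nn as nn

class MixtureOfLogits(nn.Module):
    """Scores a (user, item) pair as a gated mixture of K dot-product
    logits, one per learned embedding sub-space."""
    def __init__(self, dim=64, num_components=4):
        super().__init__()
        self.k = num_components
        # Project user and item into K separate sub-spaces.
        self.user_proj = nn.Linear(dim, dim * num_components)
        self.item_proj = nn.Linear(dim, dim * num_components)
        # Gate produces per-example weights over the K components.
        self.gate = nn.Linear(2 * dim, num_components)

    def forward(self, user, item):
        b, d = user.shape
        u = self.user_proj(user).view(b, self.k, d)
        v = self.item_proj(item).view(b, self.k, d)
        logits = (u * v).sum(-1)  # (b, K) per-component dot-product logits
        weights = torch.softmax(self.gate(torch.cat([user, item], -1)), -1)
        return (weights * logits).sum(-1)  # (b,) mixed relevance score

# Usage: score a toy batch of 8 user/item embedding pairs.
mol = MixtureOfLogits()
scores = mol(torch.randn(8, 64), torch.randn(8, 64))  # shape (8,)
```

The intuition for debiasing is that separate components can absorb globally popular behavior (power users, power items), leaving other components free to model genuine personalized relevance.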