
Optimal Unconstrained Self-Distillation in Ridge Regression: Strict Improvements, Precise Asymptotics, and One-Shot Tuning

Source: arXiv

Hien Dang, Pratik Patil, Alessandro Rinaldo

math.ST | cs.LG | stat.ML | Feb 19, 2026

One-line Summary

This paper shows that self-distillation can strictly improve ridge regression performance by optimally mixing the teacher's predictions with the observed labels, and provides precise asymptotic analyses along with a practical one-shot tuning method for the mixing weight.

Plain-language Overview

Self-distillation is a technique where a model (the student) is retrained using a combination of actual data labels and predictions made by the same model (the teacher). This study explores how self-distillation can enhance ridge regression, a common statistical method, by adjusting how much weight is given to the teacher's predictions. The researchers found that this approach can consistently improve the model's predictions, even in challenging scenarios. They also developed a practical method to determine the best way to mix these predictions without extensive trial and error, which was validated using real-world data.
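The procedure described above can be sketched in a few lines. This is a minimal illustration, not the paper's exact estimator: the function names, the fixed regularization level, and the example mixing weight `alpha` are all illustrative choices (the paper's "unconstrained" setting allows the mixing weight to be any real number, tuned optimally rather than fixed by hand).

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge solution: (X^T X + lam * I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def self_distilled_ridge(X, y, lam, alpha):
    """One round of self-distillation for ridge regression.

    alpha is the (illustrative) weight on the teacher's predictions;
    alpha = 0 recovers the ordinary ridge fit.
    """
    # Teacher: ordinary ridge fit on the observed labels.
    beta_teacher = ridge_fit(X, y, lam)
    # Mix the observed labels with the teacher's predictions.
    y_mixed = (1 - alpha) * y + alpha * (X @ beta_teacher)
    # Student: ridge fit on the mixed targets.
    return ridge_fit(X, y_mixed, lam)

# Synthetic example: linear signal plus noise.
rng = np.random.default_rng(0)
n, d = 200, 20
X = rng.standard_normal((n, d))
beta_true = rng.standard_normal(d)
y = X @ beta_true + rng.standard_normal(n)

beta_student = self_distilled_ridge(X, y, lam=1.0, alpha=0.3)
```

The key tuning question the paper addresses is how to choose `alpha` (and the ridge penalty) optimally in one shot, rather than by cross-validating over many candidate mixtures.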

Technical Details