Scalable Class-Incremental Learning Based on Parametric Neural Collapse

Source: arXiv

Chuangxin Zhang, Guangfeng Lin, Enhui Zhao, Kaiyang Liao, Yajun Chen

cs.CV, cs.LG | Dec 26, 2025

One-line Summary

The paper introduces SCL-PNC, a scalable class-incremental learning method that uses parametric neural collapse to efficiently expand models while maintaining feature consistency and addressing class misalignment.

Plain-language Overview

Incremental learning lets AI models take in new information over time without forgetting what they learned before, but it often struggles to balance new and old data. This research presents an approach called SCL-PNC that lets a model expand flexibly and efficiently as it encounters new categories while still retaining older knowledge. The method exploits a phenomenon called neural collapse, in which the features of each class cluster tightly around fixed, maximally separated class prototypes, to keep features aligned across the modules added at each learning stage, preserving consistency and accuracy as the model grows (see the sketch below). Experiments show the approach performs well in practical scenarios, making it a promising direction for scalable learning systems.
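The paper's exact construction is not spelled out in this summary, but neural-collapse-based incremental learners commonly fix the classifier to a simplex equiangular tight frame (ETF) of class prototypes and train features to align with it, so that old and new classes share one consistent geometry. The following is a minimal PyTorch sketch of that general idea, not the authors' implementation; `simplex_etf` and `etf_alignment_loss` are illustrative names introduced here.

```python
import torch
import torch.nn.functional as F

def simplex_etf(num_classes: int, feat_dim: int) -> torch.Tensor:
    """Build a fixed simplex equiangular tight frame (ETF): the class-prototype
    geometry that neural collapse predicts for well-trained classifiers.
    Returns a (num_classes, feat_dim) matrix of unit-norm, maximally
    separated prototype vectors."""
    assert feat_dim >= num_classes - 1, "ETF requires feat_dim >= K - 1"
    k = num_classes
    # Random orthonormal columns U of shape (feat_dim, K).
    u, _ = torch.linalg.qr(torch.randn(feat_dim, k))
    # M = sqrt(K/(K-1)) * U (I - 11^T / K): equal-norm, equiangular columns.
    center = torch.eye(k) - torch.ones(k, k) / k
    etf = (k / (k - 1)) ** 0.5 * (u @ center)
    return etf.t()  # one prototype per row

def etf_alignment_loss(features: torch.Tensor,
                       labels: torch.Tensor,
                       prototypes: torch.Tensor) -> torch.Tensor:
    """Pull each normalized feature toward its frozen class prototype
    (1 - cosine similarity). Keeping the prototypes fixed across tasks is
    one way to keep features from old and new classes mutually consistent
    in incremental learning."""
    feats = F.normalize(features, dim=1)
    target = prototypes[labels]
    return (1.0 - (feats * target).sum(dim=1)).mean()

# Toy usage: 10 classes with 64-dimensional features.
protos = simplex_etf(num_classes=10, feat_dim=64)
feats = torch.randn(32, 64, requires_grad=True)
labels = torch.randint(0, 10, (32,))
loss = etf_alignment_loss(feats, labels, protos)
loss.backward()
```

Because the prototypes are fixed in advance for all classes, newly added modules can be trained against the same targets as earlier ones, which is the sense in which this geometry addresses class misalignment as the model expands.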

Technical Details