
Comba: Improving Nonlinear RNNs with Closed-loop Control

arXiv Source

Jiaxi Hu, Yongqi Pan, Jusen Du, Disen Lan, Xiaqiang Tang, Qingsong Wen, Yuxuan Liang, Weigao Sun

cs.CL | Jun 3, 2025

One-line Summary

Comba is a new Nonlinear RNN variant that enhances performance and efficiency in sequence modeling by using closed-loop control theory and a scalar-plus-low-rank state transition.

Plain-language Overview

Researchers have developed a new type of recurrent neural network (RNN) called Comba that improves how these networks process sequences of data. Traditional RNNs often struggle to manage their memory effectively, especially on complex data. Comba draws on closed-loop control theory to regulate its memory through feedback mechanisms, which yields better performance and efficiency on language and vision tasks, as sketched in the example below.
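
To make the "scalar-plus-low-rank state transition with feedback" idea concrete, here is a minimal NumPy sketch of a generic recurrence in that style. The function name `splr_recurrence`, the gate arrays `alpha`/`beta`, and the exact update rule are illustrative assumptions for this sketch, not the paper's precise parameterization of Comba.

```python
# Illustrative sketch only: a generic scalar-plus-low-rank (SPLR) recurrence
# in the spirit of the summary above. Gates and update form are assumptions;
# Comba's exact formulation is given in the paper.
import numpy as np

def splr_recurrence(K, V, Q, alpha, beta):
    """Toy recurrent scan with a scalar-plus-low-rank state transition.

    K, V, Q : (T, d) key / value / query sequences.
    alpha   : (T,) scalar decay gates in (0, 1)   -- assumed parameterization
    beta    : (T,) write strengths in (0, 1)      -- assumed parameterization
    """
    T, d = K.shape
    S = np.zeros((d, d))           # matrix-valued memory state
    outputs = np.zeros((T, d))
    for t in range(T):
        k, v, q = K[t], V[t], Q[t]
        # Scalar decay (alpha) plus a low-rank correction (beta * S k k^T):
        # the state's own read-back S @ k is fed back before the new write,
        # which is the closed-loop flavour the overview describes.
        S = alpha[t] * (S - beta[t] * np.outer(S @ k, k)) + beta[t] * np.outer(v, k)
        outputs[t] = S @ q         # read out the state with the query
    return outputs

# Quick usage check with random inputs (shapes only).
T, d = 8, 4
rng = np.random.default_rng(0)
K, V, Q = (rng.standard_normal((T, d)) for _ in range(3))
alpha = np.full(T, 0.9)   # assumed constant decay for the demo
beta = np.full(T, 0.5)    # assumed constant write strength
print(splr_recurrence(K, V, Q, alpha, beta).shape)  # (8, 4)
```

The feedback appears in the `S @ k` term: before a new key-value pair is written, the state's current prediction for that key is read back and used as a correction, rather than the state being updated open-loop.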

Technical Details