

Noise Consistency Training: A Native Approach for One-Step Generator in Learning Additional Controls

Source: arXiv

Yihong Luo, Shuchen Xue, Tianyang Hu, Jing Tang

cs.LG | Jun 24, 2025

One-line Summary

Noise Consistency Training (NCT) adds new control signals to pre-trained one-step generators without retraining them from scratch, achieving state-of-the-art controllable generation at low computational cost.

Plain-language Overview

Generating high-quality content that can be easily controlled is a big challenge in AI. Traditional methods to add new controls to AI models are often slow and require a lot of computation. This paper introduces a new method called Noise Consistency Training (NCT), which allows existing AI models to learn new controls without needing to be retrained from scratch. NCT works by adding a small module to the model and using a special type of training that helps the model adapt to new conditions. This makes it faster and more efficient than older methods, while still producing high-quality results.
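The idea described above, keeping a pre-trained one-step generator frozen and training only a small added module so that a control's effect stays consistent across noise draws, can be illustrated with a toy sketch. This is a minimal conceptual analogue, not the paper's actual method: the generator is a fixed linear map, the "adapter" is a small trainable matrix injected into the noise input, and the training objective (a target shift that must hold regardless of the sampled noise) is a hypothetical stand-in for the paper's noise-consistency loss. All names and shapes here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
dz, dc = 4, 2  # noise dimension, control dimension (toy sizes)

# Frozen "pre-trained one-step generator": a fixed linear map noise -> sample.
W_gen = rng.normal(size=(dz, dz))

def generator(z):
    return z @ W_gen

# Lightweight trainable adapter that injects the control c into the noise.
# Only this matrix is trained; the generator weights stay frozen.
W_adapt = np.zeros((dc, dz))

def controlled_generator(z, c, W):
    return generator(z + c @ W)

# Hypothetical training target: each control c should shift the output by
# c @ T_target *independently of the noise draw* -- a toy stand-in for the
# requirement that the control's effect be consistent across noise samples.
T_target = rng.normal(size=(dc, dz))

def loss(W, n=512):
    z = rng.normal(size=(n, dz))
    c = rng.normal(size=(n, dc))
    resid = controlled_generator(z, c, W) - (generator(z) + c @ T_target)
    return float(np.mean(resid ** 2))

# Train only the adapter with plain gradient descent (analytic gradient
# for this linear toy model).
lr = 0.02
for _ in range(500):
    z = rng.normal(size=(64, dz))
    c = rng.normal(size=(64, dc))
    resid = controlled_generator(z, c, W_adapt) - (generator(z) + c @ T_target)
    grad = 2 * c.T @ (resid @ W_gen.T) / len(z)
    W_adapt -= lr * grad
```

After training, the adapted generator realizes the control's target effect while the base model is untouched, which mirrors the efficiency argument in the overview: only the small module is optimized, not the full generator.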

Technical Details