Yihong Luo, Shuchen Xue, Tianyang Hu, Jing Tang
Noise Consistency Training (NCT) is a method for adding new control signals to pre-trained one-step generators without retraining them, achieving state-of-the-art controllable generation at a fraction of the usual computational cost.
Generating high-quality content that can be precisely controlled is a central challenge in generative AI. Traditional approaches to adding new controls to a trained model are slow and computationally expensive, often requiring the base model to be retrained. This paper introduces Noise Consistency Training (NCT), which lets an existing one-step generator learn new controls without retraining from scratch: a small adapter module is attached to the model and trained with a consistency-based objective that adapts the model's inputs to the new condition. This makes NCT faster and more efficient than prior methods while still producing high-quality results.
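To give a flavor of the adapter-on-frozen-generator setup, here is a toy sketch in NumPy. Everything in it is illustrative: the "generator" is a fixed linear map standing in for a pre-trained one-step model, the adapter is a simple learned noise shift, and the training objective is a placeholder control loss plus a consistency-style penalty, not the paper's actual NCT loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen pre-trained one-step "generator" (toy stand-in: a fixed linear map;
# its weights are never updated during adaptation).
W = np.array([[1.0, 0.5], [0.0, 1.0]])

def generator(z):
    # One step: noise -> sample.
    return z @ W.T

# Small trainable adapter that shifts the input noise based on a condition c.
# (Hypothetical form; the paper's actual module is a learned network.)
theta = np.zeros(2)

def adapter(z, c):
    return z + c * theta  # conditioned noise z' = z + c * theta

lr, lam = 0.1, 0.01                 # learning rate, consistency weight (assumed)
target = np.array([2.0, -1.0])      # toy control signal: desired sample mean

for step in range(200):
    z = rng.standard_normal((64, 2))
    x = generator(adapter(z, 1.0))
    # Control loss: push the generated mean toward the target,
    # plus a consistency-style penalty keeping z' close to the original noise.
    err = x.mean(axis=0) - target
    grad = 2 * (W.T @ err) + 2 * lam * theta
    theta -= lr * grad              # only the adapter is updated

x = generator(adapter(rng.standard_normal((1000, 2)), 1.0))
print(np.round(x.mean(axis=0), 1))  # close to the target mean [2, -1]
```

The key design point mirrored here is that only the tiny adapter receives gradients, so adding a control is far cheaper than retraining the generator itself.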