
Predictive Consistency Learning | ICML 2025

Official implementation of ICML 2025 paper: "Generative Modeling Reinvents Supervised Learning: Label Repurposing with Predictive Consistency Learning".

Lay Summary:

In machine learning, labels are typically viewed as simple answers that models aim to predict from data. However, many real-world tasks involve complex labels that contain richer information beyond just a final answer. This work explores a fundamental question: when labels hold valuable information, can they be used to aid learning instead of merely serving as prediction targets?

We introduce a novel approach that unlocks this hidden value in labels by treating them not only as targets but also as informative references during training. Our method, Predictive Consistency Learning (PCL), inspired by generative consistency models, decomposes label information into a progressive learning process: in addition to the data input, the model receives a noise-perturbed label as a reference and pursues predictive consistency across different noise levels.

This strategy shows promise across diverse data types such as images, text, and graphs. By demonstrating the effectiveness of incorporating label information into the model input as a reference, this study opens new avenues for rethinking how labels are utilized in machine learning.

(Figure: method overview)

(Figure: inference procedure)
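
To make the idea concrete, the following is a minimal PyTorch sketch of a PCL-style setup for a regression task, under simplifying assumptions: Gaussian label perturbation, a single MLP, and equal loss weights. All names (`PCLModel`, `pcl_step`, `pcl_infer`) and the exact perturbation and consistency schedule are illustrative assumptions, not this repository's API; see the subdirectory code for the actual implementations.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PCLModel(nn.Module):
    """Toy predictor: sees the data x, a noise-perturbed label, and the noise level t."""

    def __init__(self, x_dim, y_dim, hidden=128):
        super().__init__()
        self.y_dim = y_dim
        self.net = nn.Sequential(
            nn.Linear(x_dim + y_dim + 1, hidden),
            nn.ReLU(),
            nn.Linear(hidden, y_dim),
        )

    def forward(self, x, y_noisy, t):
        # The noise level enters as one extra scalar feature per sample.
        return self.net(torch.cat([x, y_noisy, t], dim=-1))


def pcl_step(model, x, y, optimizer):
    """One training step: predict from two noise levels of the same perturbed
    label and pull the noisier prediction toward the cleaner one (consistency),
    while anchoring both predictions to the ground-truth label."""
    b = x.size(0)
    t1 = torch.rand(b, 1)                # lower noise level in (0, 1)
    t2 = torch.clamp(t1 + 0.1, max=1.0)  # adjacent, higher noise level
    noise = torch.randn_like(y)
    pred1 = model(x, y + t1 * noise, t1)
    pred2 = model(x, y + t2 * noise, t2)
    loss = (F.mse_loss(pred1, y) + F.mse_loss(pred2, y)
            + F.mse_loss(pred2, pred1.detach()))  # consistency across noise levels
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


@torch.no_grad()
def pcl_infer(model, x, steps=4):
    """Illustrative multi-step inference: begin from a pure-noise 'label' at the
    highest noise level and iteratively feed the model's own prediction back in
    as the reference at decreasing noise levels."""
    y = torch.randn(x.size(0), model.y_dim)
    for i in range(steps):
        t = torch.full((x.size(0), 1), 1.0 - i / steps)
        y = model(x, y, t)
    return y


if __name__ == "__main__":
    model = PCLModel(x_dim=8, y_dim=4)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    x, y = torch.randn(32, 8), torch.randn(32, 4)
    print(pcl_step(model, x, y, opt))
    print(pcl_infer(model, x).shape)
```

In this sketch, the `detach()` on the cleaner prediction acts as a stop-gradient, so the prediction at the higher noise level is pulled toward the one at the lower noise level rather than the reverse, and the inference loop simply reuses the model's own prediction as the label reference at progressively lower noise levels.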

Code Organization

The experiments consist of three parts:

  • Graph Modality: N-body simulation (high-dimensional continuous outputs in graph learning);

  • Vision Modality: semantic segmentation (high-dimensional categorical outputs in vision learning);

  • Text Modality: supervised fine-tuning via next-token prediction (high-dimensional sequential outputs in language modeling).

Run

To run the code, please refer to the README.md in each subdirectory.

Reference

    @inproceedings{li2025pcl,
    title={Generative Modeling Reinvents Supervised Learning: Label Repurposing with Predictive Consistency Learning},
    author={Li, Yang and Ma, Jiale and Yang, Yebin and Wu, Qitian and Zha, Hongyuan and Yan, Junchi},
    booktitle={Proceedings of the 42nd International Conference on Machine Learning (ICML)},
    year={2025}
    }

Contact

If you have any questions, feel free to reach out to us:

Yang Li: [email protected]
