
ZERO-9215/Online-CL-LLMs


Online-CL-LLMs

This repository contains the official implementation for our ICML 2025 paper: Exploiting Presentative Feature Distributions for Parameter-Efficient Continual Learning of Large Language Models.

Architecture

Training

Generate the training script by executing:

python gen_script_new_{benchmark}_{model}.py

where {benchmark} and {model} are placeholders for the target benchmark and model names. Then run the generated script to start training.

Evaluation

To compute the key metrics, Average Performance (AP), Forgetting Rate (F.Ra), Forward Transfer (FWT), and Backward Transfer (BWT), run:

python score.py your_result_path single_result_path 
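For intuition, the metrics above can be sketched from a task-accuracy matrix. This is a minimal illustration using the definitions common in the continual-learning literature; the repository's score.py is the authoritative implementation, and the function name `continual_metrics` here is hypothetical. FWT additionally requires per-task baselines trained from scratch, so it is omitted from the sketch.

```python
import numpy as np

def continual_metrics(acc: np.ndarray):
    """Sketch of common continual-learning metrics (assumed definitions).

    acc[i, j] = accuracy on task j after sequentially training on tasks 0..i.
    """
    T = acc.shape[0]
    # Average Performance: mean accuracy over all tasks after the final task.
    ap = acc[-1].mean()
    # Forgetting Rate: average drop from each earlier task's best accuracy
    # to its accuracy after the final task (last task excluded).
    f_ra = np.mean([acc[:, j].max() - acc[-1, j] for j in range(T - 1)])
    # Backward Transfer: how training on later tasks changed earlier tasks'
    # accuracy relative to when they were first learned.
    bwt = np.mean([acc[-1, j] - acc[j, j] for j in range(T - 1)])
    return ap, f_ra, bwt

# Toy 3-task example (illustrative numbers only).
acc = np.array([
    [0.80, 0.00, 0.00],
    [0.75, 0.85, 0.00],
    [0.70, 0.80, 0.90],
])
ap, f_ra, bwt = continual_metrics(acc)
```

In this toy run, AP is 0.80, F.Ra is 0.075, and BWT is -0.075: the negative BWT reflects the accuracy lost on tasks 1 and 2 as later tasks were learned.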

Credits

The code in this repository builds in part on SAPT, and we sincerely thank its authors.
