Official implementation of Continual Transformers, including ready-to-use modules for Continual Inference.


Continual Transformers and its modules can be installed in your project using:
```bash
pip install git+https://github.com/LukasHedegaard/continual-transformers.git
```
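After installation, the modules can be used for both clip-wise and step-wise (continual) inference, where intermediate attention results from previously seen steps are reused instead of recomputed. The sketch below is purely illustrative: the import path, the class name `CoSiTransformerEncoder`, its constructor arguments, the tensor layout, and the `forward_step` method are assumptions made for this example; please consult the package documentation for the actual API.

```python
import torch

# NOTE: the import path, class name, constructor arguments, tensor layout,
# and the `forward_step` method are assumptions made for illustration;
# check the package documentation for the actual API.
from continual_transformers import CoSiTransformerEncoder  # hypothetical name

# Hypothetical configuration: 64-step receptive field, 192-dim token embeddings.
net = CoSiTransformerEncoder(
    sequence_len=64,
    embed_dim=192,
    num_heads=3,
)

# Clip-wise inference: process a full sequence of 64 token steps at once.
clip = torch.randn(1, 192, 64)  # assumed layout: (batch, embed_dim, sequence_len)
clip_output = net.forward(clip)

# Continual inference: feed one new token step at a time. Cached intermediate
# results for earlier steps are reused, so each call only computes what is
# new for the latest step.
for _ in range(10):
    step = torch.randn(1, 192)  # a single token step: (batch, embed_dim)
    step_output = net.forward_step(step)
```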
The experiment code-base is split into separate repositories for Online Action Detection and Online Audio Classification. Below, we present a summary of the results from the paper.
```bibtex
@article{hedegaard2022cotrans,
  title={Continual Transformers: Redundancy-Free Attention for Online Inference},
  author={Lukas Hedegaard and Alexandros Iosifidis},
  journal={International Conference on Learning Representations (ICLR)},
  year={2023}
}
```
See CONTRIBUTING.md for guidelines on how to contribute to this project.