Tri-Branch Invertible Block for Image Rescaling (T-InvBlock)

This repository is the official implementation of the paper Plug-and-Play Tri-Branch Invertible Block for Image Rescaling (AAAI 2025).

💥 News

  • 2024-12: The paper and the corresponding code are released.

🛠️ Dependencies and Installation

The code was developed under the following environment:

# 1. Python 3.7.1 (Recommend to use conda)
conda create -n tinvb python=3.7.1
conda activate tinvb

# 2. PyTorch=1.9.0, torchvision=0.10.0, cudatoolkit=11.1
python -m pip install --upgrade pip
pip install torch==1.9.0+cu111 torchvision==0.10.0+cu111 -f https://download.pytorch.org/whl/torch_stable.html

# 3. Other dependencies
pip install -r requirements.txt
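
After installation, you can optionally sanity-check the environment (a minimal check, nothing repository-specific; the expected versions are noted in the comments):

# check_env.py -- optional sanity check of the installed environment
import torch
import torchvision

print("torch:", torch.__version__)              # expect 1.9.0+cu111
print("torchvision:", torchvision.__version__)  # expect 0.10.0+cu111
print("CUDA available:", torch.cuda.is_available())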

📚 Dataset Preparation

We use the DIV2K training split for model training, and validate on the DIV2K validation split and four widely used benchmarks: Set5, Set14, BSDS100, and Urban100.

Please organize the datasets and the code in the following folder structure:

Folder Structure for Datasets
├── datasets
│   ├── BSDS100
│   │   └── *.png
│   ├── DIV2K
│   │   ├── DIV2K_train_HR
│   │   │   └── *.png
│   │   ├── DIV2K_train_LR_bicubic
│   │   │   ├── X2
│   │   │   │   └── *.png
│   │   │   └── X4
│   │   │       └── *.png
│   │   ├── DIV2K_valid_HR
│   │   │   └── *.png
│   │   └── DIV2K_valid_LR_bicubic
│   │       ├── X2
│   │       │   └── *.png
│   │       └── X4
│   │           └── *.png
│   ├── Set5
│   │   ├── GTmod12
│   │   │   └── *.png
│   │   ├── LRbicx2
│   │   │   └── *.png
│   │   └── LRbicx4
│   │       └── *.png
│   ├── Set14
│   │   ├── GTmod12
│   │   │   └── *.png
│   │   ├── LRbicx2
│   │   │   └── *.png
│   │   └── LRbicx4
│   │       └── *.png
│   └── urban100
│       └── *.png
└── TInvBlock 
    ├── codes
    ├── experiments
    ├── results
    └── tb_logger

To accelerate training, we suggest cropping the 2K-resolution images into sub-images for faster I/O.
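
BasicSR-derived codebases usually ship an extraction script for this step; check ./codes/scripts (an assumption, not a confirmed path). If you prefer to crop manually, the sketch below tiles each HR image into overlapping patches. The patch size, stride, and paths are illustrative choices, not the repository's defaults.

# crop_subimages.py -- minimal sketch of cropping HR images into sub-images
import os
from PIL import Image

def crop_to_subimages(src_dir, dst_dir, crop_size=480, step=240):
    """Slide a crop_size window with stride `step` over each PNG in src_dir."""
    os.makedirs(dst_dir, exist_ok=True)
    for name in sorted(os.listdir(src_dir)):
        if not name.endswith(".png"):
            continue
        img = Image.open(os.path.join(src_dir, name))
        w, h = img.size
        stem = os.path.splitext(name)[0]
        idx = 0
        for top in range(0, max(h - crop_size, 0) + 1, step):
            for left in range(0, max(w - crop_size, 0) + 1, step):
                patch = img.crop((left, top, left + crop_size, top + crop_size))
                idx += 1
                patch.save(os.path.join(dst_dir, f"{stem}_s{idx:03d}.png"))

if __name__ == "__main__":
    # Paths follow the folder structure above; adjust to your setup.
    crop_to_subimages("datasets/DIV2K/DIV2K_train_HR",
                      "datasets/DIV2K/DIV2K_train_HR_sub")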

🎯 Testing

The pretrained models are available in ./experiments/pretrained_TIRN and ./experiments/pretrained_TSAIN. The test config files in ./codes/options/test let you quickly reproduce the results reported in the paper.
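
If you want to inspect a checkpoint outside the test scripts, a standard PyTorch load is usually enough. The file name below is an assumption; list the pretrained folders for the actual names.

# inspect_checkpoint.py -- minimal sketch; the .pth name is illustrative
import torch

# BasicSR-style checkpoints are typically a plain state dict of tensors;
# if this one is wrapped in an extra dict, adapt accordingly.
state = torch.load("experiments/pretrained_TIRN/TIRN_x4.pth", map_location="cpu")
for key, tensor in list(state.items())[:5]:
    print(key, tuple(tensor.shape))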

T-IRN:

# For scale x2, change directory to `./codes/` and run
python test.py -opt options/test/TIRN_2.yml

# For scale x4, change directory to `./codes/` and run
python test.py -opt options/test/TIRN_4.yml

T-SAIN:

# For scale x2 with JPEG compression QF=90, change directory to `./codes/` and run
python test.py -opt options/test/TSAIN_2.yml -format JPEG -qf 90

# For scale x4 with JPEG compression QF=90, change directory to `./codes/` and run
python test.py -opt options/test/TSAIN_4.yml -format JPEG -qf 90
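
Papers in this line usually report PSNR on the Y channel of YCbCr. If you want to verify a reproduced result by hand, here is a minimal sketch, assuming two 8-bit images of identical size (the paths are illustrative; evaluation scripts often also crop a scale-sized border first):

# psnr_y.py -- minimal sketch of Y-channel PSNR between two same-sized images
import numpy as np
from PIL import Image

def psnr_y(path_a, path_b):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    w = np.array([0.299, 0.587, 0.114])  # BT.601 RGB -> Y luma weights
    mse = np.mean((a @ w - b @ w) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

print(psnr_y("results/baby_rec.png", "datasets/Set5/GTmod12/baby.png"))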
State-of-the-Art Performance

[Papers with Code benchmark badges]

🚀 Training

The training configs are included in ./codes/options/train.

T-IRN:

# For scale x2, change directory to `./codes/` and run
python train.py -opt options/train/TIRN_2.yml

# For scale x4, change directory to `./codes/` and run
python train.py -opt options/train/TIRN_4.yml

T-SAIN:

# For scale x2 with JPEG compression QF=90, change directory to `./codes/` and run
python train.py -opt options/train/TSAIN_2.yml

# For scale x4 with JPEG compression QF=90, change directory to `./codes/` and run
python train.py -opt options/train/TSAIN_4.yml
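
T-SAIN is trained against JPEG compression at a fixed quality factor (QF=90 above, presumably set in the training config). As a minimal sketch of that degradation, a Pillow encode/decode round trip approximates it; the actual training pipeline may implement compression differently, and the input path is illustrative:

# jpeg_degrade.py -- minimal sketch of JPEG degradation at a given QF
import io
from PIL import Image

def jpeg_round_trip(img, quality=90):
    """Encode to JPEG in memory at the given quality, then decode again."""
    buf = io.BytesIO()
    img.save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    return Image.open(buf).convert("RGB")

lr = Image.open("datasets/DIV2K/DIV2K_train_LR_bicubic/X4/0001x4.png").convert("RGB")
lr_jpeg = jpeg_round_trip(lr, quality=90)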

🙌🏻️ Acknowledgement

The code is based on SAIN, IRN, and BasicSR.

🔍 Citation

If our work assists your research, feel free to give us a star ⭐ or cite us using:

@inproceedings{bao2025plug,
  title={Plug-and-play tri-branch invertible block for image rescaling},
  author={Bao, Jingwei and Hao, Jinhua and Xu, Pengcheng and Sun, Ming and Zhou, Chao and Zhu, Shuyuan},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={39},
  number={2},
  pages={1826--1834},
  year={2025}
}
