Knowledge-Distillation-using-Attention-Maps

This repository explores knowledge distillation in Vision Transformers (ViTs) with a focus on attention maps. The student model is trained to mimic the teacher model's attention maps, which improves how it captures image details. Attention maps are generated with attention rollout, revealing where the model focuses during inference, and a dedicated loss function penalizes the mismatch between the student's and teacher's maps. A sketch of both pieces is given below.
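The following is a minimal sketch of attention rollout and an attention-map distillation loss, written in PyTorch. The function names, tensor shapes, and weighting scheme are illustrative assumptions, not code taken from this repository.

```python
import torch
import torch.nn.functional as F

def attention_rollout(attentions):
    """Aggregate per-layer attention into a single map via attention rollout.

    attentions: list of tensors, one per layer, each of shape
                (batch, heads, tokens, tokens).
    Returns a (batch, tokens, tokens) rollout matrix.
    """
    rollout = None
    for attn in attentions:
        attn = attn.mean(dim=1)                       # average over heads
        eye = torch.eye(attn.size(-1), device=attn.device)
        attn = 0.5 * attn + 0.5 * eye                 # account for residual connections
        attn = attn / attn.sum(dim=-1, keepdim=True)  # re-normalize rows
        rollout = attn if rollout is None else attn @ rollout
    return rollout

def attention_distillation_loss(student_attns, teacher_attns,
                                student_logits, teacher_logits,
                                labels, alpha=0.5, temperature=2.0):
    """Combine an attention-map matching term with standard distillation.

    The MSE term pushes the student's rollout map toward the teacher's;
    the remaining terms are the usual soft/hard distillation objectives.
    """
    attn_loss = F.mse_loss(attention_rollout(student_attns),
                           attention_rollout(teacher_attns))
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * attn_loss + (1 - alpha) * (soft_loss + hard_loss)
```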

A Deep Lake / Activeloop login is required to access the ImageNet dataset; a loading sketch follows.
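A rough sketch of loading ImageNet through Deep Lake. The dataset path, batch size, and dataloader settings below are assumptions and may differ from what this repository actually uses.

```python
import deeplake

# Requires an Activeloop account; authenticate via `activeloop login`
# or by passing an API token to deeplake.load(..., token=...).
ds = deeplake.load("hub://activeloop/imagenet-train")  # assumed dataset path

# Wrap the dataset in a PyTorch-style dataloader for training.
dataloader = ds.pytorch(batch_size=64, shuffle=True)
```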
