sayakpaul committed
Commit 8ab76fa · 1 Parent(s): 2a9ec74

Update README.md

Files changed (1)
  1. README.md +4 -1
README.md CHANGED
@@ -10,4 +10,7 @@ pinned: false
  license: apache-2.0
  ---
 
- Check out the configuration reference at https://huggingface.co/docs/hub/spaces#reference
+ Attention Rollout was proposed by [Abnar et al.](https://arxiv.org/abs/2005.00928) to quantify the information
+ that flows through self-attention layers. In the original ViT paper ([Dosovitskiy et al.](https://arxiv.org/abs/2010.11929)),
+ the authors use it to investigate the representations learned by ViTs. The model used in the backend is a ViT B-16 model. For more
+ details about it, refer to [this notebook](https://github.com/sayakpaul/probing-vits/blob/main/notebooks/load-jax-weights-vitb16.ipynb).
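
For reference, the added README text describes the Attention Rollout method of Abnar et al. A minimal sketch of that computation is shown below, assuming the per-layer attention maps have already been extracted as NumPy arrays of shape `(num_heads, num_tokens, num_tokens)`; the function name and input format are illustrative and are not taken from the Space's backend code.

```python
import numpy as np


def attention_rollout(attentions):
    """Sketch of Attention Rollout (Abnar et al., 2020).

    `attentions`: list of per-layer attention maps, ordered from the first
    to the last layer, each of shape (num_heads, num_tokens, num_tokens).
    Returns a (num_tokens, num_tokens) matrix of rolled-out attention.
    """
    rollout = None
    for attn in attentions:
        # Average the attention weights over heads.
        attn = attn.mean(axis=0)
        # Add the identity to account for residual connections,
        # then re-normalize so each row sums to 1.
        attn = attn + np.eye(attn.shape[0])
        attn = attn / attn.sum(axis=-1, keepdims=True)
        # Recursively multiply the per-layer maps to propagate
        # attention from the input tokens up through the layers.
        rollout = attn if rollout is None else attn @ rollout
    return rollout
```

For visualization, the row of the returned matrix corresponding to the class token is commonly reshaped into a 2D map over the image patches.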