---
language: en
---

# 85% Sparse BERT-Large (uncased) Prune OFA

This model is a result of our paper *Prune Once for All: Sparse Pre-Trained Language Models*, presented at the ENLSP NeurIPS Workshop 2021.

For further details on the model and its results, see our paper and our implementation available here.
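
A minimal usage sketch, assuming the checkpoint is hosted on the Hugging Face Hub and loads through the standard `transformers` API; the repo id below is a placeholder, not confirmed by this card:

```python
# Minimal sketch: loading this sparse BERT-Large checkpoint with transformers.
# The repo id is a placeholder -- replace it with this model's actual Hub id.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "Intel/bert-large-uncased-sparse-85-unstructured-pruneofa"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Unstructured pruning typically leaves the weights as dense tensors with
# zeroed entries, so the model fine-tunes like any other BERT checkpoint.
inputs = tokenizer("Paris is the [MASK] of France.", return_tensors="pt")
outputs = model(**inputs)
```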