# Distil-wav2vec2
This model is a distilled version of the wav2vec2 model (https://arxiv.org/pdf/2006.11477.pdf). It is 4 times smaller and 3 times faster than the original wav2vec2 large model.
# Evaluation results
When used with a light tri-gram language model head, this model achieves the following results:

| Dataset | WER |
| ------------------ |:-----:|
| LibriSpeech clean | 12.7% |
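
As a rough illustration of this kind of setup, the sketch below decodes the model's CTC logits with a tri-gram KenLM model via the `pyctcdecode` library. The Hugging Face model id `OthmaneJ/distil-wav2vec2`, the LM file `3gram.arpa`, and the audio file name are assumptions for the example, not details taken from this card.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
from pyctcdecode import build_ctcdecoder

# Assumed model id on the Hugging Face Hub.
processor = Wav2Vec2Processor.from_pretrained("OthmaneJ/distil-wav2vec2")
model = Wav2Vec2ForCTC.from_pretrained("OthmaneJ/distil-wav2vec2")

# Build a beam-search decoder from the tokenizer vocabulary and a
# tri-gram KenLM model (hypothetical file "3gram.arpa").
vocab_dict = processor.tokenizer.get_vocab()
sorted_tokens = [tok for tok, _ in sorted(vocab_dict.items(), key=lambda kv: kv[1])]
# wav2vec2 uses "|" as the word delimiter; pyctcdecode expects a space.
labels = [" " if tok == "|" else tok for tok in sorted_tokens]
decoder = build_ctcdecoder(labels, kenlm_model_path="3gram.arpa")

# Load 16 kHz mono audio and compute log-probabilities over the vocabulary.
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
log_probs = torch.log_softmax(logits, dim=-1)[0].cpu().numpy()

# Beam search with the n-gram LM rescoring the CTC hypotheses.
print(decoder.decode(log_probs))
```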
# Usage
See https://github.com/OthmaneJ/distil-wav2vec2 for the accompanying code.
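
For a quick start, the following is a minimal transcription sketch (plain greedy CTC decoding, no language model) using the `transformers` library. The model id `OthmaneJ/distil-wav2vec2` and the audio file name are assumptions for the example; refer to the repository above for the author's full pipeline.

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Assumed model id on the Hugging Face Hub.
processor = Wav2Vec2Processor.from_pretrained("OthmaneJ/distil-wav2vec2")
model = Wav2Vec2ForCTC.from_pretrained("OthmaneJ/distil-wav2vec2")

# Load 16 kHz mono audio and run a forward pass.
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy decoding: take the most likely token at each frame.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```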