# Distil-wav2vec2
This model is a distilled version of wav2vec2 (https://arxiv.org/pdf/2006.11477.pdf). It is 4 times smaller and 3 times faster than the original wav2vec2 large model.

# Evaluation results
When used with a light tri-gram language model, this model achieves the following results:
| Dataset | WER |
| ------------- |:-------------:|
| Librispeech-clean| 12.7%|
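
The word error rate (WER) reported above can be computed from reference and predicted transcripts with a standard metric library. A minimal sketch, assuming `jiwer` is installed (the evaluation script itself is not specified in this card):

```python
from jiwer import wer

# Hypothetical transcripts, used only to illustrate the metric
references = ["he hoped there would be stew for dinner"]
hypotheses = ["he hoped there would be stew for diner"]

# jiwer returns the word error rate as a fraction
print(f"WER: {wer(references, hypotheses):.1%}")
```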

# Usage
Usage examples are provided at https://github.com/OthmaneJ/distil-wav2vec2.
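
Below is a minimal transcription sketch with 🤗 Transformers, assuming the checkpoint is published under the `OthmaneJ/distil-wav2vec2` model ID (see the repository above for the exact ID and full instructions). It uses plain greedy CTC decoding, without the tri-gram language model mentioned in the evaluation section:

```python
import torch
import librosa
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "OthmaneJ/distil-wav2vec2"  # assumed checkpoint name, see repo above

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# wav2vec2 models expect 16 kHz mono audio
speech, sampling_rate = librosa.load("sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=sampling_rate, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding (no language model)
predicted_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(predicted_ids)[0]
print(transcription)
```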
