This model takes a scrambled or incoherent sentence as input and returns a meaningful sentence that uses the same words, a form of grammar correction, if you will. It was trained on a dataset of permuted sentences derived from Wikipedia pages, with the correct word order as labels. It is an encoder-decoder model that uses BERT's weights in both its encoder and its decoder.
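
The model card does not give the exact checkpoint name, so the repo id below is a placeholder; this is a minimal sketch, assuming the model is a standard BERT2BERT encoder-decoder checkpoint loadable with the Hugging Face `transformers` library.

```python
# Minimal sketch: loading a BERT-based encoder-decoder ("BERT2BERT") model and
# using it to reorder a scrambled sentence.
# NOTE: "your-username/bert2bert-sentence-reorder" is a hypothetical repo id;
# substitute the actual checkpoint for this model.
from transformers import BertTokenizer, EncoderDecoderModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = EncoderDecoderModel.from_pretrained("your-username/bert2bert-sentence-reorder")

# Generation needs to know the decoder start token and the padding token.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id

scrambled = "dog the ball the chased"
inputs = tokenizer(scrambled, return_tensors="pt")
output_ids = model.generate(**inputs, max_length=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
# Expected output (roughly): "the dog chased the ball"
```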