This model is from the preprint Unlimiformer: Long-Range Transformers with Unlimited Length Input.

This model was finetuned from a BART-base checkpoint using Unlimiformer-aware early stopping, as described in Section 3.1 of the paper. It was finetuned on the BookSum dataset (full-book setting).

The hosted inference demo is disabled because this model cannot handle unlimited-length input on its own: you must first add the Unlimiformer files to your repository. See the Unlimiformer GitHub repository for setup instructions.
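
The checkpoint itself is a standard BART model, so it can be loaded with the plain `transformers` library; a minimal sketch is below. Note that without the Unlimiformer wrapper, inputs are truncated to BART's usual 1024-token context window, and the final comment about wrapping refers to the Unlimiformer repo's own code rather than an API shown here.

```python
# Minimal loading sketch, assuming only the `transformers` library.
# The Unlimiformer wrapping step lives in the Unlimiformer GitHub repo
# and is referenced in a comment only.
from transformers import AutoTokenizer, BartForConditionalGeneration

model_name = "abertsch/unlimiformer-earlyk-bart-booksum"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

# Without Unlimiformer, BART truncates input to its 1024-token limit.
text = "..."  # placeholder for a long (book-length) input
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=1024)
summary_ids = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))

# To actually process unlimited-length input, wrap the model with the code
# from the Unlimiformer GitHub repo before generation (see its README).
```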
