|
--- |
|
language: en |
|
license: apache-2.0 |
|
--- |
|
|
|
### Description |
|
Adaptation of the [flan-t5-base](https://huggingface.co/google/flan-t5-base) weights to make it compatible with the [FAT5](https://github.com/catie-aq/flashT5) framework (Flash Attention T5). |
|
This adaptation lets users efficiently continue pre-training flan-t5, for example to adapt it to more recent data or to specialize it in a specific domain.
|
|
|
### Usage |
|
```python
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("CATIE-AQ/FAT5-base-flan-en", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
```
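
For readers less familiar with what "continuing the pre-training" of a T5 model involves: T5-family models are pre-trained with a span-corruption denoising objective, where masked spans in the input are replaced by sentinel tokens (`<extra_id_0>`, `<extra_id_1>`, …) and the target sequence reconstructs the masked spans. The toy sketch below illustrates how such input/target pairs are formed at the token level; the `span_corrupt` helper is a hypothetical illustration written for this card, not part of the FAT5 framework or `transformers`.

```python
def span_corrupt(tokens, spans):
    """Toy T5-style span corruption (illustrative only).

    tokens: list of token strings.
    spans: sorted, non-overlapping (start, end) index pairs to mask.
    Returns (input_tokens, target_tokens), where each masked span is
    replaced by a sentinel in the input and spelled out in the target.
    """
    inp, tgt = [], []
    prev = 0
    for i, (start, end) in enumerate(spans):
        sentinel = f"<extra_id_{i}>"
        inp.extend(tokens[prev:start])  # keep unmasked tokens
        inp.append(sentinel)            # sentinel stands in for the span
        tgt.append(sentinel)            # target: sentinel then the span
        tgt.extend(tokens[start:end])
        prev = end
    inp.extend(tokens[prev:])
    tgt.append(f"<extra_id_{len(spans)}>")  # closing sentinel
    return inp, tgt

tokens = "the quick brown fox jumps over the lazy dog".split()
inp, tgt = span_corrupt(tokens, [(1, 3), (5, 6)])
# inp: ['the', '<extra_id_0>', 'fox', 'jumps', '<extra_id_1>', 'the', 'lazy', 'dog']
# tgt: ['<extra_id_0>', 'quick', 'brown', '<extra_id_1>', 'over', '<extra_id_2>']
```

In an actual continued pre-training run, such pairs would be built from token IDs produced by the tokenizer above and fed to the model following the FAT5 framework's training scripts.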