The model was fine-tuned from sberbank-ai/ruT5-base on a parallel detoxification corpus.
- Task: text2text generation
- Architecture: encoder-decoder
- Tokenizer: BPE
- Vocabulary size: 32,101
- Parameters: 222M