English2Hinglish-Flan-T5-Base
This is a fine-tuned version of google/flan-t5-base, trained on the rvv-karma/English-Hinglish-TOP dataset to translate English text into Hinglish (code-mixed Hindi-English written in Roman script).
Usage
# Load the tokenizer and model directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "rvv-karma/English2Hinglish-Flan-T5-Base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Tokenize an English sentence and generate its Hinglish translation
input_text = "What are you doing?"
inputs = tokenizer(input_text, return_tensors="pt")
# max_new_tokens raises the default generation length cap so longer
# outputs are not truncated
output_ids = model.generate(**inputs, max_new_tokens=64)
output_text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(output_text)
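The model also handles batches: tokenize several sentences with padding and pass decoding options to `generate`. The sketch below is illustrative; the beam-search settings are assumptions, not values recommended by the model author.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "rvv-karma/English2Hinglish-Flan-T5-Base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

sentences = ["What are you doing?", "I will call you tomorrow."]

# Pad the batch so all sequences fit in one tensor
inputs = tokenizer(sentences, return_tensors="pt", padding=True)

# Beam search with a length cap; these decoding values are illustrative
output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
texts = tokenizer.batch_decode(output_ids, skip_special_tokens=True)
for text in texts:
    print(text)
```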
Fine-tuning script
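The original training script is not reproduced here. As a minimal sketch, fine-tuning flan-t5-base on the English-Hinglish-TOP dataset could be done with `Seq2SeqTrainer`; the column names ("en", "hi_ng") and all hyperparameters below are assumptions, not the author's actual configuration.

```python
# Hypothetical fine-tuning sketch; not the author's actual script.
from datasets import load_dataset
from transformers import (
    AutoTokenizer, AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq, Seq2SeqTrainer, Seq2SeqTrainingArguments,
)

model_name = "google/flan-t5-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

dataset = load_dataset("rvv-karma/English-Hinglish-TOP")

def preprocess(batch):
    # Column names "en" and "hi_ng" are assumptions; check the dataset schema
    model_inputs = tokenizer(batch["en"], max_length=128, truncation=True)
    labels = tokenizer(text_target=batch["hi_ng"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(preprocess, batched=True)

# Illustrative hyperparameters
args = Seq2SeqTrainingArguments(
    output_dir="english2hinglish-flan-t5-base",
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    num_train_epochs=3,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```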
References