
Base Model: https://huggingface.co/mostafaamiri/persian_llama_7B_merged


This model was fine-tuned on a real Persian news dataset and optimized for neural news generation.

Note: Turkish was not part of the pretraining data.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

# Load the tokenizer (from the base model) and the fine-tuned model.
# Note: text generation requires a causal LM head, not a sequence-classification head.
tokenizer = AutoTokenizer.from_pretrained("mostafaamiri/persian_llama_7B_merged")
model = AutoModelForCausalLM.from_pretrained("tum-nlp/neural-news-generator-llama-7b-fa")

# Create the pipeline for neural news generation and set the repetition penalty > 1.1 to punish repetition.
generator = pipeline("text-generation",
                     model=model,
                     tokenizer=tokenizer,
                     repetition_penalty=1.2)

# Define the prompt (Persian: "Following the 'armed rebellion' of the Wagner
# military mercenaries and the seizure of some ...")
prompt = " [EOP] به‌ دنبال «شورش مسلحانه» مزدوران نظامی واگنر و تصرف برخی "

# Generate
generator(prompt, max_length=1000, num_return_sequences=1)
```
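The example prompt places an `[EOP]` marker before the article text. A minimal sketch of assembling such a prompt is below; the `make_prompt` helper is hypothetical, and the assumption that text before `[EOP]` acts as a title/lead which the model continues after the marker is not confirmed by this card.

```python
# Hypothetical helper: build a generation prompt around the "[EOP]" separator.
# Assumption (not confirmed by the model card): text before "[EOP]" is a
# title/lead, and the model continues the article body after the marker.
def make_prompt(title: str, article_start: str = "") -> str:
    return f"{title} [EOP] {article_start}"

# Reproduces the prompt shape used in the example above (empty title).
prompt = make_prompt("", "به‌ دنبال «شورش مسلحانه» مزدوران نظامی واگنر و تصرف برخی ")
print(prompt)
```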

The model was trained on 6k datapoints (all splits combined) from: https://huggingface.co/datasets/RohanAiLab/persian_news_dataset

Model size: 6.9B params (Safetensors, FP16)