# Tarantino Scene Generator
This is a fine-tuned version of mistralai/Mistral-7B-Instruct-v0.2 that generates movie script scenes in the distinctive style of Quentin Tarantino.
## Model Details

### Model Description
This model was fine-tuned on a custom dataset of Quentin Tarantino's screenplays. The goal was to teach the model the structural, stylistic, and tonal characteristics of his writing, including witty, fast-paced dialogue, screenplay-style formatting, and abrupt shifts in tone from casual to tense. The model is intended for creative and entertainment purposes, acting as a "Scene Generator" that takes a high-level creative brief and outputs a formatted script scene.
- Developed by: manohar3181
- Model type: Fine-tuned version of Mistral-7B-Instruct
- Language(s) (NLP): English
- License: Apache 2.0
- Finetuned from model: mistralai/Mistral-7B-Instruct-v0.2
### Model Sources
- Repository: [Link to your GitHub repository for this project]
- Demo: [Link to your Gradio Demo on Hugging Face Spaces]
## Uses

### Direct Use
The model is intended to be used for creative text generation. You provide it with a high-level instruction describing a scene, and it will generate a response formatted as a screenplay.
```python
from transformers import pipeline

# Replace with your final model repository name
model_name = "manohar3181/Tarantino-Scene-Generator-v1"
generator = pipeline("text-generation", model=model_name)

prompt = "Write a scene where two old gangsters meet in an empty warehouse. One has betrayed the other."

# Wrap the prompt in the Mistral-Instruct template the model was fine-tuned on
formatted_prompt = f"<s>[INST] {prompt} [/INST]"

outputs = generator(formatted_prompt, max_new_tokens=300)
print(outputs[0]["generated_text"])
```
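Because the fine-tune was trained with 4-bit QLoRA (see the hyperparameters below), inference can also be run in 4-bit on a single consumer GPU. The sketch below shows one way to do that with `bitsandbytes`; it assumes the published repository contains merged weights rather than a standalone LoRA adapter (if it is only an adapter, load the base model and attach it with `peft` instead), and the sampling settings are illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_name = "manohar3181/Tarantino-Scene-Generator-v1"

# 4-bit NF4 quantization so the 7B model fits on a single consumer GPU
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,
    device_map="auto",
)

prompt = "Write a scene where two old gangsters meet in an empty warehouse. One has betrayed the other."
inputs = tokenizer(f"<s>[INST] {prompt} [/INST]", return_tensors="pt").to(model.device)

# Sampling (rather than greedy decoding) tends to work better for creative text
outputs = model.generate(**inputs, max_new_tokens=300, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```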
#### Training Hyperparameters
The model was fine-tuned for 2 epochs using the SFTTrainer from the TRL library, with the settings below (a configuration sketch follows the list).

- Quantization: QLoRA (4-bit NF4)
- LoRA r: 16
- LoRA alpha: 32
- Learning rate: 5e-5
- Optimizer: Paged AdamW (8-bit)
- LR scheduler: Cosine
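For reference, here is a minimal sketch of how these settings map onto `peft` and `transformers` configuration objects. The dropout value and output directory are assumptions (they are not stated in this card), and the exact `SFTTrainer` call (dataset field names, packing, etc.) depends on the TRL version used.

```python
import torch
from peft import LoraConfig
from transformers import BitsAndBytesConfig, TrainingArguments

# QLoRA: keep the base weights in 4-bit NF4 and train LoRA adapters on top
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# LoRA adapter settings from the list above
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,  # assumed; not stated in this card
    task_type="CAUSAL_LM",
)

# Optimizer and schedule settings from the list above
training_args = TrainingArguments(
    output_dir="tarantino-scene-generator",  # assumed output path
    num_train_epochs=2,
    learning_rate=5e-5,
    optim="paged_adamw_8bit",
    lr_scheduler_type="cosine",
)
```

These objects are then passed to `SFTTrainer` together with the screenplay dataset; the constructor arguments differ slightly across TRL releases.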