---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
widget:
- text: 10 Meditation tips
  example_title: Health Example
- text: Cooking red sauce pasta
  example_title: Cooking Example
- text: Introduction to Keras
  example_title: Technology Example
tags:
- text-generation
---

# ScriptForge-small

## 🖊️ Model description

ScriptForge-small is a causal language model trained on a dataset of 100 YouTube video scripts spanning several domains. The model follows the GPT-2 architecture: as a causal language model, it predicts the probability of each word from the words that precede it, producing a probability distribution over the next word without looking at future words.

The goal of ScriptForge-small is to generate scripts for YouTube videos that are coherent, informative, and engaging. This can be useful for content creators who are looking for inspiration or who want to automate the process of writing video scripts. To use ScriptForge-small, provide a prompt or a starting sentence, and the model will generate a sequence of words that follows the context and style of the training data.

Models
- [ScriptForge](https://huggingface.co/SRDdev/ScriptForge) : AI content model
- [ScriptForge-small](https://huggingface.co/SRDdev/ScriptForge-small) : Generalized content model

More models are coming soon...

## 🛒 Intended uses

The intended uses of ScriptForge-small include generating scripts for videos, providing inspiration for content creators, and automating the process of writing video scripts.

## 📝 How to use

You can use this model directly with a pipeline for text generation.

1. __Load Model__

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("SRDdev/ScriptForge-small")
model = AutoModelForCausalLM.from_pretrained("SRDdev/ScriptForge-small")
```

2. __Pipeline__

```python
from transformers import pipeline

generator = pipeline('text-generation', model=model, tokenizer=tokenizer)

context = "Cooking red sauce pasta"
length_to_generate = 250

script = generator(context, max_length=length_to_generate, do_sample=True)[0]['generated_text']
script
```
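Because ScriptForge-small is a standard causal language model, you can also call `model.generate` directly instead of using the pipeline. The sketch below is only illustrative: the decoding parameters (`top_k`, `top_p`, `temperature`) are example values, not settings recommended or tuned for this model.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("SRDdev/ScriptForge-small")
model = AutoModelForCausalLM.from_pretrained("SRDdev/ScriptForge-small")

# Encode a prompt and generate a continuation.
prompt = "Introduction to Keras"
inputs = tokenizer(prompt, return_tensors="pt")

# Decoding parameters below are illustrative, not tuned for this model.
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_length=250,                        # total length including the prompt
        do_sample=True,                        # sample from the next-token distribution
        top_k=50,                              # keep only the 50 most likely tokens
        top_p=0.95,                            # nucleus sampling threshold
        temperature=0.9,                       # soften or sharpen the distribution
        pad_token_id=tokenizer.eos_token_id,   # GPT-2-style models have no pad token
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```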
The model may generate random or inaccurate information, as it is still in a beta version.
## 🎈 Limitations and bias

> The model is trained on YouTube scripts and will work best on that kind of content. It may also generate random or inaccurate information, so users should be aware of this and cross-validate the results.

## Citations

```
@model{
  Name=Shreyas Dixit
  framework=Pytorch
  Year=Jan 2023
  Pipeline=text-generation
  Github=https://github.com/SRDdev
  LinkedIn=https://www.linkedin.com/in/srddev
}
```