---
license: cc-by-nc-4.0
language:
- en
widget:
- text: |
    User: What's a good tip for stayin' steady on a buckin' bronco?
    Assistant:
  example_title: "Bronco Riding Tips"
- text: |
    User: Got any short tales 'bout a sheriff in the Wild West?
    Assistant:
  example_title: "Tales of Wild West Sheriffs"
- text: |
    User: What's a catchy line from a famous cowboy song?
    Assistant:
  example_title: "Famous Cowboy Song Lines"
- text: |
    User: What's a quick trick for keepin' a cowboy hat in place?
    Assistant:
  example_title: "Keeping a Cowboy Hat Secure"
- text: |
    User: How do cowboys say 'hello' to each other?
    Assistant:
  example_title: "Cowboy Greetings"
---
![tinycowboy.png](https://huggingface.co/phanerozoic/Tiny-Cowboy-1.1b-v0.1/resolve/main/tinycowboy.png)
# Tiny-Cowboy-1.1b-v0.1
Tiny-Cowboy-1.1b-v0.1 is a specialized language model for generating cowboy-themed content. Developed by phanerozoic, it is fine-tuned from TinyLlama/TinyLlama-1.1B-Chat-v1.0 and optimized for environments with limited computing resources.
### Version Control
Tiny-Cowboy-1.1b-v0.1 marks the first release of this cowboy-focused language model.
### Performance
The model excels in generating engaging cowboy narratives and demonstrates a strong grasp of cowboy culture and lifestyle. However, it is less effective in general language tasks, especially in scientific and technical domains.
### Direct Use
Ideal for thematic language generation, particularly in applications where cowboy culture and storytelling are central. Less suited for general-purpose use or scenarios requiring detailed, accurate scientific explanations.
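The widget examples above follow a plain, single-turn User/Assistant prompt format. As a sketch (the `build_prompt` helper below is illustrative, not part of the release), prompts can be assembled like this:

```python
def build_prompt(user_message: str) -> str:
    """Format a single-turn prompt in the User/Assistant style
    used by the widget examples in this card."""
    return f"User: {user_message}\nAssistant:"

# Example: this string would be fed to the model as-is.
prompt = build_prompt("How do cowboys say 'hello' to each other?")
```

The model then completes the text after `Assistant:`, which is why the custom stopping strings below are needed to end generation before a spurious new `User:` turn begins.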
### Training Data
Fine-tuned on a dataset focused on cowboy and Wild West themes, building on the foundational TinyLlama-1.1B model.
### Custom Stopping Strings
Custom stopping strings were used to refine output quality:
- "},"
- "User:"
- "You:"
- "\nUser"
- "\nUser:"
- "me:"
- "user"
- "\n"
### Training Hyperparameters and Fine-Tuning Details
- **Base Model Name**: TinyLlama/TinyLlama-1.1B-Chat-v1.0
- **Base Model Class**: LlamaForCausalLM
- **Projections**: gate, down, up, q, k, v, o
- **LoRA Rank**: 16
- **LoRA Alpha**: 32
- **True Batch Size**: 4
- **Gradient Accumulation Steps**: 1
- **Epochs**: 1
- **Learning Rate**: 3e-4
- **LR Scheduler**: Linear
- **LLaMA Target Projections**: All targets modified
- **Loss**: 2.096
- **Stop Step**: 42
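As a sketch of how the settings above translate into a Hugging Face `peft` configuration (the mapping of the listed projections onto LLaMA module names is an assumption; unlisted values fall back to library defaults):

```python
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,            # LoRA Rank listed above
    lora_alpha=32,   # LoRA Alpha listed above
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",  # attention projections (q, k, v, o)
        "gate_proj", "up_proj", "down_proj",     # MLP projections (gate, up, down)
    ],
    task_type="CAUSAL_LM",
)
```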
### Limitations
While adept at cowboy-themed content, Tiny-Cowboy-1.1b-v0.1 struggles with topics outside its specialty, particularly scientific and technical ones, and tends to steer responses toward cowboy themes regardless of the question's relevance.
### Compute Infrastructure
The model was trained efficiently, demonstrating the feasibility of specialized fine-tuning in resource-constrained environments.
### Results
Successfully generates cowboy-themed responses, maintaining thematic consistency. However, it shows limitations in handling more complex, non-cowboy-related queries.
### Summary
Tiny-Cowboy-1.1b-v0.1 is a significant development in thematic, lightweight language models, ideal for cowboy-themed storytelling and educational purposes. Its specialization, however, limits its applicability in broader contexts, particularly where accurate, technical knowledge is required.
### Acknowledgments
Special thanks to the TinyLlama-1.1B team, whose foundational work was instrumental in the development of Tiny-Cowboy-1.1b-v0.1.