---
license: cc-by-nc-4.0
language:
  - en
widget:
  - text: |
      User: What's a good tip for stayin' steady on a buckin' bronco?
      Assistant:
    example_title: Bronco Riding Tips
  - text: |
      User: Got any short tales 'bout a sheriff in the Wild West?
      Assistant:
    example_title: Tales of Wild West Sheriffs
  - text: |
      User: What's a catchy line from a famous cowboy song?
      Assistant:
    example_title: Famous Cowboy Song Lines
  - text: |
      User: What's a quick trick for keepin' a cowboy hat in place?
      Assistant:
    example_title: Keeping a Cowboy Hat Secure
  - text: |
      User: How do cowboys say 'hello' to each other?
      Assistant:
    example_title: Cowboy Greetings
---

![tinycowboy.png](tinycowboy.png)

# Tiny-Cowboy-1.1b-v0.1

Tiny-Cowboy-1.1b-v0.1 is a specialized language model for generating cowboy-themed content. Developed by phanerozoic, it is fine-tuned from TinyLlama/TinyLlama-1.1B-Chat-v1.0 and optimized for environments with limited computing resources.

## Version Control

Tiny-Cowboy-1.1b-v0.1 marks the first release of this cowboy-focused language model.

## Performance

The model excels in generating engaging cowboy narratives and demonstrates a strong grasp of cowboy culture and lifestyle. However, it is less effective in general language tasks, especially in scientific and technical domains.

## Direct Use

The model is best suited for thematic language generation where cowboy culture and storytelling are central. It is less suited to general-purpose use or scenarios requiring detailed, accurate scientific explanations.
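The widget examples above use a plain `User:` / `Assistant:` turn format. A minimal sketch of building a prompt in that style (the `build_prompt` helper is hypothetical, not part of the model's tooling):

```python
def build_prompt(question: str) -> str:
    """Format a question in the User/Assistant turn style shown in the widget examples."""
    return f"User: {question}\nAssistant:"

prompt = build_prompt("How do cowboys say 'hello' to each other?")
print(prompt)
```

The trailing `Assistant:` cue leaves the model to complete the assistant turn, matching the widget prompts above.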

## Training Data

The model was fine-tuned on a dataset focused on cowboy and Wild West themes, building on the TinyLlama-1.1B base model.

## Custom Stopping Strings

Custom stopping strings were used to refine output quality:

- `"},"`
- `"User:"`
- `"You:"`
- `"\nUser"`
- `"\nUser:"`
- `"me:"`
- `"user"`
- `"\n"`

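As a sketch of how such stopping strings can be applied in post-processing (this helper is illustrative, not the training setup's actual implementation), generated text is cut at the earliest occurrence of any stop string:

```python
# Stop strings from the list above.
STOP_STRINGS = ["},", "User:", "You:", "\nUser", "\nUser:", "me:", "user", "\n"]

def truncate_at_stop(text: str, stops=STOP_STRINGS) -> str:
    """Cut generated text at the earliest occurrence of any stop string."""
    cut = len(text)
    for s in stops:
        idx = text.find(s)
        if idx != -1:
            cut = min(cut, idx)
    return text[:cut]

print(truncate_at_stop("Howdy, partner!\nUser: next question"))
```

Note that `"\n"` limits output to a single line, which explains the short, single-turn responses the model card targets.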
## Training Hyperparameters and Fine-Tuning Details

- Base Model Name: TinyLlama/TinyLlama-1.1B-Chat-v1.0
- Base Model Class: LlamaForCausalLM
- Projections: gate, down, up, q, k, v, o
- LoRA Rank: 16
- LoRA Alpha: 32
- True Batch Size: 4
- Gradient Accumulation Steps: 1
- Epochs: 1
- Learning Rate: 3e-4
- LR Scheduler: Linear
- LLaMA Target Projections: All targets modified
- Loss: 2.096
- Stop Step: 42
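For context on the LoRA settings above, the standard LoRA update adds a low-rank correction to each frozen base weight: W_eff = W + (alpha / rank) · B·A. A minimal numeric sketch with the rank-16 / alpha-32 values from the table (the 64×64 layer size is hypothetical, chosen only for illustration):

```python
import numpy as np

rank, alpha = 16, 32           # LoRA Rank / LoRA Alpha from the table above
d_in = d_out = 64              # illustrative layer size (hypothetical)

rng = np.random.default_rng(0)
A = rng.normal(size=(rank, d_in)) * 0.01  # down-projection, trainable
B = np.zeros((d_out, rank))               # up-projection, zero-initialized

# Effective update applied on top of a frozen base weight W:
#   W_eff = W + (alpha / rank) * B @ A
scaling = alpha / rank
delta_W = scaling * (B @ A)
print(delta_W.shape, scaling)  # (64, 64) 2.0
```

With B initialized to zero, the update starts at zero and the base model's behavior is unchanged at step 0; the alpha/rank ratio of 2.0 scales the learned correction.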

## Limitations

While adept at cowboy-themed content, Tiny-Cowboy-v0.1 struggles with topics outside its specialty, particularly in scientific and technical areas. The model tends to incorporate cowboy elements into responses, regardless of the question's relevance.

## Compute Infrastructure

The model was trained efficiently, demonstrating that specialized fine-tuning is feasible in resource-constrained environments.

## Results

The model reliably generates cowboy-themed responses with consistent thematic voice, but shows limitations when handling more complex, non-cowboy-related queries.

## Summary

Tiny-Cowboy-1.1b-v0.1 is a significant development in thematic, lightweight language models, ideal for cowboy-themed storytelling and educational purposes. Its specialization, however, limits its applicability in broader contexts, particularly where accurate, technical knowledge is required.

## Acknowledgments

Special thanks to the TinyLlama-1.1B team, whose foundational work was instrumental in the development of Tiny-Cowboy-v0.1.