
TinyViking-1.1B-v0.1

TinyViking-1.1B-v0.1 is a specialized language model designed for generating Viking-themed content. Developed by phanerozoic, the model is fine-tuned from TinyLlama/TinyLlama-1.1B-Chat-v1.0 and optimized for environments with limited computing resources.
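
The snippet below is a minimal loading-and-generation sketch, not an official recipe: it assumes the Hugging Face repo id phanerozoic/Tiny-Viking-1.1b-v0.1 and that the base TinyLlama chat template carries over after fine-tuning.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id as listed on the model page; dtype and device choices are illustrative.
model_id = "phanerozoic/Tiny-Viking-1.1b-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16).to("cuda")

# Assumes the TinyLlama chat template is preserved by the fine-tune.
messages = [{"role": "user", "content": "Tell me of your last voyage across the whale-road."}]
prompt_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(prompt_ids, max_new_tokens=128, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][prompt_ids.shape[-1]:], skip_special_tokens=True))
```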

Performance

TinyViking is capable of generating engaging Viking narratives, reflecting an understanding of Viking culture. However, it is not designed for general language tasks and may struggle with complex scientific or technical queries.

Direct Use

Ideal for thematic language generation, particularly in settings like NPCs in games, where fun and thematic engagement are prioritized over detailed factual accuracy.

Training Data

Trained on "The Saga of Grettir the Strong: Grettir's Saga" to ensure authentic thematic content.

Custom Stopping Strings

Custom stopping strings are employed to enhance output quality, as shown in the sketch after this list:

  • "},"
  • "User:"
  • "You:"
  • "\nUser"
  • "\nUser:"
  • "me:"
  • "user"
  • "\n"
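
One way to apply these strings at generation time is with a custom transformers StoppingCriteria, sketched below; this illustrates the idea and is not necessarily the exact mechanism used during development.

```python
from transformers import StoppingCriteria, StoppingCriteriaList

STOP_STRINGS = ["},", "User:", "You:", "\nUser", "\nUser:", "me:", "user", "\n"]

class StopOnStrings(StoppingCriteria):
    """Stops generation once any stop string appears in the newly generated text."""

    def __init__(self, tokenizer, stop_strings, prompt_length):
        self.tokenizer = tokenizer
        self.stop_strings = stop_strings
        self.prompt_length = prompt_length  # number of prompt tokens to skip when decoding

    def __call__(self, input_ids, scores, **kwargs):
        new_text = self.tokenizer.decode(
            input_ids[0, self.prompt_length:], skip_special_tokens=True
        )
        return any(s in new_text for s in self.stop_strings)

# Usage with the model, tokenizer, and prompt_ids from the loading sketch above:
# criteria = StoppingCriteriaList([StopOnStrings(tokenizer, STOP_STRINGS, prompt_ids.shape[-1])])
# output = model.generate(prompt_ids, max_new_tokens=128, stopping_criteria=criteria)
```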

Training Hyperparameters and Fine-Tuning Details

  • Learning Rate: 2e-5
  • Epochs: 1
  • Training Duration: Approximately 5.6 minutes on an RTX 6000 Ada GPU
  • LoRA Rank: 2048
  • LoRA Alpha: 4096
  • LoRA Dropout: 0.05
  • Cutoff Length: 256
  • Batch Size: 4 (micro batch size)
  • Warmup Steps: 8
  • Optimizer: adamw_torch
  • Gradient Accumulation Steps: 1
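
For reference, these values map roughly onto a peft/transformers configuration as sketched below; the target modules, output path, and exact training script are assumptions rather than details from the card.

```python
from peft import LoraConfig
from transformers import TrainingArguments

# LoRA settings from the list above; target modules are not specified in the card.
lora_config = LoraConfig(
    r=2048,
    lora_alpha=4096,
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)

# Optimization settings from the list above. The cutoff length of 256 is applied
# at tokenization time and is not a TrainingArguments field.
training_args = TrainingArguments(
    output_dir="tinyviking-lora",        # hypothetical output path
    learning_rate=2e-5,
    num_train_epochs=1,
    per_device_train_batch_size=4,       # micro batch size
    gradient_accumulation_steps=1,
    warmup_steps=8,
    optim="adamw_torch",
)
```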

Limitations

Specialized in Viking dialect and narratives, TinyViking is less effective outside its thematic focus.

Compute Infrastructure

Trained on a single NVIDIA RTX 6000 Ada Generation GPU.

Results

TinyViking successfully generates Viking-themed responses, maintaining thematic consistency and displaying improved coherence and depth over previous models, owing to advancements in dataset generation and parsing.

Summary

TinyViking-1.1B-v0.1 shows an improvement in quality compared to earlier thematic models, thanks to a new dataset generation method that helps preserve the base model's already tenuous ability to hold a conversation. While it excels in Viking-themed interactions, its specialized focus limits broader application.

Acknowledgments

Gratitude to the TinyLlama team, whose foundational work was, as always, essential for developing TinyViking.
