This is a 42M-parameter Llama 2 architecture model trained on the TinyStories dataset. The weights are converted from karpathy/tinyllamas; see the llama2.c project for more details.
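
If the converted checkpoint follows the standard Hugging Face Llama layout, it can be loaded with the `transformers` library. The snippet below is a minimal sketch, assuming the `nickypro/tinyllama-42M` repo includes a compatible config and tokenizer; adjust if it does not.

```python
# Minimal usage sketch (assumes a standard Llama config and tokenizer in the repo).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nickypro/tinyllama-42M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short TinyStories-style continuation.
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```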
