---
license: mit
language:
- en
base_model:
- meta-llama/Llama-3.2-1B-Instruct
library_name: mlx
tags:
- llama
- librarian
- mlx
---
This is a fine-tuned version of [meta-llama/Llama-3.2-1B-Instruct](https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct), trained with MLX on Apple Silicon. The training data is based on [FinGreyLit](https://github.com/NatLibFi/FinGreyLit), a dataset of Finnish grey literature; only the English-language articles from the complete FinGreyLit dataset were used for training.
The model is trained to act as a librarian that extracts bibliographic metadata from papers, using the system prompt:
> "You are a skilled librarian specialized in meticulous cataloguing of digital documents. Extract metadata from this document. Return as JSON."
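
For reference, a minimal inference sketch using the `mlx-lm` package. The model path and the document text below are placeholders, not part of this card; adapt them to wherever the model weights are stored.

```python
# Minimal usage sketch with mlx-lm (pip install mlx-lm).
from mlx_lm import load, generate

# Placeholder path: replace with the actual repo id or local directory of this model.
model, tokenizer = load("path/to/this-fine-tuned-model")

system_prompt = (
    "You are a skilled librarian "
    "specialized in meticulous cataloguing of digital documents. "
    "Extract metadata from this document. Return as JSON."
)
document_text = "..."  # extracted text of the paper to catalogue

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": document_text},
]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

# Generate the JSON metadata for the supplied document.
response = generate(model, tokenizer, prompt=prompt, max_tokens=512)
print(response)
```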