---
library_name: transformers
tags: []
---
# SOLAR-10.7b-Instruct-truthy-dpo
![orca-bagel](orca-bagel.png)
This model is a finetune of [macadeliccc/SOLAR-10.7b-Instruct-dpo](https://huggingface.co/macadeliccc/SOLAR-10.7b-Instruct-dpo).
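Since the card declares `library_name: transformers`, inference should work through the standard `transformers` chat API. A minimal sketch (the generation parameters are illustrative, and loading is wrapped in a function because the 10.7B download is heavy):

```python
def generate(prompt: str) -> str:
    """Generate a response from SOLAR-10.7b-Instruct-truthy-dpo.

    Imports are deferred so that merely importing this module does not
    trigger a multi-gigabyte model download.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    # Use the model's own chat template to format the conversation.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    outputs = model.generate(inputs, max_new_tokens=256)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
```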
## Process
1. I finetuned upstage/SOLAR-10.7B-Instruct-v1.0 with 1 epoch of Intel/orca_dpo_pairs (12.4k samples).
2. I further finetuned that model with 3 epochs of jondurbin/truthy-dpo-v0.1 (1.04k samples).
3. This process is experimental, and the base model linked above has been tested more thoroughly at this time.
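The two DPO stages above can be sketched with TRL's `DPOTrainer`. This is a hedged reconstruction, not the exact training script: the column names for `Intel/orca_dpo_pairs` and the trainer arguments (which vary across TRL versions) are assumptions, and the heavy training code is kept inside a function so nothing runs at import time.

```python
def format_orca_pair(example: dict) -> dict:
    """Map a preference row (system/question/chosen/rejected columns,
    as in Intel/orca_dpo_pairs) to the prompt/chosen/rejected triple
    that TRL's DPOTrainer expects."""
    prompt = example["question"]
    if example.get("system"):
        prompt = example["system"] + "\n" + prompt
    return {
        "prompt": prompt,
        "chosen": example["chosen"],
        "rejected": example["rejected"],
    }


def run_dpo_stage(model_id: str, dataset_id: str, num_epochs: int) -> None:
    """One DPO finetuning stage; called once per dataset in the process above."""
    # Lazy imports: trl/datasets are only needed when actually training.
    from datasets import load_dataset
    from transformers import AutoModelForCausalLM, AutoTokenizer
    from trl import DPOConfig, DPOTrainer

    model = AutoModelForCausalLM.from_pretrained(model_id)
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    dataset = load_dataset(dataset_id, split="train").map(format_orca_pair)

    trainer = DPOTrainer(
        model=model,
        args=DPOConfig(output_dir="out", num_train_epochs=num_epochs),
        train_dataset=dataset,
        processing_class=tokenizer,
    )
    trainer.train()
```

Stage 1 would then be `run_dpo_stage("upstage/SOLAR-10.7B-Instruct-v1.0", "Intel/orca_dpo_pairs", 1)` and stage 2 the same call on the stage-1 checkpoint with `jondurbin/truthy-dpo-v0.1` and 3 epochs.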
## GGUF
Available [here](https://huggingface.co/macadeliccc/SOLAR-10.7b-Instruct-truthy-dpo-GGUF).
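One way to run the GGUF weights locally is `llama-cpp-python`. A minimal sketch, assuming a downloaded quant file (the filename below is hypothetical) and SOLAR's `### User:` / `### Assistant:` prompt format; the load is wrapped in a function since it requires the local weights:

```python
def generate_gguf(
    prompt: str,
    model_path: str = "solar-10.7b-instruct-truthy-dpo.Q4_K_M.gguf",  # hypothetical filename
) -> str:
    """Run a single completion against the quantized GGUF model."""
    # Lazy import: llama-cpp-python is an optional dependency.
    from llama_cpp import Llama

    llm = Llama(model_path=model_path, n_ctx=4096)
    # SOLAR-style instruct prompt format (assumption based on the base model).
    out = llm(f"### User:\n{prompt}\n\n### Assistant:\n", max_tokens=256)
    return out["choices"][0]["text"]
```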