---
datasets:
- ehartford/dolphin
license: apache-2.0
---
**Base Model:** mosaicml/mpt-30b

**Tool:** MosaicML's llm-foundry (https://github.com/mosaicml/llm-foundry)

**Dataset:** Entire flan1m-GPT4 dataset

**Config yaml with Model Params:** https://huggingface.co/iamplus/mpt-30b-v4/blob/main/mpt-30b_v4.yaml

***Description:*** **mosaicml/mpt-30b** fine-tuned on the entire flan3m-GPT3.5 dataset for 4 epochs.

**Prompt Format:**
```
<system>: [system prompt]
<human>: [question]
<bot>:
```
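
The sketch below shows one way to apply this prompt format with the Hugging Face `transformers` library. The model id, device placement, generation settings, and example inputs are assumptions for illustration, not part of this card; adjust them to your setup.

```python
# Minimal usage sketch (assumption: the model loads via transformers with
# trust_remote_code, as MPT models ship custom modeling code).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "iamplus/mpt-30b-v4"  # assumed repo id, taken from the config link above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    device_map="auto",  # requires `accelerate`; or move the model to a device manually
)

# Build the prompt in the format described above.
system_prompt = "You are a helpful assistant."          # example system prompt
question = "Summarize the plot of Moby-Dick in one sentence."  # example question
prompt = f"<system>: {system_prompt}\n<human>: {question}\n<bot>:"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)

# Decode only the newly generated tokens (the model's answer after "<bot>:").
answer = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:],
    skip_special_tokens=True,
)
print(answer)
```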