
The base model is bigscience/bloom-560m. It was fine-tuned using RLHF; the dataset and the prompt format are similar to those of the original model.

This repo contains the merged fp16 model.
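A minimal usage sketch for loading the merged fp16 checkpoint with the `transformers` library. The repo id below is a placeholder (the actual id is not stated here), and the `### Human:` / `### Assistant:` prompt format is an assumption inferred from the listed Guanaco dataset, not confirmed by this card.

```python
# Hedged sketch: load the merged fp16 model and generate a reply.
# Assumptions: repo_id is a placeholder; the Guanaco-style prompt
# format below is inferred from timdettmers/openassistant-guanaco.

def build_prompt(instruction: str) -> str:
    # Assumed Guanaco-style conversation template.
    return f"### Human: {instruction}\n### Assistant:"

if __name__ == "__main__":
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "your-username/your-merged-model"  # placeholder, substitute this repo's id
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    # Load weights directly in half precision, matching the merged fp16 checkpoint.
    model = AutoModelForCausalLM.from_pretrained(repo_id, torch_dtype=torch.float16)

    inputs = tokenizer(build_prompt("What is the BLOOM model?"), return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

On a CPU-only machine you may prefer to drop `torch_dtype=torch.float16`, since fp16 inference is primarily useful on GPU.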
**Legal Disclaimer: This model is bound by the usage restrictions of the original BLOOM model, and comes with no warranty or guarantees of any kind.**
---

- license: bigscience-bloom-rail-1.0
- datasets: timdettmers/openassistant-guanaco
- language: en
- reference: https://github.com/hiyouga/LLaMA-Efficient-Tuning/tree/main

---
|