The base model is bigscience/bloom-560m. It was fine-tuned using RLHF, and the dataset and prompt format are similar to those of the original model. This repo contains the merged fp16 model.
Legal Disclaimer: This model is bound by the usage restrictions of the original BLOOM model, and comes with no warranty or guarantees of any kind.
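The merged fp16 checkpoint can be loaded like any causal LM with the `transformers` library. This is a minimal sketch: the repo ID below uses the base model as a stand-in (substitute this repository's ID to get the merged weights), and the `Human:`/`Assistant:` prompt layout is an assumption based on the openassistant-guanaco dataset format, not a confirmed template.

```python
# Minimal sketch of loading and prompting the model with transformers.
# NOTE: "bigscience/bloom-560m" is a stand-in for this repository's ID.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "bigscience/bloom-560m"  # replace with this repo's ID for the merged fp16 weights

tokenizer = AutoTokenizer.from_pretrained(repo_id)
# On a GPU you could additionally pass torch_dtype=torch.float16
# to keep the weights in half precision.
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Assumed prompt layout, modeled on the openassistant-guanaco dataset.
prompt = "Human: What is RLHF?\nAssistant:"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(text)
```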
- license:
  - bigscience-bloom-rail-1.0
- datasets:
  - timdettmers/openassistant-guanaco
- language:
  - en
- reference: https://github.com/hiyouga/LLaMA-Efficient-Tuning/tree/main