|
--- |
|
datasets: |
|
- tiiuae/falcon-refinedweb |
|
language: |
|
- en |
|
--- |
|
|
|
# Falcon-7B |
|
|
|
**Falcon-7B is a 7B-parameter causal decoder-only model built by [TII](https://www.tii.ae) and trained on 1,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. It is made available under the [TII Falcon LLM License](https://huggingface.co/tiiuae/falcon-7b/blob/main/LICENSE.txt).**
|
|
|
More details coming soon. |
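As a quick-start illustration, the sketch below loads the model for text generation with the Hugging Face `transformers` library. This is a minimal example under assumptions not stated in this card: that `transformers` and `torch` are installed, that a device with enough memory is available (roughly 15 GB in `bfloat16`), and that the repository's custom modeling code is acceptable to execute (`trust_remote_code=True`); the prompt and sampling parameters are arbitrary placeholders.

```python
# Minimal inference sketch for Falcon-7B (assumed setup, not an official recipe).
import torch
from transformers import AutoTokenizer, pipeline

MODEL_ID = "tiiuae/falcon-7b"


def build_generator():
    """Create a text-generation pipeline for Falcon-7B.

    Assumes sufficient accelerator memory; `device_map="auto"` lets
    `accelerate` place the weights automatically.
    """
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    return pipeline(
        "text-generation",
        model=MODEL_ID,
        tokenizer=tokenizer,
        torch_dtype=torch.bfloat16,
        trust_remote_code=True,  # the repo ships custom modeling code
        device_map="auto",
    )


if __name__ == "__main__":
    generator = build_generator()
    outputs = generator(
        "The Falcon flew over the desert and",  # placeholder prompt
        max_new_tokens=50,
        do_sample=True,
        top_k=10,
    )
    print(outputs[0]["generated_text"])
```

Sampling settings such as `top_k` and `max_new_tokens` are illustrative defaults, not recommendations from TII.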
|
|
|
|
|
# Model Card for Falcon-7B |
|
|
|
|
|
## Model Details |
|
|
|
### Model Description |
|
|
|
- **Developed by:** [TII](https://www.tii.ae)
|
- **Model type:** Causal decoder-only |
|
- **Language(s) (NLP):** English |
|
- **License:** TII Falcon LLM License |
|
|
|
### Model Sources
|
|
|
- **Paper:** coming soon |
|
- **Demo:** coming soon |
|
|
|
## Uses |
|
|
|
### Out-of-Scope Use |
|
|
|
Production use without adequate assessment of risks and mitigations, and any use case that may be considered irresponsible or harmful.
|
|
|
## Bias, Risks, and Limitations |
|
|
|
Falcon-7B is trained on English and French data only, and will not generalize appropriately to other languages. Furthermore, as it is trained on large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.
|
|
|
|
|
## Paper |
|
|
|
More details coming soon in the paper. |