mbrack committed on
Commit 3629d48
Parent: 3128edb

Update README.md

Files changed (1)
1. README.md +1 -1
README.md CHANGED
@@ -15,7 +15,7 @@ pipeline_tag: text-generation
 
 **Occiglot-7B-IT-EN** is a generative language model with 7B parameters for Italian and English and trained by the [Occiglot Research Collective](https://occiglot.github.io/occiglot/)..
 It is based on [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) and trained on 113B tokens of additional multilingual and code data with a block size of 8,192 tokens per sample.
-Note that the model is a general-purpose base model and was not instruction-fine-tuned nor optimized for chat or other applications. We make an instruction tuned variant available as [occiglot-7b-it-en-instruct](https://huggingface.co/occiglot/occiglot-7b-fr-en-instruct)
+Note that the model is a general-purpose base model and was not instruction-fine-tuned nor optimized for chat or other applications. We make an instruction tuned variant available as [occiglot-7b-it-en-instruct](https://huggingface.co/occiglot/occiglot-7b-it-en-instruct)
 
 This is the first release of an ongoing open research project for multilingual language models.
 If you want to train a model for your own language or are working on evaluations, please contact us or join our [Discord server](https://discord.gg/wUpvYs4XvM). **We are open for collaborations!**
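
For context on the corrected link, here is a minimal loading sketch. It assumes the standard Hugging Face `transformers` AutoModel API; the model ID is taken from the corrected URL, and the Italian prompt is only a hypothetical example, not official usage from the model card.

```python
# Minimal sketch: loading the instruct variant referenced by the corrected link.
# Assumes the standard transformers API; generation settings are illustrative only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "occiglot/occiglot-7b-it-en-instruct"  # ID from the corrected README link

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Qual è la capitale d'Italia?"  # hypothetical Italian prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```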