---
license: apache-2.0
pipeline_tag: text-generation
tags:
- pretrained
- mistral
- 7b
inference: true
widget:
- messages:
  - role: user
    content: What is your favorite condiment?
---
# Model Card for Mistral-7B-v0.2
The Mistral-7B-Instruct-v0.2 Large Language Model (LLM) was fine-tuned on top of Mistral-7B-v0.2.

Mistral-7B-v0.2 has the following changes compared to Mistral-7B-v0.1:
- 32k context window (vs 8k context in v0.1)
- Rope-theta = 1e6
- No Sliding-Window Attention
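The rope-theta change can be sketched numerically. The snippet below is an illustrative stand-alone computation, not the model's implementation: it computes the rotary-embedding inverse frequencies for the v0.1 base (10,000) and the v0.2 base (1e6), assuming Mistral-7B's head dimension of 128 (4096 hidden size / 32 heads).

```python
import math

def rope_inv_freq(head_dim: int, theta: float) -> list:
    # Standard RoPE inverse frequencies:
    # inv_freq[i] = theta ** (-2i / head_dim), for i in 0 .. head_dim/2 - 1
    return [theta ** (-2.0 * i / head_dim) for i in range(head_dim // 2)]

# Illustrative values: head_dim = 128 is Mistral-7B's per-head dimension.
v01 = rope_inv_freq(128, 10_000.0)  # v0.1 base
v02 = rope_inv_freq(128, 1e6)       # v0.2 base

# The larger base makes the high-index dimensions rotate more slowly,
# stretching the positional encoding over the longer 32k context window.
```

Comparing the slowest component shows the effect: with theta = 1e6 the final inverse frequency is far smaller than with theta = 10,000, so distant positions remain distinguishable across the extended context.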
For full details of this model, please read our [paper](https://arxiv.org/abs/2310.06825) and [release blog post](https://mistral.ai/news/la-plateforme/).