Strongly suggest -> Recommend
README.md CHANGED
@@ -49,7 +49,7 @@ pip install git+https://github.com/huggingface/transformers.git
 
 Since ModernBERT is a Masked Language Model (MLM), you can use the `fill-mask` pipeline or load it via `AutoModelForMaskedLM`. To use ModernBERT for downstream tasks like classification, retrieval, or QA, fine-tune it following standard BERT fine-tuning recipes.
 
-**⚠️ We strongly suggest using ModernBERT with Flash Attention 2 to reach the highest efficiency. To do so, install Flash Attention as follows, then use the model as normal:**
+**⚠️ We recommend using ModernBERT with Flash Attention 2 to reach the highest efficiency. To do so, install Flash Attention as follows, then use the model as normal:**
 
 ```bash
 pip install flash-attn
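
For reference, a minimal sketch of the `fill-mask` usage mentioned in the README excerpt above. The `answerdotai/ModernBERT-base` checkpoint name and the example sentence are assumptions for illustration, not part of this diff:

```python
from transformers import pipeline

# Assumed checkpoint name; substitute the ModernBERT checkpoint this README documents.
fill_mask = pipeline("fill-mask", model="answerdotai/ModernBERT-base")

# ModernBERT is an MLM, so it predicts the token behind [MASK].
predictions = fill_mask("Paris is the [MASK] of France.")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```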
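
Similarly, a hedged sketch of what "use the model as normal" with Flash Attention 2 could look like after `pip install flash-attn`. The checkpoint name is again an assumption, and `attn_implementation="flash_attention_2"` / `torch_dtype` are standard `transformers` loading options rather than anything introduced by this PR:

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "answerdotai/ModernBERT-base"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,              # FP16/BF16 is required for Flash Attention 2
    attn_implementation="flash_attention_2",  # enable the flash-attn kernels installed above
).to("cuda")

inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt").to("cuda")
with torch.no_grad():
    logits = model(**inputs).logits

# Decode the top prediction at the masked position.
mask_idx = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(logits[0, mask_idx].argmax(dim=-1)))
```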