Update README.md
README.md CHANGED
@@ -24,10 +24,10 @@ library_name: transformers
 ---

 <div style="text-align: center; display: flex; flex-direction: column; align-items: center;">
-  <img src="https://
+  <img src="https://huggingface.co/sapienzanlp/Minerva-7B-instruct-v1.0/resolve/main/minerva-logo.png" style="max-width: 550px; height: auto;">
 </div>

-# Model Card for Minerva-7B-
+# Model Card for Minerva-7B-instruct-v1.0

 Minerva is the first family of **LLMs pretrained from scratch on Italian** developed by [Sapienza NLP](https://nlp.uniroma1.it)
 in collaboration with [Future Artificial Intelligence Research (FAIR)](https://fondazione-fair.it/) and [CINECA](https://www.cineca.it/).

@@ -47,8 +47,7 @@ This model is part of the Minerva LLM family:
 * [Minerva-1B-base-v1.0](https://huggingface.co/sapienzanlp/Minerva-1B-base-v1.0)
 * [Minerva-3B-base-v1.0](https://huggingface.co/sapienzanlp/Minerva-3B-base-v1.0)
 * [Minerva-7B-base-v1.0](https://huggingface.co/sapienzanlp/Minerva-7B-base-v1.0-1110)
-* [Minerva-7B-
-* [Minerva-7B-base-v1.0-dpo](https://huggingface.co/sapienzanlp/Minerva-7B-cpt-v1.0-mixed_recipe7-3epochs-safety-handcraft-DPO-alert-uf-evol-temp)
+* [Minerva-7B-instruct-v1.0](https://huggingface.co/sapienzanlp/Minerva-7B-instruct-v1.0)

 ## 🚨⚠️🚨 Bias, Risks, and Limitations 🚨⚠️🚨

@@ -78,7 +77,7 @@ For more information about this issue, please refer to our survey:
 import transformers
 import torch

-model_id = "sapienzanlp/Minerva-7B-
+model_id = "sapienzanlp/Minerva-7B-instruct-v1.0"

 # Initialize the pipeline.
 pipeline = transformers.pipeline(
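Since the last hunk cuts the usage snippet off at `transformers.pipeline(`, here is a minimal sketch of how the updated `model_id` would plug into the standard 🤗 Transformers text-generation pipeline. The chat message and generation parameters below are illustrative assumptions, not taken from the model card.

```python
import transformers
import torch

model_id = "sapienzanlp/Minerva-7B-instruct-v1.0"

# Initialize the text-generation pipeline with the updated model id.
pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

# Chat-style input for the instruct model (illustrative prompt).
messages = [
    {"role": "user", "content": "Qual è la capitale dell'Italia?"},
]

# Generate a short reply (parameters are assumptions; adjust as needed).
outputs = pipeline(messages, max_new_tokens=128, do_sample=False)
print(outputs[0]["generated_text"])
```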