navigli committed
Commit 0d1c53a
1 Parent(s): 12f220c

Update README.md

Files changed (1): README.md +3 -3
README.md CHANGED
@@ -30,7 +30,7 @@ library_name: transformers
 # Model Card for Minerva-7B-instruct-v1.0
 
 Minerva is the first family of **LLMs pretrained from scratch on Italian** developed by [Sapienza NLP](https://nlp.uniroma1.it)
-in collaboration with [Future Artificial Intelligence Research (FAIR)](https://fondazione-fair.it/) and [CINECA](https://www.cineca.it/).
+in the context of the [Future Artificial Intelligence Research (FAIR)](https://fondazione-fair.it/) project, in collaboration with [CINECA](https://www.cineca.it/) and with additional contributions from [Babelscape](https://babelscape.com) and the CREATIVE PRIN Project.
 Notably, the Minerva models are truly-open (data and model) Italian-English LLMs, with approximately half of the pretraining data
 including Italian text.
 
@@ -251,7 +251,7 @@ Minerva-7B-base-v1.0 is a pretrained base model and, therefore, has no moderatio
 ## The Sapienza NLP Team
 
 ### 🧭 Project Lead and Coordination
-* __Roberto Navigli__: project lead and coordinator of the [FAIR Transversal Project 2](https://fondazione-fair.it/transversal-projects/tp2-vision-language-and-multimodal-challenges/), director of the [Sapienza NLP Group](https://nlp.uniroma1.it), full professor at Sapienza University of Rome; model analysis, evaluation and selection, safety and guardrailing, dialogue.
+* __Roberto Navigli__: project lead and coordination; model analysis, evaluation and selection, safety and guardrailing, conversations.
 
 ### 🤖 Model Development
 * __Edoardo Barba__: pre-training, post-training, data analysis, prompt engineering.
@@ -275,6 +275,6 @@ Minerva-7B-base-v1.0 is a pretrained base model and, therefore, has no moderatio
 
 ## Acknowledgments
 
-This work was funded by the PNRR MUR project [PE0000013-FAIR](https://fondazione-fair.it) and the CREATIVE project, which is funded by the MUR Progetti di
+This work was funded by the PNRR MUR project [PE0000013-FAIR](https://fondazione-fair.it) and the CREATIVE PRIN project, which is funded by the MUR Progetti di
 Rilevante Interesse Nazionale programme (PRIN 2020).
 We acknowledge the [CINECA](https://www.cineca.it) award "IscB_medit" under the ISCRA initiative for the availability of high-performance computing resources and support.