cosimoiaia committed 2823e8b (parent: 7e6eb70): Update README.md

README.md CHANGED
@@ -71,7 +71,7 @@ model = AutoModelForCausalLM.from_pretrained(
 
 Loquace-20B was trained on a conversational dataset comprising 102k question/answer pairs in Italian language.
 The training data was constructed by putting together translations from the original alpaca Dataset and other sources like the OpenAssistant dataset.
-The model was trained for only 3000 iterations and took
+The model was trained for only 3000 iterations and took 68 hours on 4 RTX 3090, kindly provided by Genesis Cloud. (https://gnsiscld.co/26qhlf)
 
 ## Limitations
 
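The README notes that the training data was built from translations of the original Alpaca dataset. As a minimal sketch of what one such Italian question/answer pair might look like when rendered into a training prompt, here is the conventional Alpaca instruction template; the template text and the `build_prompt` helper are assumptions for illustration, not code taken from this repository:

```python
def build_prompt(instruction: str, response: str = "") -> str:
    """Render an Alpaca-style instruction/response pair into a single
    training prompt. The English template header is the common Alpaca
    convention; this repo may use a different (e.g. Italian) variant."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        f"### Response:\n{response}"
    )

# Hypothetical Italian Q/A pair, formatted as one training example.
prompt = build_prompt(
    "Qual è la capitale d'Italia?",
    "La capitale d'Italia è Roma.",
)
print(prompt)
```

Leaving `response` empty produces the inference-time prompt, so the same helper covers both training and generation.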