Commit 6834a70
Parent(s): 3161c84
Update README.md
README.md CHANGED
@@ -24,7 +24,9 @@ The IGEL family includes `instruct-igel-001` and `chat-igel-001` _coming soon_.
 
 ## Model Description
 
-LoRA tuned [BLOOM-CLP German (6.4B parameters)](https://huggingface.co/malteos/bloom-6b4-clp-german) with merged weights.
+LoRA tuned [BLOOM-CLP German (6.4B parameters)](https://huggingface.co/malteos/bloom-6b4-clp-german) with merged weights. The `001` model was designed as a naive test to determine whether it is possible to create a German instruction-tuned model using a small, undertrained LLM and a naively translated dataset. The goal of this test was to explore the potential of the BLOOM architecture for language modeling tasks that require instruction-based responses.
+
+To achieve this goal, we took a pre-trained LLM with limited training and fine-tuned it on a dataset of naive translations of instruction-based content. The dataset was created by taking instructions in English and translating them into German with an automated translation tool. While this approach may introduce errors into the translated content, we wanted to test whether the model could still learn to generate instruction-based responses in a variety of languages.
 
 ## Training data
 
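The "merged weights" phrasing in the diff refers to folding the trained LoRA update back into the frozen base matrices, so the published checkpoint needs no separate adapter at inference time. A minimal NumPy sketch of that merge (the dimensions, rank, and `alpha` scaling here are made up for illustration; this is not the actual training code):

```python
import numpy as np

# LoRA learns a low-rank update B @ A (rank r << d) on top of a frozen weight W.
d, r, alpha = 8, 2, 16            # hypothetical sizes and scaling, not the model's
rng = np.random.default_rng(0)

W = rng.standard_normal((d, d))   # frozen base weight matrix
A = rng.standard_normal((r, d)) * 0.01
B = rng.standard_normal((d, r)) * 0.01

# With the adapter attached, a forward pass uses W + (alpha / r) * B @ A.
# "Merging" precomputes that sum once, so the deployed model is a plain
# dense checkpoint with no adapter weights left to load.
W_merged = W + (alpha / r) * (B @ A)

# The merged matrix gives the same outputs as base + adapter applied separately:
x = rng.standard_normal(d)
assert np.allclose(x @ W_merged.T, x @ W.T + (alpha / r) * x @ (B @ A).T)
```

In practice this merge is what adapter libraries perform when exporting a LoRA-tuned model as a standalone checkpoint, which is why the card can link a single set of merged weights.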
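The dataset-construction step described above (translating English instructions into German with an automated tool) amounts to a per-field mapping over each record. The Alpaca-style record layout and the `translate` callable below are assumptions for illustration; the README does not name the actual dataset format or translation tool:

```python
def translate_record(record: dict, translate) -> dict:
    """Translate every non-empty free-text field of one instruction record.

    `translate` stands in for the automated EN->DE translation tool; any
    machine-translation API with a str -> str interface would fit here.
    """
    return {key: translate(value) if value else value
            for key, value in record.items()}

# Dummy "translator" so the sketch runs without a real MT system:
fake_mt = lambda text: "[DE] " + text

record = {
    "instruction": "Summarize the following text.",
    "input": "",               # empty fields are passed through untouched
    "output": "Here is a short summary.",
}
german = translate_record(record, fake_mt)
print(german["instruction"])   # [DE] Summarize the following text.
```

As the README notes, a naive pass like this preserves the record structure but inherits any errors the translation tool makes, which is exactly the trade-off the `001` experiment set out to measure.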