atsuki-yamaguchi committed
Commit 720f1c0
Parent: d602120

Upload README.md with huggingface_hub

Files changed (1): README.md (+2 -2)
README.md CHANGED
@@ -6,7 +6,7 @@ language:
 base_model: google/gemma-2-9b
 library_name: transformers
 ---
-# Gemma2 9B for Sinhala: 1000 target vocabulary size + Mean target vocabulary initialization + T&B2LS/MTP/512 training
+# Gemma2 9B for Sinhala: 1000 target vocabulary size + Mean target vocabulary initialization + 2x2LS/MTP/512 training
 
 This model is built on top of Gemma2 9B adapted for Sinhala using 30K target language sentences sampled from CC-100.
 
@@ -14,7 +14,7 @@ This model is built on top of Gemma2 9B adapted for Sinhala using 30K target lan
 
 * **Vocabulary**: This model has an additional 1000 target vocabulary.
 * **Target vocabulary initialization**: The target weights of the embedding were initialized using Mean initialization.
-* **Training**: This model was additionally pre-trained on 30K target language sentences sampled from CC-100. The training was conducted with the T&B2LS/MTP/512 strategies introduced in the paper.
+* **Training**: This model was additionally pre-trained on 30K target language sentences sampled from CC-100. The training was conducted with the 2x2LS/MTP/512 strategies introduced in the paper.
 
 ## Model Description
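The "Mean initialization" mentioned in the README can be sketched roughly as follows: each of the new target-vocabulary embedding rows is set to the mean of the existing source-vocabulary rows. This is an illustrative NumPy sketch, not the paper's actual code; `mean_init_embeddings` and the toy dimensions are assumptions for demonstration.

```python
import numpy as np

def mean_init_embeddings(embeddings: np.ndarray, num_new_tokens: int) -> np.ndarray:
    """Append rows for new target-language tokens, each initialized
    to the mean of the existing (source) embedding rows."""
    mean_row = embeddings.mean(axis=0, keepdims=True)       # shape (1, dim)
    new_rows = np.repeat(mean_row, num_new_tokens, axis=0)  # shape (num_new_tokens, dim)
    return np.concatenate([embeddings, new_rows], axis=0)

# Toy example: extend a 4-token embedding table by 2 new tokens
emb = np.arange(12, dtype=np.float32).reshape(4, 3)
extended = mean_init_embeddings(emb, 2)
assert extended.shape == (6, 3)
assert np.allclose(extended[4], emb.mean(axis=0))
```

In practice the same effect is typically achieved by resizing the model's embedding matrix after extending the tokenizer, then overwriting the freshly added rows with the mean of the original rows.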