Omartificial-Intelligence-Space committed
Commit ba29fd4
1 Parent(s): ba61529

Update readme.md

Files changed (1)
  1. README.md +21 -2
README.md CHANGED
@@ -131,9 +131,9 @@ widget:
  license: apache-2.0
  ---

- # GATE-AraBert-v1
+ # GATE-AraBert-V1

- This is a General Arabic Text Embedding trained using SentenceTransformers in a multi-task setup. The system trains on the AllNLI and on the STS dataset.
+ This is **GATE | General Arabic Text Embedding**, trained using SentenceTransformers in a **multi-task** setup on the **AllNLI** and **STS** datasets.

  ## Model Details

@@ -221,3 +221,22 @@ print(similarities.shape)
  | spearman_max | 0.8173 |


+ ## <span style="color:blue">Acknowledgments</span>
+
+ The authors would like to thank Prince Sultan University for their invaluable support of this project. Their contributions and resources have been instrumental in the development and fine-tuning of these models.
+
+ ## Citation
+
+ If you use GATE-AraBert-v1, please cite it as follows:
+
+ ```bibtex
+ @misc{nacar2025GATE,
+   title={GATE: General Arabic Text Embedding for Enhanced Semantic Textual Similarity with Hybrid Loss Training},
+   author={Omer Nacar, Anis Koubaa, Serry Taiseer Sibaee and Lahouari Ghouti},
+   year={2025},
+   note={Submitted to COLING 2025},
+   url={https://huggingface.co/Omartificial-Intelligence-Space/GATE-AraBert-v1},
+ }
+ ```
+
+
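The second hunk header above references `print(similarities.shape)` from the README's existing usage snippet. For context only (this is not part of the commit), here is a minimal sketch of that kind of usage, assuming the standard sentence-transformers API (`SentenceTransformer`, `encode`, `similarity`) and the model id taken from the citation URL; the example sentences are illustrative.

```python
# Minimal usage sketch for the model described in this diff (assumed workflow,
# not the commit's own code). Requires: pip install sentence-transformers
from sentence_transformers import SentenceTransformer

# Model id taken from the citation URL in the added README section.
model = SentenceTransformer("Omartificial-Intelligence-Space/GATE-AraBert-v1")

# Illustrative Arabic sentences (not from the original README).
sentences = [
    "الطقس جميل اليوم",      # "The weather is nice today"
    "الجو رائع هذا اليوم",   # "The weather is wonderful today"
    "أحب قراءة الكتب",       # "I love reading books"
]

embeddings = model.encode(sentences)                      # one embedding per sentence
similarities = model.similarity(embeddings, embeddings)   # pairwise cosine similarity matrix
print(similarities.shape)                                 # (3, 3)
```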