How to finetune with multiple GPUs
#13
opened by nlpdev3
How to finetune with multiple GPUs?
Hi, thanks a lot for your interest in INSTRUCTOR!
The following script should use all available GPUs to finetune the model:
python train.py --model_name_or_path sentence-transformers/gtr-t5-large --output_dir {output_directory} --cache_dir {cache_directory} --max_source_length 512 --num_train_epochs 10 --save_steps 500 --cl_temperature 0.01 --warmup_ratio 0.1 --learning_rate 2e-5 --overwrite_output_dir
You may also select specific GPUs by setting CUDA_VISIBLE_DEVICES=GPU_ids.
For more details, please refer to the training instructions.
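As a minimal sketch of the CUDA_VISIBLE_DEVICES mechanism mentioned above (the GPU ids 0 and 2 are placeholders; substitute the ids of the devices you want to use):

```shell
# Restrict subsequently launched processes, such as the train.py command
# above, to GPUs 0 and 2 (hypothetical ids).
export CUDA_VISIBLE_DEVICES=0,2
# Sanity check: child processes inherit the restriction.
python -c 'import os; print(os.environ["CUDA_VISIBLE_DEVICES"])'  # prints 0,2
```

With the variable exported, PyTorch enumerates only the listed devices, so the training script's automatic multi-GPU logic operates on that subset.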
nlpdev3 changed discussion status to closed