# nucleotide-transformer-2.5b-1000g_ft_BioS2_1kbpHG19_DHSs_H3K27AC

This model is a fine-tuned version of [InstaDeepAI/nucleotide-transformer-2.5b-1000g](https://huggingface.co/InstaDeepAI/nucleotide-transformer-2.5b-1000g) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.3818
- F1 Score: 0.8223
- Precision: 0.9051
- Recall: 0.7534
- Accuracy: 0.8300
- AUC: 0.9341
- PRC: 0.9327
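As a quick consistency check (this calculation is not part of the card itself), the reported F1 score is the harmonic mean of the reported precision and recall:

```python
# F1 is the harmonic mean of precision and recall.
precision = 0.9051
recall = 0.7534

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.8223, matching the reported F1 Score
```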
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
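With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate decays linearly from 1e-05 toward zero over the total number of training steps. A minimal sketch of that schedule; the `total_steps` value used below is hypothetical, since the card does not state it:

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 1e-05) -> float:
    """Linearly decay the learning rate from base_lr at step 0 to 0 at total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# With a hypothetical 100_000 total steps:
print(linear_lr(0, 100_000))       # 1e-05 at the start
print(linear_lr(50_000, 100_000))  # 5e-06 halfway through
```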
### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Score | Precision | Recall | Accuracy | AUC | PRC |
|:-------------|:------|:-----|:----------------|:---------|:----------|:-------|:---------|:-------|:-------|
| 0.5231 | 0.0841 | 500 | 0.4932 | 0.7420 | 0.8533 | 0.6564 | 0.7617 | 0.8781 | 0.8679 |
| 0.4788 | 0.1683 | 1000 | 0.4291 | 0.8277 | 0.7714 | 0.8930 | 0.8060 | 0.8849 | 0.8762 |
| 0.4409 | 0.2524 | 1500 | 0.4796 | 0.8422 | 0.7621 | 0.9410 | 0.8159 | 0.9048 | 0.8973 |
| 0.4323 | 0.3366 | 2000 | 0.4503 | 0.8469 | 0.7725 | 0.9371 | 0.8231 | 0.9065 | 0.8996 |
| 0.4113 | 0.4207 | 2500 | 0.4413 | 0.8422 | 0.7491 | 0.9616 | 0.8118 | 0.9144 | 0.9101 |
| 0.3870 | 0.5049 | 3000 | 0.3642 | 0.8534 | 0.8272 | 0.8814 | 0.8420 | 0.9213 | 0.9157 |
| 0.3932 | 0.5890 | 3500 | 0.3608 | 0.8576 | 0.8224 | 0.8959 | 0.8447 | 0.9247 | 0.9205 |
| 0.4066 | 0.6732 | 4000 | 0.3853 | 0.8652 | 0.8193 | 0.9165 | 0.8509 | 0.9249 | 0.9196 |
| 0.3944 | 0.7573 | 4500 | 0.3437 | 0.8563 | 0.8710 | 0.8420 | 0.8524 | 0.9295 | 0.9256 |
| 0.3753 | 0.8415 | 5000 | 0.4040 | 0.8614 | 0.8340 | 0.8907 | 0.8504 | 0.9278 | 0.9258 |
| 0.3826 | 0.9256 | 5500 | 0.3818 | 0.8223 | 0.9051 | 0.7534 | 0.8300 | 0.9341 | 0.9327 |
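To score a new sequence with the fine-tuned classifier, something along the following lines should work. This is a sketch, not a tested recipe: the repository id is a placeholder (the hosting namespace is not stated in the card), and the assumption that label index 1 is the H3K27ac-positive class should be checked against the checkpoint's `id2label` config.

```python
def is_valid_dna(seq: str) -> bool:
    # The model was fine-tuned on 1 kbp hg19 windows, so inputs should be
    # plain A/C/G/T (plus N) strings.
    return bool(seq) and set(seq.upper()) <= set("ACGTN")

def predict_positive_prob(seq: str, model_id: str) -> float:
    # Imported lazily so is_valid_dna can be used without downloading
    # the 2.5B-parameter checkpoint.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    model.eval()

    inputs = tokenizer(seq, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Assumption: index 1 is the positive (H3K27ac) class; verify via id2label.
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Hypothetical usage (replace the placeholder with the real repository id):
# p = predict_positive_prob("ACGT" * 250,
#         "<your-namespace>/nucleotide-transformer-2.5b-1000g_ft_BioS2_1kbpHG19_DHSs_H3K27AC")
```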
### Framework versions

- Transformers 4.42.3
- PyTorch 2.3.0+cu121
- Datasets 2.18.0
- Tokenizers 0.19.0