# nucleotide-transformer-2.5b-1000g_ft_BioS74_1kbpHG19_DHSs_H3K27AC
This model is a fine-tuned version of [InstaDeepAI/nucleotide-transformer-2.5b-1000g](https://huggingface.co/InstaDeepAI/nucleotide-transformer-2.5b-1000g) on the None dataset. It achieves the following results on the evaluation set:

- Loss: 0.4403
- F1 Score: 0.8334
- Precision: 0.8158
- Recall: 0.8518
- Accuracy: 0.8217
- AUC: 0.9054
- PRC: 0.9023
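As a quick consistency check, the reported F1 score is the harmonic mean of the reported precision and recall; a minimal sketch:

```python
# F1 is the harmonic mean of precision and recall; recomputing it from the
# evaluation metrics above reproduces the reported value.
precision, recall = 0.8158, 0.8518
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.8334, matching the F1 Score reported above
```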
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
### Training results

| Training Loss | Epoch  | Step | Validation Loss | F1 Score | Precision | Recall | Accuracy | AUC    | PRC    |
|:-------------:|:------:|:----:|:---------------:|:--------:|:---------:|:------:|:--------:|:------:|:------:|
| 0.5156        | 0.1314 | 500  | 0.4243          | 0.8280   | 0.7707    | 0.8945 | 0.8054   | 0.8816 | 0.8757 |
| 0.4750        | 0.2629 | 1000 | 0.5234          | 0.7654   | 0.8545    | 0.6931 | 0.7775   | 0.8892 | 0.8824 |
| 0.4596        | 0.3943 | 1500 | 0.4114          | 0.8220   | 0.8092    | 0.8353 | 0.8107   | 0.8921 | 0.8890 |
| 0.4396        | 0.5258 | 2000 | 0.4771          | 0.8335   | 0.7846    | 0.8890 | 0.8141   | 0.8967 | 0.8938 |
| 0.4142        | 0.6572 | 2500 | 0.4196          | 0.8322   | 0.8023    | 0.8644 | 0.8175   | 0.9005 | 0.8979 |
| 0.4560        | 0.7886 | 3000 | 0.4024          | 0.8299   | 0.8291    | 0.8307 | 0.8217   | 0.8987 | 0.8942 |
| 0.4295        | 0.9201 | 3500 | 0.4403          | 0.8334   | 0.8158    | 0.8518 | 0.8217   | 0.9054 | 0.9023 |
### Framework versions
- Transformers 4.42.3
- Pytorch 2.3.0+cu121
- Datasets 2.18.0
- Tokenizers 0.19.0