# nucleotide-transformer-2.5b-1000g_ft_BioS73_1kbpHG19_DHSs_H3K27AC
This model is a fine-tuned version of [InstaDeepAI/nucleotide-transformer-2.5b-1000g](https://huggingface.co/InstaDeepAI/nucleotide-transformer-2.5b-1000g). It achieves the following results on the evaluation set:
- Loss: 0.7317
- F1 Score: 0.8668
- Precision: 0.8656
- Recall: 0.8680
- Accuracy: 0.8576
- AUC: 0.9328
- PRC: 0.9309
## Model description
More information needed
## Intended uses & limitations
More information needed
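No official usage snippet is provided. The sketch below shows how a sequence-classification checkpoint like this one would typically be loaded with the standard `transformers` API; the repository id is assumed from the title, and the toy input sequence and binary label interpretation are assumptions, not documented behaviour.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repository id assumed from the model card title.
model_id = "nucleotide-transformer-2.5b-1000g_ft_BioS73_1kbpHG19_DHSs_H3K27AC"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Toy input; real inputs would presumably be ~1 kbp hg19 regions.
sequence = "ATGCGT" * 100

inputs = tokenizer(sequence, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Binary classification is assumed from the precision/recall metrics above.
prob = torch.softmax(logits, dim=-1)[0, 1].item()
print(f"P(positive class) = {prob:.3f}")
```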
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
- mixed_precision_training: Native AMP
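For reference, the hyperparameters above map directly onto `transformers.TrainingArguments`. The actual training script is not published, so the output directory below is an assumption; everything else restates the values listed above (the Adam betas and epsilon are the optimizer defaults).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="nucleotide-transformer-2.5b-1000g_ft_BioS73_1kbpHG19_DHSs_H3K27AC",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,  # native AMP mixed-precision training (requires a CUDA device)
)
```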
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 Score | Precision | Recall | Accuracy | AUC | PRC |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:--------:|:------:|:------:|
| 0.4388 | 0.1864 | 500 | 0.3842 | 0.8618 | 0.7952 | 0.9406 | 0.8390 | 0.9151 | 0.9091 |
| 0.386 | 0.3727 | 1000 | 0.4030 | 0.8661 | 0.8152 | 0.9239 | 0.8476 | 0.9233 | 0.9200 |
| 0.3811 | 0.5591 | 1500 | 0.3778 | 0.8694 | 0.8199 | 0.9253 | 0.8517 | 0.9252 | 0.9217 |
| 0.3818 | 0.7454 | 2000 | 0.3400 | 0.8699 | 0.8090 | 0.9406 | 0.8498 | 0.9322 | 0.9284 |
| 0.3575 | 0.9318 | 2500 | 0.4269 | 0.8626 | 0.7761 | 0.9707 | 0.8349 | 0.9321 | 0.9271 |
| 0.2826 | 1.1182 | 3000 | 0.6570 | 0.8760 | 0.8575 | 0.8953 | 0.8647 | 0.9340 | 0.9291 |
| 0.2169 | 1.3045 | 3500 | 0.5168 | 0.8732 | 0.8466 | 0.9015 | 0.8602 | 0.9260 | 0.9184 |
| 0.2012 | 1.4909 | 4000 | 0.3607 | 0.8748 | 0.8508 | 0.9001 | 0.8625 | 0.9346 | 0.9312 |
| 0.2245 | 1.6772 | 4500 | 0.6390 | 0.8765 | 0.8123 | 0.9518 | 0.8569 | 0.9309 | 0.9246 |
| 0.1985 | 1.8636 | 5000 | 0.7556 | 0.8738 | 0.8021 | 0.9595 | 0.8520 | 0.9296 | 0.9270 |
| 0.1625 | 2.0499 | 5500 | 0.8361 | 0.8614 | 0.8892 | 0.8352 | 0.8565 | 0.9355 | 0.9331 |
| 0.0565 | 2.2363 | 6000 | 0.8087 | 0.8783 | 0.8532 | 0.9050 | 0.8662 | 0.9344 | 0.9310 |
| 0.0746 | 2.4227 | 6500 | 0.8608 | 0.8800 | 0.8527 | 0.9092 | 0.8677 | 0.9296 | 0.9221 |
| 0.063 | 2.6090 | 7000 | 0.8631 | 0.8654 | 0.8663 | 0.8645 | 0.8565 | 0.9336 | 0.9304 |
| 0.0588 | 2.7954 | 7500 | 0.8914 | 0.8781 | 0.8455 | 0.9134 | 0.8647 | 0.9316 | 0.9269 |
| 0.0674 | 2.9817 | 8000 | 0.7317 | 0.8668 | 0.8656 | 0.8680 | 0.8576 | 0.9328 | 0.9309 |
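The metric columns are standard binary-classification measures. A `compute_metrics` function along the following lines would produce them with scikit-learn; this is a sketch, not the actual evaluation code, and it reads PRC as average precision (area under the precision-recall curve).

```python
import numpy as np
from sklearn.metrics import (
    accuracy_score,
    average_precision_score,
    f1_score,
    precision_score,
    recall_score,
    roc_auc_score,
)

def compute_metrics(eval_pred):
    """Sketch of a Trainer-compatible metric function for binary labels."""
    logits, labels = eval_pred
    # Probability of the positive class via a numerically stable softmax.
    exp = np.exp(logits - logits.max(axis=-1, keepdims=True))
    probs = (exp / exp.sum(axis=-1, keepdims=True))[:, 1]
    preds = (probs >= 0.5).astype(int)
    return {
        "f1": f1_score(labels, preds),
        "precision": precision_score(labels, preds),
        "recall": recall_score(labels, preds),
        "accuracy": accuracy_score(labels, preds),
        "auc": roc_auc_score(labels, probs),
        "prc": average_precision_score(labels, probs),
    }
```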
### Framework versions
- Transformers 4.42.3
- PyTorch 2.3.0+cu121
- Datasets 2.18.0
- Tokenizers 0.19.0