dnabert2_ft_BioS2_1kbpHG19_DHSs_H3K27AC

This model is a fine-tuned version of vivym/DNABERT-2-117M on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5274
  • F1 Score: 0.8245
  • Precision: 0.7496
  • Recall: 0.9160
  • Accuracy: 0.7937
  • AUC: 0.8768
  • PRC: 0.8713
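
A minimal usage sketch for this checkpoint, assuming it loads as a binary sequence classifier through the standard Transformers auto classes (DNABERT-2 checkpoints ship custom modeling code, hence `trust_remote_code=True`); the repo id is taken from this card and the example sequence is arbitrary:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumption: the fine-tuned head is exposed as a two-class sequence classifier.
model_id = "tanoManzo/dnabert2_ft_BioS2_1kbpHG19_DHSs_H3K27AC"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForSequenceClassification.from_pretrained(model_id, trust_remote_code=True)
model.eval()

# Arbitrary illustrative DNA sequence; real inputs would be ~1 kbp regions.
sequence = "ACGTAGCATCGGATCTATCTATCGACACTTGGTTATCGATCTACGAGCATCTCGTTAGC"
inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
prob_positive = torch.softmax(logits, dim=-1)[0, 1].item()
print(f"P(positive class) = {prob_positive:.3f}")
```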

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
  • mixed_precision_training: Native AMP
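
These settings map directly onto the Hugging Face Trainer. The sketch below is a hypothetical reconstruction of that configuration, not the actual training script; `output_dir` and the 500-step eval/logging cadence (read off the results table below) are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="dnabert2_ft_BioS2_1kbpHG19_DHSs_H3K27AC",  # illustrative
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    fp16=True,                 # "Native AMP" mixed-precision training
    eval_strategy="steps",     # assumption: evaluation every 500 steps
    eval_steps=500,
    logging_steps=500,
)
```

The Adam settings listed above (betas=(0.9, 0.999), epsilon=1e-08) are the Trainer's optimizer defaults, so they need no explicit arguments here.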

Training results

| Training Loss | Epoch  | Step  | Validation Loss | F1 Score | Precision | Recall | Accuracy | AUC    | PRC    |
|---------------|--------|-------|-----------------|----------|-----------|--------|----------|--------|--------|
| 0.5965        | 0.0842 | 500   | 0.5836          | 0.7668   | 0.6494    | 0.9360 | 0.6988   | 0.7875 | 0.7661 |
| 0.574         | 0.1684 | 1000  | 0.5463          | 0.7731   | 0.6977    | 0.8669 | 0.7308   | 0.8030 | 0.7876 |
| 0.5599        | 0.2527 | 1500  | 0.6162          | 0.7749   | 0.7024    | 0.8641 | 0.7344   | 0.8063 | 0.7893 |
| 0.5516        | 0.3369 | 2000  | 0.5434          | 0.7780   | 0.6705    | 0.9265 | 0.7202   | 0.8150 | 0.8028 |
| 0.5542        | 0.4211 | 2500  | 0.5759          | 0.6427   | 0.8036    | 0.5355 | 0.6850   | 0.8150 | 0.7996 |
| 0.5508        | 0.5053 | 3000  | 0.5854          | 0.7736   | 0.6496    | 0.9561 | 0.7039   | 0.8153 | 0.8044 |
| 0.5431        | 0.5895 | 3500  | 0.5414          | 0.7814   | 0.7095    | 0.8695 | 0.7426   | 0.8196 | 0.8113 |
| 0.5416        | 0.6737 | 4000  | 0.5594          | 0.7875   | 0.7053    | 0.8914 | 0.7455   | 0.8224 | 0.8094 |
| 0.5379        | 0.7580 | 4500  | 0.5209          | 0.7877   | 0.7217    | 0.8669 | 0.7527   | 0.8278 | 0.8183 |
| 0.5364        | 0.8422 | 5000  | 0.5591          | 0.7885   | 0.7057    | 0.8933 | 0.7465   | 0.8323 | 0.8217 |
| 0.5411        | 0.9264 | 5500  | 0.5144          | 0.7876   | 0.6954    | 0.9080 | 0.7409   | 0.8329 | 0.8240 |
| 0.528         | 1.0106 | 6000  | 0.5883          | 0.7817   | 0.6575    | 0.9637 | 0.7152   | 0.8338 | 0.8214 |
| 0.4991        | 1.0948 | 6500  | 0.5155          | 0.7943   | 0.7291    | 0.8723 | 0.7610   | 0.8390 | 0.8247 |
| 0.515         | 1.1790 | 7000  | 0.5264          | 0.7915   | 0.7220    | 0.8758 | 0.7559   | 0.8199 | 0.8015 |
| 0.5211        | 1.2633 | 7500  | 0.5094          | 0.7973   | 0.6964    | 0.9325 | 0.7492   | 0.8454 | 0.8374 |
| 0.493         | 1.3475 | 8000  | 0.5053          | 0.8015   | 0.7213    | 0.9016 | 0.7637   | 0.8468 | 0.8387 |
| 0.5037        | 1.4317 | 8500  | 0.5015          | 0.8001   | 0.6987    | 0.9360 | 0.7526   | 0.8518 | 0.8417 |
| 0.4963        | 1.5159 | 9000  | 0.5154          | 0.7934   | 0.7676    | 0.8211 | 0.7738   | 0.8484 | 0.8398 |
| 0.4835        | 1.6001 | 9500  | 0.4856          | 0.8062   | 0.7250    | 0.9080 | 0.7691   | 0.8545 | 0.8482 |
| 0.4921        | 1.6844 | 10000 | 0.4796          | 0.7967   | 0.7762    | 0.8182 | 0.7790   | 0.8575 | 0.8475 |
| 0.4697        | 1.7686 | 10500 | 0.4897          | 0.8113   | 0.7287    | 0.9150 | 0.7748   | 0.8609 | 0.8561 |
| 0.4857        | 1.8528 | 11000 | 0.4694          | 0.8122   | 0.7553    | 0.8784 | 0.7851   | 0.8613 | 0.8545 |
| 0.4837        | 1.9370 | 11500 | 0.4648          | 0.8085   | 0.7753    | 0.8446 | 0.7883   | 0.8654 | 0.8592 |
| 0.4438        | 2.0212 | 12000 | 0.4683          | 0.8151   | 0.7404    | 0.9064 | 0.7824   | 0.8570 | 0.8466 |
| 0.4555        | 2.1054 | 12500 | 0.4589          | 0.8186   | 0.7600    | 0.8870 | 0.7920   | 0.8711 | 0.8681 |
| 0.4458        | 2.1897 | 13000 | 0.4698          | 0.8179   | 0.7510    | 0.8978 | 0.7884   | 0.8706 | 0.8649 |
| 0.4598        | 2.2739 | 13500 | 0.4631          | 0.7870   | 0.8043    | 0.7705 | 0.7793   | 0.8707 | 0.8660 |
| 0.4601        | 2.3581 | 14000 | 0.4866          | 0.8186   | 0.7404    | 0.9153 | 0.7854   | 0.8722 | 0.8662 |
| 0.4675        | 2.4423 | 14500 | 0.4677          | 0.8199   | 0.7302    | 0.9347 | 0.7827   | 0.8709 | 0.8599 |
| 0.4534        | 2.5265 | 15000 | 0.4512          | 0.8158   | 0.7600    | 0.8803 | 0.7896   | 0.8695 | 0.8644 |
| 0.439         | 2.6107 | 15500 | 0.4580          | 0.8281   | 0.7589    | 0.9112 | 0.7999   | 0.8748 | 0.8694 |
| 0.4479        | 2.6950 | 16000 | 0.4673          | 0.8151   | 0.7968    | 0.8341 | 0.7997   | 0.8774 | 0.8722 |
| 0.4468        | 2.7792 | 16500 | 0.4575          | 0.8144   | 0.7865    | 0.8443 | 0.7964   | 0.8731 | 0.8703 |
| 0.4387        | 2.8634 | 17000 | 0.4576          | 0.8171   | 0.7831    | 0.8542 | 0.7977   | 0.8774 | 0.8727 |
| 0.426         | 2.9476 | 17500 | 0.4615          | 0.8229   | 0.7726    | 0.8803 | 0.7996   | 0.8768 | 0.8724 |
| 0.4259        | 3.0318 | 18000 | 0.5274          | 0.8245   | 0.7496    | 0.9160 | 0.7937   | 0.8768 | 0.8713 |
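
Note that the log ends at epoch 3.03 (step 18000) despite num_epochs being 20, which suggests training was stopped early. The reported metrics are consistent with standard binary-classification definitions; the sketch below is an assumption about how they could be computed with scikit-learn (reading AUC as ROC-AUC and PRC as average precision, i.e. area under the precision-recall curve), not the script actually used:

```python
import numpy as np
from sklearn.metrics import (
    accuracy_score, average_precision_score, f1_score,
    precision_score, recall_score, roc_auc_score,
)

def compute_metrics(eval_pred):
    """Binary-classification metrics matching the table columns above."""
    logits, labels = eval_pred
    # Numerically stable softmax over the two class logits.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=-1, keepdims=True)
    pos_probs = probs[:, 1]
    preds = np.argmax(logits, axis=-1)
    return {
        "f1": f1_score(labels, preds),
        "precision": precision_score(labels, preds),
        "recall": recall_score(labels, preds),
        "accuracy": accuracy_score(labels, preds),
        "auc": roc_auc_score(labels, pos_probs),            # ROC-AUC
        "prc": average_precision_score(labels, pos_probs),  # area under PR curve
    }
```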

Framework versions

  • Transformers 4.46.0.dev0
  • Pytorch 2.4.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.20.0