---
library_name: transformers
license: mit
base_model: m3rg-iitd/matscibert
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: MatSciBERT_ST_DA_100
  results: []
---

# MatSciBERT_ST_DA_100

This model is a fine-tuned version of [m3rg-iitd/matscibert](https://huggingface.co/m3rg-iitd/matscibert) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2043
- Precision: 0.9627
- Recall: 0.9693
- F1: 0.9660
- Accuracy: 0.9561

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log        | 1.0   | 59   | 0.2685          | 0.9263    | 0.9420 | 0.9341 | 0.9213   |
| No log        | 2.0   | 118  | 0.1935          | 0.9477    | 0.9573 | 0.9524 | 0.9429   |
| No log        | 3.0   | 177  | 0.2043          | 0.9558    | 0.9669 | 0.9613 | 0.9506   |
| No log        | 4.0   | 236  | 0.1769          | 0.9596    | 0.9701 | 0.9648 | 0.9554   |
| No log        | 5.0   | 295  | 0.1789          | 0.9619    | 0.9686 | 0.9652 | 0.9561   |
| No log        | 6.0   | 354  | 0.1916          | 0.9620    | 0.9683 | 0.9651 | 0.9557   |
| No log        | 7.0   | 413  | 0.1955          | 0.9623    | 0.9685 | 0.9654 | 0.9559   |
| No log        | 8.0   | 472  | 0.2002          | 0.9627    | 0.9713 | 0.9670 | 0.9575   |
| 0.1044        | 9.0   | 531  | 0.2033          | 0.9632    | 0.9698 | 0.9665 | 0.9566   |
| 0.1044        | 10.0  | 590  | 0.2043          | 0.9627    | 0.9693 | 0.9660 | 0.9561   |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
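
## Example usage

The card does not state the downstream task, but the per-epoch precision/recall/F1/accuracy metrics above are typical of a token-classification (NER-style) fine-tune. Below is a minimal inference sketch under that assumption; the checkpoint path is a placeholder for wherever this model is stored (a local training output directory or a Hub repo id).

```python
# Minimal inference sketch, assuming a token-classification head.
# "path/to/MatSciBERT_ST_DA_100" is a placeholder for the actual checkpoint location.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

checkpoint = "path/to/MatSciBERT_ST_DA_100"  # placeholder path

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(checkpoint)

# aggregation_strategy="simple" merges word-piece predictions into word-level entities
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

print(ner("The LiFePO4 cathode was annealed at 700 °C for 6 h."))
```

If the checkpoint is in fact a sequence-classification fine-tune, swap `AutoModelForTokenClassification` for `AutoModelForSequenceClassification` and use the `"text-classification"` pipeline instead.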