---
library_name: transformers
tags:
- BC5CDR-disease
- NER
license: apache-2.0
language:
- en
metrics:
- seqeval
base_model:
- distilbert/distilbert-base-uncased
---

# DistilBERT Fine-Tuned on BC5CDR-disease (NER)

A DistilBERT model fine-tuned for disease named entity recognition. It was trained on the train split of the BC5CDR-disease dataset taken from [BLURB](https://microsoft.github.io/BLURB/tasks.html).

## Model Details

### Model Sources

- **Repository:** https://github.com/kbulutozler/medical-llm-benchmark

## Training Details

### Training Data

Train split of the BC5CDR-disease dataset.

### Training Procedure

Classical (full-parameter) fine-tuning for token classification. A minimal training sketch using the hyperparameters below is given at the end of this card.

#### Training Hyperparameters

- learning_rate: 5e-5
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- num_train_epochs: 3
- weight_decay: 0.01

## Evaluation

### Testing Data

Test split of the BC5CDR-disease dataset.

### Results

Entity-level scores computed with seqeval (see the evaluation sketch at the end of this card):

- Precision: 0.76
- Recall: 0.81
- Micro-F1: 0.79

## Environmental Impact

- **Hardware Type:** 1x RTX A4000
- **Hours used:** ~0.12 hours (00:07:00)
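
## How to Get Started with the Model

The fine-tuned weights can be used through the `transformers` token-classification pipeline. The repository id below (`your-username/distilbert-bc5cdr-disease`) is a placeholder; replace it with the actual Hub id of this model.

```python
from transformers import pipeline

# Placeholder repository id: substitute the Hub id this card belongs to.
ner = pipeline(
    "token-classification",
    model="your-username/distilbert-bc5cdr-disease",
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

print(ner("The patient was diagnosed with rheumatoid arthritis and hypertension."))
```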
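
## Fine-Tuning Sketch

The training procedure can be approximated with the `Trainer` API. This is a minimal sketch under stated assumptions, not the exact training script: `train_dataset` and `eval_dataset` are placeholders for BC5CDR-disease splits that have already been tokenized with their IOB labels aligned to word pieces, and the label set shown is the usual single-entity scheme for this dataset.

```python
from transformers import (
    AutoTokenizer,
    AutoModelForTokenClassification,
    DataCollatorForTokenClassification,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "distilbert/distilbert-base-uncased"
# BC5CDR-disease tags a single entity type (Disease) in IOB format.
label_list = ["O", "B-Disease", "I-Disease"]

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForTokenClassification.from_pretrained(
    MODEL_NAME,
    num_labels=len(label_list),
    id2label=dict(enumerate(label_list)),
    label2id={label: i for i, label in enumerate(label_list)},
)

# Hyperparameters from the "Training Hyperparameters" section above.
training_args = TrainingArguments(
    output_dir="distilbert-bc5cdr-disease",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=3,
    weight_decay=0.01,
)

# `train_dataset` / `eval_dataset` are placeholders for the prepared splits.
trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    data_collator=DataCollatorForTokenClassification(tokenizer),
    tokenizer=tokenizer,
)
trainer.train()
```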
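
## Evaluation Sketch (seqeval)

The reported results are entity-level scores of the kind produced by `seqeval`. The toy example below shows how such precision, recall, and micro-F1 values can be computed; `y_true` and `y_pred` stand in for the gold and predicted tag sequences of the BC5CDR-disease test set.

```python
from seqeval.metrics import f1_score, precision_score, recall_score

# Toy stand-ins for the gold and predicted IOB tag sequences of the test set.
y_true = [["O", "B-Disease", "I-Disease", "O", "O"]]
y_pred = [["O", "B-Disease", "I-Disease", "O", "B-Disease"]]

print("Precision:", precision_score(y_true, y_pred))  # seqeval defaults to micro average
print("Recall:   ", recall_score(y_true, y_pred))
print("Micro-F1: ", f1_score(y_true, y_pred))
```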