# Model Card for kbulutozler/distilbert-base-uncased-FT-ner-BC2GM

A DistilBERT model fine-tuned for named entity recognition (gene mention detection) on the training split of the BC2GM dataset, as distributed in the BLURB benchmark.
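A minimal usage sketch with the Hugging Face Transformers `pipeline` API. The example sentence and aggregation strategy are illustrative, and the label names in the output depend on the `id2label` mapping saved with the checkpoint:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hub.
ner = pipeline(
    "token-classification",
    model="kbulutozler/distilbert-base-uncased-FT-ner-BC2GM",
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)

# Illustrative sentence containing a gene mention.
print(ner("Mutations in the BRCA1 gene increase the risk of breast cancer."))
```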
## Model Details

- Base model: [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased)
- Task: token classification (named entity recognition)
## Training Details

### Training Data

The training split of the BC2GM dataset, as distributed in the BLURB benchmark.

### Training Procedure

Standard full fine-tuning of the pretrained model with a token-classification head.
#### Training Hyperparameters
- learning_rate: 5e-5
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 16
- num_train_epochs: 3
- weight_decay: 0.01
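These hyperparameters map onto a standard Hugging Face `Trainer` setup roughly as sketched below. This is not the exact training script: the label list assumes the usual BIO scheme for BC2GM gene mentions, and `tokenized_train` / `tokenized_test` are placeholders for datasets you would build from the BC2GM splits.

```python
from transformers import (
    AutoTokenizer,
    AutoModelForTokenClassification,
    DataCollatorForTokenClassification,
    TrainingArguments,
    Trainer,
)

model_name = "distilbert/distilbert-base-uncased"
label_list = ["O", "B-GENE", "I-GENE"]  # assumed BIO labels for gene mentions

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name, num_labels=len(label_list)
)

args = TrainingArguments(
    output_dir="distilbert-base-uncased-FT-ner-BC2GM",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=3,
    weight_decay=0.01,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized_train,  # placeholder: tokenized BC2GM train split
    eval_dataset=tokenized_test,    # placeholder: tokenized BC2GM test split
    data_collator=DataCollatorForTokenClassification(tokenizer),
    tokenizer=tokenizer,
)
trainer.train()
```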
## Evaluation

### Testing Data

The test split of the BC2GM dataset.
### Results

- Precision: 0.76
- Recall: 0.79
- Micro-F1: 0.77
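These are entity-level scores of the kind computed by `seqeval` over BIO tag sequences (whether that exact library was used here is an assumption; the tag sequences below are illustrative, not taken from BC2GM):

```python
from seqeval.metrics import precision_score, recall_score, f1_score

# Gold and predicted BIO tags, one inner list per sentence (illustrative).
y_true = [["B-GENE", "I-GENE", "O", "O", "B-GENE"]]
y_pred = [["B-GENE", "I-GENE", "O", "B-GENE", "B-GENE"]]

print("Precision:", precision_score(y_true, y_pred))
print("Recall:   ", recall_score(y_true, y_pred))
print("F1:       ", f1_score(y_true, y_pred))
```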
## Environmental Impact

- Hardware Type: 1x NVIDIA RTX A4000
- Hours used: ~0.17 (10 minutes)