DistilRoBERTa-base fine-tuned condition classifier
Model Details
Model Description
This model is a version of DistilRoBERTa-base fine-tuned for condition classification. The model is case-sensitive: it makes a difference between english and English.
- Fine-tuned by: Ban Ursus
- Model type: Transformer-based language model
- Language(s) (NLP): English
- License: Apache 2.0
- Related Models: DistilRoBERTa-base model
- Resources for more information:
Training Details
This model was fine-tuned for 5 epochs on the Drug Review Dataset. Accuracy may improve further with additional training epochs.
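A minimal fine-tuning sketch is shown below. It assumes the Drug Review data is available locally as a TSV file with `review` and `condition` columns; the file name, batch size, and label handling are illustrative and not the exact training setup used for this model.

```python
# Illustrative fine-tuning sketch, not the exact training script.
# Assumes a local TSV with "review" and "condition" columns.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

dataset = load_dataset("csv", data_files={"train": "drugsComTrain_raw.tsv"},
                       delimiter="\t")["train"]

# One class per condition; skip rows with a missing condition.
labels = sorted({c for c in dataset["condition"] if c})
label2id = {name: i for i, name in enumerate(labels)}
dataset = dataset.filter(lambda row: row["condition"] in label2id)

tokenizer = AutoTokenizer.from_pretrained("distilroberta-base")

def preprocess(batch):
    enc = tokenizer(batch["review"], truncation=True, max_length=512)
    enc["labels"] = [label2id[c] for c in batch["condition"]]
    return enc

dataset = dataset.map(preprocess, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    "distilroberta-base", num_labels=len(labels))

args = TrainingArguments(output_dir="condition-classifier",
                         num_train_epochs=5,  # matches the card
                         per_device_train_batch_size=16)

Trainer(model=model, args=args, train_dataset=dataset,
        tokenizer=tokenizer).train()
```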
Evaluation
Validation results:
| Accuracy | F1 score |
|---|---|
| 0.63 | 0.58 |
Note: Rounded to 2 decimal places
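The card does not state how these metrics were averaged; the sketch below assumes standard scikit-learn metrics with a weighted F1 average and can be passed to a `Trainer` as `compute_metrics`.

```python
# Illustrative metric computation; the averaging scheme is an assumption.
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = logits.argmax(axis=-1)
    return {"accuracy": accuracy_score(labels, preds),
            "f1": f1_score(labels, preds, average="weighted")}
```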
How to Get Started With the Model
Follow Section 2, "Try it out!", of the GitHub repository.
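Alternatively, the model can be loaded directly with the `transformers` pipeline, as in the minimal sketch below; the Hub model ID is a placeholder and should be replaced with this model's actual repository name.

```python
# Minimal inference sketch; replace the placeholder model ID with the
# actual Hub repository name for this model.
from transformers import pipeline

classifier = pipeline("text-classification",
                      model="<user>/distilroberta-base-condition-classifier")

print(classifier("This medication cleared up my acne within two weeks."))
# -> [{'label': '<predicted condition>', 'score': ...}]
```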