---
license: mit
datasets:
  - sst2
language:
  - en
metrics:
  - accuracy
pipeline_tag: text-classification
widget:
  - text: it 's just incredibly dull .
    example_title: Example 1
---

# distilbert-base-uncased-finetuned-sst2

This model is a fine-tuned version of distilbert-base-uncased on the sst2 dataset.

This model was fine-tuned as an exercise for the Hugging Face course.

It achieves the following results on the evaluation set:

- Accuracy: 0.90
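## Usage

A minimal sketch of running inference with the `transformers` pipeline API. The model id `nickprock/distilbert-base-uncased-finetuned-sst2` is assumed from this card's title and author; adjust it if the repository is hosted under a different name.

```python
from transformers import pipeline

# Load the fine-tuned sentiment classifier
# (model id assumed from this card; replace if needed)
classifier = pipeline(
    "text-classification",
    model="nickprock/distilbert-base-uncased-finetuned-sst2",
)

# Classify the widget example from this card
result = classifier("it 's just incredibly dull .")
print(result)
```

The pipeline returns a list of dictionaries, each with a `label` and a `score` between 0 and 1.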