We find your model to be a great base-model #3
by eladven - opened

README.md CHANGED
@@ -51,17 +51,17 @@ output = model(encoded_input)
 ```
 
 ## Evaluation results
-
-
+## [Model Recycling](https://ibm.github.io/model-recycling/)
+
+Evaluation on 36 datasets using ibm/ColD-Fusion-itr13-seed2 as a base model, yields average score of 78.72 in comparison to 76.22 by roberta-base.
+
+Overall ranking: top 1 model among roberta-base models (updated to 12/12/2022)
 
 Results:
 
 | 20_newsgroup | ag_news | amazon_reviews_multi | anli | boolq | cb | cola | copa | dbpedia | esnli | financial_phrasebank | imdb | isear | mnli | mrpc | multirc | poem_sentiment | qnli | qqp | rotten_tomatoes | rte | sst2 | sst_5bins | stsb | trec_coarse | trec_fine | tweet_ev_emoji | tweet_ev_emotion | tweet_ev_hate | tweet_ev_irony | tweet_ev_offensive | tweet_ev_sentiment | wic | wnli | wsc | yahoo_answers |
 |---------------:|----------:|-----------------------:|--------:|--------:|--------:|-------:|-------:|----------:|--------:|-----------------------:|-------:|--------:|--------:|--------:|----------:|-----------------:|-------:|--------:|------------------:|--------:|--------:|------------:|--------:|--------------:|------------:|-----------------:|-------------------:|----------------:|-----------------:|---------------------:|---------------------:|--------:|--------:|--------:|----------------:|
 | 86.3648 | 89.3 | 66.72 | 53.0937 | 82.0183 | 89.2857 | 83.605 | 73 | 77.4667 | 91.0423 | 87.3 | 93.868 | 73.1421 | 87.3881 | 87.7451 | 63.6757 | 88.4615 | 92.678 | 91.0809 | 91.4634 | 83.3935 | 95.2982 | 58.1448 | 91.6334 | 97 | 91 | 44.95 | 83.0401 | 52.5589 | 77.0408 | 86.0465 | 69.7818 | 70.0627 | 49.2958 | 63.4615 | 72.5667 |
-
-
-
 ### BibTeX entry and citation info
 
 ```bibtex
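
For context, the `output = model(encoded_input)` in the hunk header comes from the model card's usage snippet. Below is a minimal sketch, not part of the PR, of loading ibm/ColD-Fusion-itr13-seed2 as a base model; it assumes the standard Hugging Face `transformers` AutoTokenizer/AutoModel API, and the sample text is illustrative.

```python
# Minimal sketch (not from the PR diff): load ibm/ColD-Fusion-itr13-seed2
# as a base model using the standard Hugging Face transformers API.
from transformers import AutoModel, AutoTokenizer

model_name = "ibm/ColD-Fusion-itr13-seed2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode some text and run a forward pass, mirroring the card's usage section.
encoded_input = tokenizer("Replace me with any text you'd like.", return_tensors="pt")
output = model(**encoded_input)  # unpack the BatchEncoding into keyword arguments
print(output.last_hidden_state.shape)
```

In the model-recycling setup described in the added README text, such a checkpoint would then be fine-tuned on each downstream dataset rather than used as a frozen encoder.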