
2020-Q4-50p-filtered

This model is a fine-tuned version of cardiffnlp/twitter-roberta-base-2019-90m on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.6101
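
Assuming the reported value is the standard masked-language-modelling cross-entropy loss (in nats), it corresponds to a perplexity of roughly exp(2.6101) ≈ 13.6 on the evaluation set.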

Model description

More information needed

Intended uses & limitations

More information needed
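
The card does not yet document usage. Since the base checkpoint is a RoBERTa masked language model, a minimal fill-mask sketch along the following lines should apply; note that the repository ID DouglasPontes/2020-Q4-50p-filtered and the use of RoBERTa's <mask> token are assumptions inherited from the base model and the hosting page, not statements from this card.

```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned checkpoint as a fill-mask model.
# Assumes the weights are published under this repository ID and that the
# tokenizer uses RoBERTa's <mask> token, as in the base model.
fill_mask = pipeline(
    "fill-mask",
    model="DouglasPontes/2020-Q4-50p-filtered",
)

# Example query in the Twitter domain the base model was trained on.
for prediction in fill_mask("I can't believe the <mask> this year!"):
    print(prediction["token_str"], round(prediction["score"], 4))
```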

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a schematic configuration sketch illustrating them follows the list):

  • learning_rate: 4.1e-07
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
  • lr_scheduler_type: linear
  • training_steps: 2400000
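
This card only records the values above; a rough, non-authoritative Trainer setup matching them might look like the sketch below. The output directory, the 8000-step evaluation interval (inferred from the results table), and the dataset wiring are assumptions, not part of the card.

```python
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    TrainingArguments,
)

# Base checkpoint named in this card.
model_name = "cardiffnlp/twitter-roberta-base-2019-90m"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)
collator = DataCollatorForLanguageModeling(tokenizer)

# Only the hyperparameters listed above come from this card; output_dir and
# eval_steps (implied by the 8000-step interval in the results table) are guesses.
args = TrainingArguments(
    output_dir="2020-Q4-50p-filtered",
    learning_rate=4.1e-7,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=2_400_000,
    evaluation_strategy="steps",
    eval_steps=8_000,
)

# The training and evaluation datasets are not documented in this card, so the
# Trainer wiring is only sketched here:
# from transformers import Trainer
# trainer = Trainer(model=model, args=args, data_collator=collator,
#                   train_dataset=..., eval_dataset=...)
# trainer.train()
```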

Training results

Training Loss Epoch Step Validation Loss
No log 0.03 8000 2.9660
3.1627 0.07 16000 2.8754
3.1627 0.1 24000 2.8263
2.9611 0.13 32000 2.7973
2.9611 0.17 40000 2.7741
2.8986 0.2 48000 2.7574
2.8986 0.24 56000 2.7413
2.8726 0.27 64000 2.7240
2.8726 0.3 72000 2.7239
2.8558 0.34 80000 2.7132
2.8558 0.37 88000 2.7030
2.8459 0.4 96000 2.7112
2.8459 0.44 104000 2.6918
2.8379 0.47 112000 2.7017
2.8379 0.51 120000 2.6920
2.8265 0.54 128000 2.6971
2.8265 0.57 136000 2.6924
2.8227 0.61 144000 2.6952
2.8227 0.64 152000 2.6811
2.8209 0.67 160000 2.6829
2.8209 0.71 168000 2.6883
2.8147 0.74 176000 2.6675
2.8147 0.77 184000 2.6674
2.8077 0.81 192000 2.6661
2.8077 0.84 200000 2.6773
2.8058 0.88 208000 2.6734
2.8058 0.91 216000 2.6742
2.812 0.94 224000 2.6666
2.812 0.98 232000 2.6642
2.8025 1.01 240000 2.6681
2.8025 1.04 248000 2.6663
2.809 1.08 256000 2.6645
2.809 1.11 264000 2.6529
2.8073 1.15 272000 2.6623
2.8073 1.18 280000 2.6551
2.8005 1.21 288000 2.6643
2.8005 1.25 296000 2.6628
2.7988 1.28 304000 2.6583
2.7988 1.31 312000 2.6594
2.7887 1.35 320000 2.6544
2.7887 1.38 328000 2.6516
2.7964 1.41 336000 2.6555
2.7964 1.45 344000 2.6551
2.7919 1.48 352000 2.6508
2.7919 1.52 360000 2.6486
2.8058 1.55 368000 2.6484
2.8058 1.58 376000 2.6532
2.796 1.62 384000 2.6473
2.796 1.65 392000 2.6489
2.799 1.68 400000 2.6476
2.799 1.72 408000 2.6417
2.7991 1.75 416000 2.6545
2.7991 1.79 424000 2.6466
2.792 1.82 432000 2.6397
2.792 1.85 440000 2.6428
2.7972 1.89 448000 2.6446
2.7972 1.92 456000 2.6434
2.798 1.95 464000 2.6490
2.798 1.99 472000 2.6502
2.7914 2.02 480000 2.6407
2.7914 2.05 488000 2.6284
2.7932 2.09 496000 2.6426
2.7932 2.12 504000 2.6423
2.787 2.16 512000 2.6385
2.787 2.19 520000 2.6388
2.7893 2.22 528000 2.6422
2.7893 2.26 536000 2.6410
2.7889 2.29 544000 2.6337
2.7889 2.32 552000 2.6280
2.791 2.36 560000 2.6364
2.791 2.39 568000 2.6341
2.7883 2.43 576000 2.6317
2.7883 2.46 584000 2.6278
2.7889 2.49 592000 2.6357
2.7889 2.53 600000 2.6341
2.7838 2.56 608000 2.6333
2.7838 2.59 616000 2.6382
2.7873 2.63 624000 2.6275
2.7873 2.66 632000 2.6260
2.7813 2.69 640000 2.6373
2.7813 2.73 648000 2.6349
2.7858 2.76 656000 2.6223
2.7858 2.8 664000 2.6276
2.7895 2.83 672000 2.6355
2.7895 2.86 680000 2.6270
2.7873 2.9 688000 2.6244
2.7873 2.93 696000 2.6397
2.7866 2.96 704000 2.6303
2.7866 3.0 712000 2.6167
2.7865 3.03 720000 2.6265
2.7865 3.07 728000 2.6403
2.7716 3.1 736000 2.6247
2.7716 3.13 744000 2.6255
2.779 3.17 752000 2.6316
2.779 3.2 760000 2.6270
2.7811 3.23 768000 2.6268
2.7811 3.27 776000 2.6147
2.7797 3.3 784000 2.6271
2.7797 3.33 792000 2.6243
2.7798 3.37 800000 2.6240
2.7798 3.4 808000 2.6225
2.7774 3.44 816000 2.6232
2.7774 3.47 824000 2.6247
2.7744 3.5 832000 2.6270
2.7744 3.54 840000 2.6175
2.7786 3.57 848000 2.6264
2.7786 3.6 856000 2.6192
2.7829 3.64 864000 2.6278
2.7829 3.67 872000 2.6237
2.776 3.71 880000 2.6202
2.776 3.74 888000 2.6216
2.7797 3.77 896000 2.6174
2.7797 3.81 904000 2.6239
2.7744 3.84 912000 2.6163
2.7744 3.87 920000 2.6198
2.7713 3.91 928000 2.6236
2.7713 3.94 936000 2.6226
2.7853 3.97 944000 2.6175
2.7853 4.01 952000 2.6189
2.7766 4.04 960000 2.6192
2.7766 4.08 968000 2.6318
2.7851 4.11 976000 2.6210
2.7851 4.14 984000 2.6172
2.7804 4.18 992000 2.6200
2.7804 4.21 1000000 2.6157
2.773 4.24 1008000 2.6098
2.773 4.28 1016000 2.6156
2.7818 4.31 1024000 2.6149
2.7818 4.35 1032000 2.6121
2.7736 4.38 1040000 2.6150
2.7736 4.41 1048000 2.6156
2.7761 4.45 1056000 2.6171
2.7761 4.48 1064000 2.6124
2.7789 4.51 1072000 2.6277
2.7789 4.55 1080000 2.6138
2.7744 4.58 1088000 2.6081
2.7744 4.61 1096000 2.6201
2.77 4.65 1104000 2.6171
2.77 4.68 1112000 2.6099
2.772 4.72 1120000 2.6141
2.772 4.75 1128000 2.6174
2.7709 4.78 1136000 2.6200
2.7709 4.82 1144000 2.6150
2.7724 4.85 1152000 2.6042
2.7724 4.88 1160000 2.6158
2.7763 4.92 1168000 2.6167
2.7763 4.95 1176000 2.6174
2.7736 4.99 1184000 2.6099
2.7736 5.02 1192000 2.6076
2.7692 5.05 1200000 2.6088
2.7692 5.09 1208000 2.6174
2.7794 5.12 1216000 2.6041
2.7794 5.15 1224000 2.6051
2.7709 5.19 1232000 2.6093
2.7709 5.22 1240000 2.6062
2.7727 5.25 1248000 2.6052
2.7727 5.29 1256000 2.6126
2.7686 5.32 1264000 2.6099
2.7686 5.36 1272000 2.6192
2.7668 5.39 1280000 2.6166
2.7668 5.42 1288000 2.6042
2.7777 5.46 1296000 2.6038
2.7777 5.49 1304000 2.6119
2.7737 5.52 1312000 2.6155
2.7737 5.56 1320000 2.6236
2.7757 5.59 1328000 2.6124
2.7757 5.63 1336000 2.5993
2.7757 5.66 1344000 2.6132
2.7757 5.69 1352000 2.6063
2.7748 5.73 1360000 2.6130
2.7748 5.76 1368000 2.6100
2.769 5.79 1376000 2.6024
2.769 5.83 1384000 2.6062
2.7713 5.86 1392000 2.6138
2.7713 5.89 1400000 2.6025
2.7766 5.93 1408000 2.6088
2.7766 5.96 1416000 2.6138
2.7727 6.0 1424000 2.6048
2.7727 6.03 1432000 2.6068
2.7737 6.06 1440000 2.6144
2.7737 6.1 1448000 2.6051
2.778 6.13 1456000 2.6158
2.778 6.16 1464000 2.6152
2.7767 6.2 1472000 2.6019
2.7767 6.23 1480000 2.6117
2.7706 6.27 1488000 2.6065
2.7706 6.3 1496000 2.6122
2.7775 6.33 1504000 2.6100
2.7775 6.37 1512000 2.6100
2.7753 6.4 1520000 2.6051
2.7753 6.43 1528000 2.6037
2.7691 6.47 1536000 2.6037
2.7691 6.5 1544000 2.5992
2.758 6.53 1552000 2.6080
2.758 6.57 1560000 2.6139
2.7722 6.6 1568000 2.6000
2.7722 6.64 1576000 2.6107
2.7737 6.67 1584000 2.6057
2.7737 6.7 1592000 2.6063
2.7722 6.74 1600000 2.6028
2.7722 6.77 1608000 2.5995
2.7659 6.8 1616000 2.6042
2.7659 6.84 1624000 2.6013
2.7769 6.87 1632000 2.6028
2.7769 6.91 1640000 2.6080
2.7732 6.94 1648000 2.5994
2.7732 6.97 1656000 2.6063
2.7708 7.01 1664000 2.6120
2.7708 7.04 1672000 2.6023
2.7614 7.07 1680000 2.6091
2.7614 7.11 1688000 2.6003
2.7655 7.14 1696000 2.6016
2.7655 7.17 1704000 2.6058
2.7747 7.21 1712000 2.6045
2.7747 7.24 1720000 2.6097
2.7685 7.28 1728000 2.6068
2.7685 7.31 1736000 2.6037
2.7736 7.34 1744000 2.6125
2.7736 7.38 1752000 2.6113
2.7666 7.41 1760000 2.5972
2.7666 7.44 1768000 2.6081
2.7658 7.48 1776000 2.6090
2.7658 7.51 1784000 2.6126
2.7802 7.55 1792000 2.6021
2.7802 7.58 1800000 2.6087
2.7749 7.61 1808000 2.5986
2.7749 7.65 1816000 2.6002
2.7689 7.68 1824000 2.6023
2.7689 7.71 1832000 2.5969
2.7699 7.75 1840000 2.5975
2.7699 7.78 1848000 2.6070
2.7715 7.81 1856000 2.6035
2.7715 7.85 1864000 2.6049
2.7653 7.88 1872000 2.6129
2.7653 7.92 1880000 2.6027
2.7729 7.95 1888000 2.6000
2.7729 7.98 1896000 2.6138
2.7693 8.02 1904000 2.6052
2.7693 8.05 1912000 2.6060
2.7585 8.08 1920000 2.6065
2.7585 8.12 1928000 2.6105
2.7652 8.15 1936000 2.6075
2.7652 8.19 1944000 2.6076
2.7508 8.22 1952000 2.6083
2.7508 8.25 1960000 2.6112
2.7678 8.29 1968000 2.6019
2.7678 8.32 1976000 2.6029
2.7653 8.35 1984000 2.6087
2.7653 8.39 1992000 2.6064
2.7661 8.42 2000000 2.6031
2.7661 8.45 2008000 2.6051
2.7742 8.49 2016000 2.6091
2.7742 8.52 2024000 2.5978
2.7748 8.56 2032000 2.6131
2.7748 8.59 2040000 2.6030
2.7706 8.62 2048000 2.6036
2.7706 8.66 2056000 2.5998
2.769 8.69 2064000 2.6013
2.769 8.72 2072000 2.6000
2.7733 8.76 2080000 2.6062
2.7733 8.79 2088000 2.6057
2.7714 8.83 2096000 2.6021
2.7714 8.86 2104000 2.6028
2.7754 8.89 2112000 2.5964
2.7754 8.93 2120000 2.6015
2.7683 8.96 2128000 2.6060
2.7683 8.99 2136000 2.6082
2.7758 9.03 2144000 2.6130
2.7758 9.06 2152000 2.6071
2.768 9.09 2160000 2.6141
2.768 9.13 2168000 2.6003
2.7653 9.16 2176000 2.5987
2.7653 9.2 2184000 2.6066
2.7621 9.23 2192000 2.6041
2.7621 9.26 2200000 2.6060
2.7712 9.3 2208000 2.6144
2.7712 9.33 2216000 2.5990
2.7718 9.36 2224000 2.6039
2.7718 9.4 2232000 2.5931
2.774 9.43 2240000 2.6129
2.774 9.47 2248000 2.6095
2.765 9.5 2256000 2.5932
2.765 9.53 2264000 2.6010
2.7754 9.57 2272000 2.6078
2.7754 9.6 2280000 2.5981
2.771 9.63 2288000 2.6052
2.771 9.67 2296000 2.5944
2.7757 9.7 2304000 2.6045
2.7757 9.73 2312000 2.5971
2.7685 9.77 2320000 2.6101
2.7685 9.8 2328000 2.5964
2.7708 9.84 2336000 2.5974
2.7708 9.87 2344000 2.5953
2.7695 9.9 2352000 2.5981
2.7695 9.94 2360000 2.6095
2.7702 9.97 2368000 2.6042
2.7702 10.0 2376000 2.6095
2.7614 10.04 2384000 2.6007
2.7614 10.07 2392000 2.6017
2.7708 10.11 2400000 2.6114

Framework versions

  • Transformers 4.35.0.dev0
  • Pytorch 2.0.1+cu117
  • Datasets 2.14.5
  • Tokenizers 0.14.0