Changing max_new_tokens to None due to continuing warning on backend: Running generate_until requests: 0%| | 13/99442 [10:42<1539:30:14, 55.74s/it] Both max_new_tokens (=2048) and max_length (=302) seem to have been set. max_new_tokens will take precedence. Please refer to the documentation for more information. (https://huggingface.co/docs/transformers/main/en/main_classes/text_generation)
7526aba
meg-huggingface committed on
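The warning in the commit message above comes from transformers' generation-length handling: when both max_new_tokens and max_length are set, max_new_tokens takes precedence, so setting it to None leaves max_length in charge and silences the warning. A minimal stdlib sketch of that precedence (a hypothetical helper, not the library's actual code):

```python
def effective_stop_length(prompt_len, max_length=None, max_new_tokens=None):
    """Sketch of the precedence described by the transformers warning:
    if max_new_tokens is set it wins, otherwise max_length applies.
    Hypothetical helper for illustration only."""
    if max_new_tokens is not None:
        # max_new_tokens counts tokens beyond the prompt.
        return prompt_len + max_new_tokens
    # max_length counts the total sequence, prompt included.
    return max_length

# Both set: max_new_tokens (=2048) overrides max_length (=302).
print(effective_stop_length(302, max_length=302, max_new_tokens=2048))  # 2350
# With max_new_tokens=None, only max_length governs and the warning goes away.
print(effective_stop_length(302, max_length=302, max_new_tokens=None))  # 302
```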
changing batch size to auto
8cd9975
meg-huggingface committed on
evaluation with hf not hf-auto
e6dead6
meg-huggingface committed on
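Assuming these commits configure an lm-evaluation-harness run, the two changes above (batch size to auto, hf backend rather than hf-auto) map roughly onto CLI flags like the following. The model id is a placeholder, not one named in these commits:

```shell
# Sketch: "hf" model backend with automatic batch-size probing.
# some-org/some-model is a placeholder model id.
lm_eval --model hf \
    --model_args pretrained=some-org/some-model \
    --batch_size auto
```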
Run full eval
499d1c4
meg-huggingface committed on
DEBUG
1702f66
meg-huggingface committed on
DEBUG
f1e6565
meg-huggingface committed on
just realtoxicityprompts
fe8891f
meg-huggingface committed on
Limit to 20 now
1bed6b0
meg-huggingface committed on
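Taken together, the two commits above (restricting to realtoxicityprompts, then capping at 20 examples) suggest a debugging invocation along these lines, again assuming lm-evaluation-harness:

```shell
# Sketch: run only the realtoxicityprompts task, capped at 20 examples
# so a debug pass finishes quickly.
lm_eval --model hf \
    --tasks realtoxicityprompts \
    --limit 20
```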
debug
623b36b
meg-huggingface committed on
debug
5e999cb
meg-huggingface committed on
debug
6ea3edf
meg-huggingface committed on
debug
3dbfbdf
meg-huggingface committed on
Debug
936b02e
meg-huggingface committed on
debug
3313f0d
meg-huggingface committed on
Debug
f09aba3
meg-huggingface committed on
Changing refresh rate to 1 hr
590e272
meg-huggingface committed on
Updates, not sure what -- left over from last night
30b5f7e
meg-huggingface committed on
Refreshing less
611c544
meg-huggingface committed on
Changing model from hf-causal-experimental to hf-auto
3441586
meg-huggingface committed on
Trying to make it work with new EAI versions
3596f80
meg-huggingface committed on
Should be realtoxicityprompts as one of the target tasks