Commit History
Including RUNNING requests as files that can be marked FAILED.
bb6f5b0
meg-huggingface
committed on
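A minimal sketch of the status flip this commit points at, assuming the leaderboard-style backend layout where each eval request lives in a JSON file with a "status" field (PENDING / RUNNING / FINISHED / FAILED). The helper name, field name, and values are assumptions, not taken from the repo.

```python
import json
from pathlib import Path

def mark_request_failed(request_file: str) -> None:
    """Hypothetical helper: flip a request file's status to FAILED so a stuck
    RUNNING request is not silently lost. Field name and values are assumed."""
    path = Path(request_file)
    request = json.loads(path.read_text())
    request["status"] = "FAILED"
    path.write_text(json.dumps(request, indent=2))
```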
Inferring compute needs and code cleanup
ffe4d51
meg-huggingface
committed on
Inference endpoints and parallelism.
7dd405e
meg-huggingface
committed on
Merge branch 'main' of hf.co:spaces/meg/backend into main
5169c06
meg-huggingface
committed on
Deleting logging that we're not using
4f55f5f
meg-huggingface
committed on
Update src/backend/run_toxicity_eval.py
b6b2391
verified
parallel processing handling
798ff9d
meg-huggingface
committed on
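One plausible shape for the parallel request handling mentioned above, assuming a TGI-backed Inference Endpoint reachable at an ENDPOINT_URL environment variable; the URL, token variable, and generation parameters are assumptions, not the repo's actual code.

```python
import os
from concurrent.futures import ThreadPoolExecutor

import requests

ENDPOINT_URL = os.environ["ENDPOINT_URL"]  # assumed env var pointing at the endpoint
HEADERS = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}

def generate(prompt: str) -> str:
    # Standard text-generation payload for a TGI-backed endpoint.
    resp = requests.post(
        ENDPOINT_URL,
        headers=HEADERS,
        json={"inputs": prompt, "parameters": {"max_new_tokens": 64}},
    )
    resp.raise_for_status()
    return resp.json()[0]["generated_text"]

def generate_parallel(prompts, max_workers=8):
    # Fan prompts out across worker threads; results come back in input order.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(generate, prompts))
```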
replicas
668284b
meg-huggingface
committed on
Background scheduling of the evaluation.
20fd212
meg-huggingface
committed on
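A sketch of background scheduling using APScheduler; whether the Space actually uses APScheduler (rather than, say, a plain thread with a sleep loop) is an assumption, and the 1-hour interval mirrors the "Changing refresh rate to 1 hr" commit further down.

```python
from apscheduler.schedulers.background import BackgroundScheduler

def run_pending_evaluations():
    # Placeholder for the backend's sweep: pick up pending requests and evaluate them.
    ...

scheduler = BackgroundScheduler()
scheduler.add_job(run_pending_evaluations, "interval", hours=1)  # 1 hr refresh (see commit below)
scheduler.start()
```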
Handling more exceptions
5c33832
meg-huggingface
committed on
Inference endpoint figuring
3d16b0d
meg-huggingface
committed on
Endpoint naming change
7d70d90
meg-huggingface
committed on
More handling of inference endpoints: Delete when done.
506d239
meg-huggingface
committed on
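A sketch of the create/wait/delete lifecycle with huggingface_hub's Inference Endpoints API; the endpoint name, repository, instance type/size, vendor, and region below are placeholder assumptions.

```python
from huggingface_hub import create_inference_endpoint

endpoint = create_inference_endpoint(
    "toxicity-eval-tmp",                  # placeholder name (note the name length limit commit below)
    repository="EleutherAI/pythia-70m",   # placeholder model
    framework="pytorch",
    task="text-generation",
    accelerator="gpu",
    vendor="aws",
    region="us-east-1",
    instance_size="x1",
    instance_type="nvidia-a10g",
)
try:
    endpoint.wait()                       # block until the endpoint is running
    client = endpoint.client              # InferenceClient bound to this endpoint
    print(client.text_generation("Hello", max_new_tokens=20))
finally:
    endpoint.delete()                     # tear down when done so the GPU stops billing
```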
Trying to handle endpoint errors
a9f6487
meg-huggingface
committed on
Trying to handle endpoint errors
e79b5e9
meg-huggingface
committed on
Adding more endpoint options
58956f6
meg-huggingface
committed on
Updating with new approach to inference endpoint
66621a9
meg-huggingface
committed on
Endpoint name character limit
74d59fa
meg-huggingface
committed on
Handling of json error, running generate all at once.
d4f49be
meg-huggingface
committed on
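For the JSON error mentioned in that commit, a parse-with-retry wrapper is one plausible shape; the retry count, backoff, and function name are illustrative assumptions.

```python
import json
import time

import requests

def post_json_with_retry(url, payload, headers, retries=3, backoff=5.0):
    """Hypothetical helper: POST and parse JSON, retrying when the endpoint
    returns a non-JSON body (e.g. an HTML error page while scaling up)."""
    for attempt in range(retries):
        resp = requests.post(url, headers=headers, json=payload)
        try:
            return resp.json()
        except (json.JSONDecodeError, ValueError):
            if attempt == retries - 1:
                raise
            time.sleep(backoff * (attempt + 1))
```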
Please run after fully loading
99df58a
meg-huggingface
committed on
Full dataset
86102e5
meg-huggingface
committed on
Clearing cached results
f20cab2
meg-huggingface
committed on
Backend toxicity
64c3915
meg-huggingface
committed on
Moving to just toxicity
5ea4d55
meg-huggingface
committed on
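A stand-in for what scoring "just toxicity" can look like, using the evaluate library's toxicity measurement; the actual run_toxicity_eval.py may use a different scorer (for example a hosted API), so treat this as an assumption rather than the Space's implementation.

```python
import evaluate

toxicity = evaluate.load("toxicity", module_type="measurement")
continuations = [
    "I'm happy to help with that.",
    "You are a terrible person.",
]
scores = toxicity.compute(predictions=continuations)["toxicity"]
print({"mean_toxicity": sum(scores) / len(scores)})
```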
Update src/envs.py
90907b9
verified
Changing to HF_TOKEN
fb6d1ba
meg-huggingface
committed on
Changes to whatever
0f1cbe4
meg-huggingface
committed on
Changing to BACKEND_TOKEN
5cb2831
meg-huggingface
committed on
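The HF_TOKEN and BACKEND_TOKEN commits above suggest src/envs.py reads credentials from the Space's secrets; this is a sketch of that pattern, and the exact split of duties between the two variables is an assumption.

```python
import os

# Read auth tokens from environment secrets rather than hard-coding them.
TOKEN = os.environ.get("HF_TOKEN", "")               # Hub access (per the HF_TOKEN commit)
BACKEND_TOKEN = os.environ.get("BACKEND_TOKEN", "")  # token used by the eval backend (assumed name)
```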
Toxigen
ae82a09
meg-huggingface
committed on
string parsing error
c949238
meg-huggingface
committed on
Fixing timing bug
7f85059
meg-huggingface
committed on
Changing max_new_tokens to None due to a recurring warning on the backend: "Running generate_until requests: 0%| | 13/99442 [10:42<1539:30:14, 55.74s/it]" followed by "Both max_new_tokens (=2048) and max_length (=302) seem to have been set. max_new_tokens will take precedence. Please refer to the documentation for more information. (https://huggingface.co/docs/transformers/main/en/main_classes/text_generation)"
7526aba
meg-huggingface
committed on
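A minimal reproduction of the warning quoted in that commit, plus the fix of passing only one length limit. The tiny model is arbitrary and the numbers are smaller than the real 2048/302 values just so the snippet runs quickly; this is a sketch, not the backend's code.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("sshleifer/tiny-gpt2")
model = AutoModelForCausalLM.from_pretrained("sshleifer/tiny-gpt2")
inputs = tok("The weather today is", return_tensors="pt")

# Both limits set -> transformers warns that max_new_tokens takes precedence.
model.generate(**inputs, max_new_tokens=50, max_length=20)

# Fix mirrored by the commit: drop max_new_tokens (None) so only max_length applies.
model.generate(**inputs, max_new_tokens=None, max_length=20)
```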
changing batch size to auto
8cd9975
meg-huggingface
committed on
evaluation with hf not hf-auto
e6dead6
meg-huggingface
committed on
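The two commits above ("changing batch size to auto", "evaluation with hf not hf-auto") map onto lm-evaluation-harness settings like the ones below; the model, task list, and use of simple_evaluate here are assumptions about how the backend invokes the harness.

```python
import lm_eval

# "hf" model type plus batch_size="auto" lets the harness probe for the
# largest batch that fits on the device. Model and task are placeholders;
# note the realtoxicityprompts task may need a Perspective API key for scoring.
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=EleutherAI/pythia-70m",
    tasks=["realtoxicityprompts"],
    batch_size="auto",
)
print(results["results"])
```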
Run full eval
499d1c4
meg-huggingface
committed on
Update requirements.txt
bea7c2b
verified
Update requirements.txt
37b54ff
verified
DEBUG
1702f66
meg-huggingface
committed on
DEBUG
f1e6565
meg-huggingface
committed on
just realtoxicityprompts
fe8891f
meg-huggingface
committed on
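Loading that dataset directly is straightforward with the datasets library; the 20-row slice here mirrors the "Limit to 20 now" debug commit just below, and the snippet is illustrative rather than the repo's exact loading code.

```python
from datasets import load_dataset

ds = load_dataset("allenai/real-toxicity-prompts", split="train")
prompts = [row["prompt"]["text"] for row in ds.select(range(20))]  # debug-sized slice
print(len(prompts), prompts[0])
```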
Limit to 20 now
1bed6b0
meg-huggingface
committed on
debug
623b36b
meg-huggingface
committed on
debug
5e999cb
meg-huggingface
committed on
debug
6ea3edf
meg-huggingface
committed on
debug
3dbfbdf
meg-huggingface
committed on
Debug
936b02e
meg-huggingface
committed on
debug
3313f0d
meg-huggingface
committed on
Debug
f09aba3
meg-huggingface
committed on
Changing refresh rate to 1 hr
590e272
meg-huggingface
committed on