TabbyAPI Error (resolved: tokenizer update)
Hey @bartowski, thanks so much for putting out the awesome quants.
I'm hitting an issue loading this model in tabbyAPI. I've only hit it with this model, none of your others. Has anyone seen this?
data: {"error":{"message":"data did not match any variant of untagged enum ModelWrapper at line 275732 column 3","trace":null}}
Have you tried with other Mistral models?
Can you share some more details about how exactly you're hitting it, i.e. what you're passing to the endpoint?
Resolved: updating the tokenizers package fixed it.
cd to the tabbyAPI folder
source venv/bin/activate
pip install -U tokenizers
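That error is usually the tokenizers library failing to parse a tokenizer.json written with a newer format than the installed version understands, which is why upgrading resolves it. If anyone wants to double-check after upgrading, you can print the installed version and try parsing the model's tokenizer.json directly (the path below is just a placeholder for wherever your model files live):

python -c "import tokenizers; print(tokenizers.__version__)"
python -c "from tokenizers import Tokenizer; Tokenizer.from_file('models/<model-folder>/tokenizer.json')"

If the second command raises the same "untagged enum ModelWrapper" message, the installed tokenizers is still too old for that tokenizer.json.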