TabbyAPI Error (resolved: tokenizer update)

#1
by ernestr - opened

Hey @bartowski thanks so much for putting out the awesome quants.

I'm hitting an issue loading this model in tabbyAPI. I've only hit it with this model, none of your others. Has anyone seen this?

data: {"error":{"message":"data did not match any variant of untagged enum ModelWrapper at line 275732 column 3","trace":null}}

Have you tried with other Mistral models?

Can you show some more details about exactly how you're hitting it, i.e. what you're passing to the endpoint?

> Hey @bartowski thanks so much for putting out the awesome quants.
>
> I'm hitting an issue loading this model in tabbyAPI. I've only hit it with this model, none of your others. Has anyone seen this?
>
> data: {"error":{"message":"data did not match any variant of untagged enum ModelWrapper at line 275732 column 3","trace":null}}

cd to the tabbyAPI folder, then run:

```shell
source venv/bin/activate
pip install -U tokenizers
```
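If you want to confirm the upgrade actually took effect inside that venv before reloading the model, a minimal check using only the standard library (the helper name here is just for illustration):

```python
from importlib.metadata import PackageNotFoundError, version

def tokenizers_version() -> str:
    """Return the installed tokenizers version, or a note if it is absent."""
    try:
        return version("tokenizers")
    except PackageNotFoundError:
        return "not installed"

# Prints the version string pip installed into the active venv.
print(tokenizers_version())
```

Run it with the venv activated; if it prints "not installed", the upgrade went into a different environment than the one tabbyAPI is using.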

> Have you tried with other Mistral models?
>
> Can you show some more details about exactly how you're hitting it, i.e. what you're passing to the endpoint?

Yes, I've been running Mistral Large 2407 no problem.

@iamadog thanks! Updating tokenizers worked. Super pumped to play with this model.

bartowski changed discussion title from TabbyAPI Error to TabbyAPI Error (resolved: tokenizer update)
