
Error encountered loading the model in Colab

#1
by leonyap27 - opened

Please refer to the screenshot below: the error occurs at line 9 of my code, even though I managed to load another model the same way.
(screenshot: image.png)

Kindly assist. Thanks.

GovTech - AI Practice org

Hi @leonyap27 , thanks for trying out LionGuard! Please use the code in https://huggingface.co/govtech/lionguard-v1/blob/main/inference.py instead. LionGuard is a classifier model, so you won't be able to load it as a transformers / language model through AutoModel.from_pretrained(...).

Hi, I'm not officially from the GovTech team that worked on this project, but until they respond, I figured I could help since I have used it.

LionGuard doesn't fit the automatic model-loading path because it is an ONNX model wrapping a ridge classifier, so you can't load it with AutoModel.from_pretrained(...). You have to use the inference code provided by the authors, which handles loading the ONNX model and running inference on it. Hope this helps.
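To illustrate the difference, here is a minimal sketch of ONNX-classifier-style inference with onnxruntime. The function and file names below are hypothetical, not the authors' actual API; the official code is in inference.py in the model repo, which also handles producing the text embeddings the classifier expects.

```python
import numpy as np

def classify(session, embeddings):
    """Run an ONNX ridge classifier on a batch of embedding vectors.

    `session` is an onnxruntime.InferenceSession (or anything with the
    same get_inputs()/run() interface). `embeddings` is a 2-D float array
    of shape (batch, embedding_dim).
    """
    input_name = session.get_inputs()[0].name
    # session.run returns a list of output arrays; the first typically
    # holds the predicted labels or scores.
    outputs = session.run(None, {input_name: embeddings.astype(np.float32)})
    return outputs[0]

# Typical usage (assumes you have downloaded the repo's ONNX file locally;
# "lionguard.onnx" is a placeholder name):
#
# import onnxruntime as ort
# session = ort.InferenceSession("lionguard.onnx")
# scores = classify(session, embeddings)
```

The point is that inference goes through onnxruntime on precomputed embeddings, not through the transformers AutoModel machinery, which is why AutoModel.from_pretrained(...) fails on this repo.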

Edit: I'm sorry, I just saw the new message from the official authors, please refer to that.

shaunkhoo changed discussion status to closed
