Unable to load model from the Transformers library.

#1
by AidenM27 - opened

The instructions tell us to run the following.

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("PowerInfer/ReluLLaMA-7B-PowerInfer-GGUF")
```

However, upon running this code, I get the following error:

OSError: PowerInfer/ReluLLaMA-7B-PowerInfer-GGUF does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.
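The error itself suggests the repository hosts only GGUF weights, not a `pytorch_model.bin` (or safetensors) checkpoint that `AutoModel` looks for by default. As a hedged sketch only: recent Transformers releases can load some GGUF checkpoints when the specific file is named via the `gguf_file` argument, though the filename below is an assumption (check the repo's "Files" tab for the real name), and PowerInfer's sparsified models may not be supported by this loader at all since they are primarily intended for the PowerInfer runtime.

```python
# Hedged sketch, NOT confirmed to work for this repo: newer transformers
# versions accept a gguf_file argument naming the exact GGUF file to load.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "PowerInfer/ReluLLaMA-7B-PowerInfer-GGUF",
    gguf_file="relullama-7b-powerinfer.gguf",  # hypothetical filename
)
```

If the architecture or quantization is unsupported, this will still fail, in which case the model likely has to be run through PowerInfer (or another llama.cpp-family runtime) rather than Transformers.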

Please advise. Thanks!
