An error occurs when using ExLlamaV2 to load the model.
When I use ExLlamaV2 to load the model, I encounter the following error:
File ~/anaconda3/envs/chainlit/lib/python3.10/site-packages/exllamav2/config.py:134, in ExLlamaV2Config.prepare(self)
132 for prefix in expect_keys:
133 if not any(key.startswith(prefix) for key in self.tensor_file_map):
--> 134 raise ValueError(f" ## Could not find {prefix}.* in model")
136 # Model dimensions
138 self.head_dim = self.hidden_size // self.num_attention_heads
ValueError: ## Could not find model.layers.0.input_layernorm.* in model
Exllamav2 version: 0.0.7
Do you have any suggestions on how to load the model?
Make sure ooba (text-generation-webui) is up to date. If the latest exllamav2 release has not yet been pre-built as a binary wheel and bundled into ooba, you may also need to install exllamav2 manually.
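Since the report mentions exllamav2 0.0.7, it can help to confirm which version your Python environment actually imports before reinstalling. A minimal sketch (the `0.0.7` threshold is just the version from the report above, and the helper names are illustrative, not part of any library):

```python
# Minimal sketch: report the installed exllamav2 version and compare it
# against the version from the bug report. Assumes a plain pip install;
# helper names here are illustrative only.
from importlib.metadata import version, PackageNotFoundError

def parse_version(v: str) -> tuple[int, ...]:
    # "0.0.7" -> (0, 0, 7); ignores non-numeric pre-release parts
    return tuple(int(part) for part in v.split(".") if part.isdigit())

def is_older_than(installed: str, minimum: str) -> bool:
    # Tuple comparison handles versions of differing lengths sensibly
    return parse_version(installed) < parse_version(minimum)

try:
    v = version("exllamav2")
    print(f"exllamav2 {v} installed; older than 0.0.7? "
          f"{is_older_than(v, '0.0.7')}")
except PackageNotFoundError:
    print("exllamav2 is not installed in this environment")
```

If the printed version is older than the one your model's quantization expects, upgrading exllamav2 (and pulling the latest ooba) is the usual fix.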