Llama.cpp error while loading
I am using text-generation-webui (oobabooga) to load these models, but they fail to load with the errors below. What seems to be the problem, and how can I load and use these models? I get these errors:
File "D:\text-generation-webui\modules\ui_model_menu.py", line 220, in load_model_wrapper
shared.model, shared.tokenizer = load_model(selected_model, loader)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\text-generation-webui\modules\models.py", line 87, in load_model
output = load_func_map[loader](model_name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\text-generation-webui\modules\models.py", line 250, in llamacpp_loader
model, tokenizer = LlamaCppModel.from_pretrained(model_file)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\text-generation-webui\modules\llamacpp_model.py", line 102, in from_pretrained
result.model = Llama(**params)
^^^^^^^^^^^^^^^
File "D:\text-generation-webui\installer_files\env\Lib\site-packages\llama_cpp\llama.py", line 285, in __init__
self._model = _LlamaModel(
^^^^^^^^^^^^
File "D:\text-generation-webui\installer_files\env\Lib\site-packages\llama_cpp\_internals.py", line 52, in __init__
self.model = llama_cpp.llama_load_model_from_file(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\text-generation-webui\installer_files\env\Lib\site-packages\llama_cpp\llama_cpp.py", line 714, in llama_load_model_from_file
return _lib.llama_load_model_from_file(path_model, params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
OSError: exception: access violation reading 0x0000000000000028
Llama.cpp doesn't support this kind of model (T5)
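One way to confirm this diagnosis is to read the model's architecture straight out of the GGUF metadata header: llama.cpp only loads architectures it knows, so a value of `t5` here would explain the crash. Below is a minimal sketch using only the Python standard library (the helper name `gguf_architecture` is my own, not part of any library; it assumes a well-formed GGUF file and handles only the common scalar, string, and array value types, so it is not a full GGUF parser):

```python
import struct

def gguf_architecture(path):
    """Return the value of the 'general.architecture' key from a GGUF
    file's metadata, or None if the key is not present.

    Minimal sketch: assumes a well-formed GGUF file and covers only
    the common value types; not a complete GGUF parser.
    """
    # byte widths of the fixed-size GGUF value types, indexed by type id
    SCALAR = {0: 1, 1: 1, 2: 2, 3: 2, 4: 4, 5: 4, 6: 4, 7: 1, 10: 8, 11: 8, 12: 8}
    with open(path, 'rb') as f:
        if f.read(4) != b'GGUF':
            raise ValueError('not a GGUF file')
        version, n_tensors, n_kv = struct.unpack('<IQQ', f.read(20))

        def read_str():
            (n,) = struct.unpack('<Q', f.read(8))
            return f.read(n).decode('utf-8')

        def skip_value(vtype):
            if vtype == 8:              # string: uint64 length + bytes
                (n,) = struct.unpack('<Q', f.read(8))
                f.seek(n, 1)
            elif vtype == 9:            # array: element type, count, elements
                etype, count = struct.unpack('<IQ', f.read(12))
                for _ in range(count):
                    skip_value(etype)
            else:                       # fixed-width scalar
                f.seek(SCALAR[vtype], 1)

        for _ in range(n_kv):
            key = read_str()
            (vtype,) = struct.unpack('<I', f.read(4))
            if key == 'general.architecture' and vtype == 8:
                return read_str()
            skip_value(vtype)
    return None
```

For example, `gguf_architecture(r"D:\models\my-model.gguf")` (a hypothetical path) returning `'t5'` would confirm the file is a T5 model, which the llama.cpp build bundled with text-generation-webui at the time could not load.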