Runtime error

Exit code: 1. Reason:
ERROR:__main__:Failed to load model or tokenizer: The checkpoint you are trying to load has model type `gemma2` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 945, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 647, in __getitem__
    raise KeyError(key)
KeyError: 'gemma2'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 27, in <module>
    model = AutoModelForCausalLM.from_pretrained("google/gemma-2-9b")
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 523, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 947, in from_pretrained
    raise ValueError(
ValueError: The checkpoint you are trying to load has model type `gemma2` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
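The root cause is an outdated transformers install: the `gemma2` model type was only added to the library's auto-config registry in transformers v4.42.0, so the `CONFIG_MAPPING` lookup in older releases raises the `KeyError: 'gemma2'` seen above. Upgrading (for example, pinning `transformers>=4.42.0` in the Space's requirements.txt and restarting) typically resolves it. A minimal sketch of a pre-flight version check, assuming plain `MAJOR.MINOR.PATCH`-style version strings; `supports_gemma2` is an illustrative helper, not a transformers API:

```python
# Gemma 2 support landed in transformers v4.42.0 (per the release notes);
# earlier releases have no `gemma2` entry in CONFIG_MAPPING.
MIN_GEMMA2_VERSION = (4, 42, 0)


def parse_version(version: str) -> tuple:
    """Parse the leading 'MAJOR.MINOR.PATCH' part of a version string
    into a tuple of ints so versions compare correctly (4.9 < 4.42)."""
    return tuple(int(part) for part in version.split(".")[:3])


def supports_gemma2(installed_version: str) -> bool:
    """Return True if this transformers version recognizes `gemma2`."""
    return parse_version(installed_version) >= MIN_GEMMA2_VERSION
```

In the failing app, such a check could run before `AutoModelForCausalLM.from_pretrained("google/gemma-2-9b")` (e.g. against `transformers.__version__`) to fail fast with an actionable "please upgrade transformers" message instead of the opaque `KeyError`.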
