Python 3.9+ required
Summary:
I get an error on Python 3.8 when I run the following code:
import transformers
pipeline = transformers.pipeline("text-generation", model="pfnet/plamo-13b", trust_remote_code=True)
print(pipeline("The future of artificial intelligence technology is ", max_new_tokens=32))
According to ChatGPT (GPT-4), we need Python 3.9+ to run the code.
I solved the problem by installing Python 3.10 via Anaconda.
Please document this requirement in README.md.
Details:
The error is as follows:
/mnt/my_raid/github/pdf-agent/venv/bin/python /mnt/my_raid/github/pdf-agent/plamo.py
Loading checkpoint shards: 100%|██████████| 3/3 [00:02<00:00, 1.47it/s]
Traceback (most recent call last):
File "/mnt/my_raid/github/pdf-agent/plamo.py", line 2, in <module>
pipeline = transformers.pipeline("text-generation", model="pfnet/plamo-13b", trust_remote_code=True)
File "/mnt/my_raid/github/pdf-agent/venv/lib/python3.8/site-packages/transformers/pipelines/__init__.py", line 921, in pipeline
tokenizer = AutoTokenizer.from_pretrained(
File "/mnt/my_raid/github/pdf-agent/venv/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 719, in from_pretrained
tokenizer_class = get_class_from_dynamic_module(class_ref, pretrained_model_name_or_path, **kwargs)
File "/mnt/my_raid/github/pdf-agent/venv/lib/python3.8/site-packages/transformers/dynamic_module_utils.py", line 497, in get_class_from_dynamic_module
return get_class_in_module(class_name, final_module.replace(".py", ""))
File "/mnt/my_raid/github/pdf-agent/venv/lib/python3.8/site-packages/transformers/dynamic_module_utils.py", line 199, in get_class_in_module
module = importlib.import_module(module_path)
File "/usr/lib/python3.8/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
File "<frozen importlib._bootstrap>", line 991, in _find_and_load
File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 848, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/mnt/NVM/cache/huggingface/modules/transformers_modules/pfnet/plamo-13b/49cc9f008452826562392c8940b4dc7ac1e4d83c/tokenization_plamo.py", line 14, in <module>
class PlamoTokenizer(PreTrainedTokenizer): # type: ignore
File "/mnt/NVM/cache/huggingface/modules/transformers_modules/pfnet/plamo-13b/49cc9f008452826562392c8940b4dc7ac1e4d83c/tokenization_plamo.py", line 63, in PlamoTokenizer
def __getstate__(self) -> dict[str, Any]:
TypeError: 'type' object is not subscriptable
ChatGPT's answer is as follows:
The error you're encountering, TypeError: 'type' object is not subscriptable, usually occurs in Python when you're trying to subscript a type object which doesn't support indexing or subscripting. This error commonly happens in Python versions below 3.9 when trying to use the built-in dict or list types as subscriptable types, as in Python 3.9+.
So, I installed Python 3.10 via Anaconda.
Then the code ran and I got this result:
[{'generated_text': 'The future of artificial intelligence technology is fascinating, but it's also scary. There are many scenarios that could play out, and there's'}]
According to @shunk031, we can cope with this error by adding from __future__ import annotations.
That said, Python 3.8 is nearing end-of-life anyway.
Thank you for pointing this out!
Indeed, Python 3.8 does not support subscripting the built-in dict in type hints (dict[str, Any]), resulting in the error you experienced. We'll address this issue by adding from __future__ import annotations, or by using typing.Dict, to ensure better support for Python 3.8.
It's worth noting we conducted our model testing on Python 3.10. While other Python versions may also work correctly, we don't guarantee them at this moment. We'll be sure to include this information in our README to keep users informed.
Once again, we appreciate your alertness and collaboration.
Thanks. I will close this discussion.