Error does not appear to have a file named jinaai/jina-bert-implementation--configuration_bert.py
OSError: jinaai/jina-embeddings-v2-small-en does not appear to have a file named jinaai/jina-bert-implementation--configuration_bert.py. Checkout 'https://huggingface.co/jinaai/jina-embeddings-v2-small-en/main' for available files.
I'm getting issues using your repo with various tools: llm(1), sentence-transformers, Hugging Face `transformers`. Is there something wrong with the repo? Thanks
It must be something in the code. I'm getting the same error.
Related issues:
- https://twitter.com/simonw/status/1716644983917392330?s=20
- https://github.com/xenova/transformers.js/issues/371
It's probably due to the `trust_remote_code` parameter.
Cool, but without it one gets this:
ValueError: Loading jinaai/jina-embeddings-v2-small-en requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option `trust_remote_code=True` to remove this error.
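For completeness, the call the error message is asking for looks like this (a sketch; the actual download needs network access, so the load itself is left commented out):

```python
repo_id = "jinaai/jina-embeddings-v2-small-en"

# The jina-embeddings-v2 models ship custom BERT code on the Hub, so loading
# them requires opting in to remote code execution. Inspect the repo's code
# on huggingface.co before enabling this.
load_kwargs = {"trust_remote_code": True}

# from transformers import AutoModel
# model = AutoModel.from_pretrained(repo_id, **load_kwargs)
```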
I'm also getting the same error(s), both with and without the `trust_remote_code` parameter.
Are you perhaps using an old version of the `transformers` library?
In order to make the repositories easier to maintain, we use the remote repository reference feature of the Auto class. This was added in 4.29.0 (look for "Code on the Hub from another repo" in the release notes).
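As a quick sanity check, the installed version can be compared against that 4.29.0 cutoff. A minimal sketch (the helper functions and their names are illustrative, not part of `transformers`):

```python
def version_tuple(ver: str) -> tuple:
    """Parse a dotted version string like '4.34.1' into a comparable tuple."""
    return tuple(int(part) for part in ver.split(".")[:3])

def supports_cross_repo_code(ver: str) -> bool:
    """Cross-repo 'Code on the Hub' references landed in transformers 4.29.0."""
    return version_tuple(ver) >= (4, 29, 0)

# Example (uncomment if transformers is installed):
# import transformers
# print(supports_cross_repo_code(transformers.__version__))
```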
Do you have some examples we can use to reproduce the error you are observing?
I don't think so; I was using `transformers` 4.34.1.
After running `model = AutoModel.from_pretrained('jinaai/jina-embeddings-v2-base-en', trust_remote_code=True)`, I got the same OSError:
OSError: jinaai/jina-embeddings-v2-small-en does not appear to have a file named jinaai/jina-bert-implementation--configuration_bert.py. Checkout 'https://huggingface.co/jinaai/jina-embeddings-v2-small-en/main' for available files.
I got the same error; it seems the `jina-bert-implementation--modeling_bert.py` file is not found in the Hugging Face repo.
The maintainers of the `transformers` package are working on this issue:
https://github.com/huggingface/transformers/issues/27035
If you use LangChain:

```python
from langchain.embeddings import HuggingFaceEmbeddings

model_name = "jina-embeddings-v2-base-en"
model_kwargs = {"device": "cuda"}
encode_kwargs = {"normalize_embeddings": True}  # set True to compute cosine similarity
model = HuggingFaceEmbeddings(
    model_name=model_name,
    model_kwargs=model_kwargs,
    encode_kwargs=encode_kwargs,
)
```

you will see:
The repository for jina-embeddings-v2-base-en/ contains custom code which must be executed to correctly load the model. You can inspect the repository content at jina-embeddings-v2-base-en/.
You can avoid this prompt in future by passing the argument `trust_remote_code=True`.
Do you wish to run the custom code? [y/N]
You can avoid this by modifying `transformers/models/auto/auto_factory.py`, changing line 451 from `trust_remote_code = kwargs.pop("trust_remote_code", None)` to `trust_remote_code = True` (note that this hardcodes trust for every model you load).
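Rather than editing the installed library, a less invasive option may be to forward the flag through `model_kwargs`, which `HuggingFaceEmbeddings` passes to the underlying `SentenceTransformer` constructor. Whether this works depends on your installed sentence-transformers version accepting `trust_remote_code`, which is an assumption here:

```python
# Constructor kwargs forwarded by HuggingFaceEmbeddings to SentenceTransformer.
# trust_remote_code is an assumption: it only helps if your installed
# sentence-transformers version accepts it in its constructor.
model_kwargs = {
    "device": "cuda",
    "trust_remote_code": True,
}

# from langchain.embeddings import HuggingFaceEmbeddings
# model = HuggingFaceEmbeddings(
#     model_name="jinaai/jina-embeddings-v2-base-en",
#     model_kwargs=model_kwargs,
#     encode_kwargs={"normalize_embeddings": True},
# )
```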
Thanks! I'll close this issue for now; please feel free to reopen if you have further questions!
Fixed by updating the `transformers` package.