Cannot access gated repo

#123
by Kandhuri - opened

Hi, I am Kiran. I have been granted access to use Llama 3 on Hugging Face, but when I try to access the model from VS Code it says "Cannot access gated repo". I am using a Hugging Face token (read-only) to access the model from VS Code.

Hi, this worked for me. If you are using VS Code, follow the instructions below:

from huggingface_hub import login
from transformers import AutoTokenizer, AutoModelForCausalLM

# Authenticate with your Hugging Face access token; a read-scope
# token is enough to download gated files once access is granted.
login(token="your_token")

# Download the tokenizer and model into a local cache directory.
tokenizer = AutoTokenizer.from_pretrained(
    "meta-llama/Meta-Llama-3-8B",
    cache_dir="/kaggle/working/",
)

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-8B",
    cache_dir="/kaggle/working/",
    device_map="auto",
)
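If you prefer not to hard-code a login() call, here is a sketch of an alternative, assuming a reasonably recent transformers: export your token as the HF_TOKEN environment variable before running, or pass it directly to each from_pretrained call ("hf_..." below is a placeholder for your real read token):

# Sketch: per-call token instead of a global login().
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained(
    "meta-llama/Meta-Llama-3-8B",
    token="hf_...",  # placeholder: your real read token
)
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-8B",
    token="hf_...",
    device_map="auto",
)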

osanseviero changed discussion status to closed

Gated model: "You have been granted access to this model." However, an exception has occurred: OSError
We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like meta-llama/Meta-Llama-3-8B is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
requests.exceptions.HTTPError: 403 Client Error: Forbidden for url: https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/main/config.json

The above exception was the direct cause of the following exception:

huggingface_hub.utils._errors.HfHubHTTPError: (Request ID: Root=1-66684564-4b77863a5ca2c26c4729723a;c2d90b76-e1ee-496b-8e5b-825662520211)

403 Forbidden: Authorization error..
Cannot access content at: https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/main/config.json.
If you are trying to create or update content, make sure you have a token with the write role.

The above exception was the direct cause of the following exception:

huggingface_hub.utils._errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.

The above exception was the direct cause of the following exception:

File "/home/hailay/Desktop/neumann/Finetune.py", line 10, in
tokenizer = AutoTokenizer.from_pretrained(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like meta-llama/Meta-Llama-3-8B is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
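As the message says, if the model files are already fully cached you can run in offline mode and skip the Hub entirely. A minimal sketch, assuming the files were downloaded successfully at least once before:

# Sketch: use only locally cached files; no request to huggingface.co.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "meta-llama/Meta-Llama-3-8B",
    local_files_only=True,
)

Setting the environment variable HF_HUB_OFFLINE=1 has the same effect. If nothing is cached yet, though, the 403 above means the token or access grant is the real problem.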

I have the same problem:

When I write:

from huggingface_hub import login
from transformers import AutoTokenizer, AutoModelForCausalLM

login(token="my_token")

tokenizer = AutoTokenizer.from_pretrained(
    "meta-llama/Meta-Llama-3-8B",
    cache_dir="/kaggle/working/",
)

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3.1-8B",
    cache_dir="/kaggle/working/",
    device_map="auto",
)
I am getting this message:
The token has not been saved to the git credentials helper. Pass add_to_git_credential=True in this function directly or --add-to-git-credential if using via huggingface-cli if you want to set the git credential as well.
Token is valid (permission: fineGrained).
Your token has been saved to /Users/my user/.cache/huggingface/token
Login successful

HTTPError Traceback (most recent call last)
File /opt/anaconda3/envs/thesis/lib/python3.11/site-packages/huggingface_hub/utils/_errors.py:304, in hf_raise_for_status(response, endpoint_name)
303 try:
--> 304 response.raise_for_status()
305 except HTTPError as e:

File /opt/anaconda3/envs/thesis/lib/python3.11/site-packages/requests/models.py:1024, in Response.raise_for_status(self)
1023 if http_error_msg:
-> 1024 raise HTTPError(http_error_msg, response=self)

HTTPError: 403 Client Error: Forbidden for url: https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/main/config.json

The above exception was the direct cause of the following exception:

GatedRepoError Traceback (most recent call last)
File /opt/anaconda3/envs/thesis/lib/python3.11/site-packages/transformers/utils/hub.py:402, in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, token, revision, local_files_only, subfolder, repo_type, user_agent, _raise_exceptions_for_gated_repo, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash, **deprecated_kwargs)
400 try:
401 # Load from URL or cache if already cached
--> 402 resolved_file = hf_hub_download(
403 path_or_repo_id,
404 filename,
405 subfolder=None if len(subfolder) == 0 else subfolder,
406 repo_type=repo_type,
407 revision=revision,
...
Make sure to have access to it at https://huggingface.co/meta-llama/Meta-Llama-3-8B.
403 Client Error. (Request ID: Root=1-66b75bdd-24e5552314a689bc452c2afa;f41afa8a-51a6-4bb9-aa77-0101110d127f)

Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/main/config.json.
Access to model meta-llama/Meta-Llama-3-8B is restricted and you are not in the authorized list. Visit https://huggingface.co/meta-llama/Meta-Llama-3-8B to ask for access.
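A quick way to check which account your token actually authenticates as; a mismatch here would explain a 403 despite an approved request (whoami is part of huggingface_hub):

# Sketch: confirm the active token belongs to the account
# that was granted access.
from huggingface_hub import whoami

print(whoami()["name"])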

In mine it says this:

OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Meta-Llama-3.1-8B.
403 Client Error. (Request ID: Root=1-66bde90e-307a3c0419da184f621cea2a;56d907ca-6175-4ec2-88c8-0ec3a17003d0)

Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3.1-8B/resolve/main/config.json.
Your request to access model meta-llama/Meta-Llama-3.1-8B is awaiting a review from the repo authors.

Very sad, I can't access this.

Fill out the form at the end of the "Expand to review and Access" section and you will shortly be granted access.
It is now required for almost all of the Meta models.

[Two screenshots of the access request form attached]

I had a similar problem, where I couldn't access the repo even though I was granted access.
It throws an error: "unable to access gated repo".

Solution:

Log in with your Hugging Face token using

$ huggingface-cli login

then read the saved token in your script with this helper:

from pathlib import Path

def get_huggingface_token():
    # Path where 'huggingface-cli login' stores the token
    token_file = Path.home() / ".cache" / "huggingface" / "token"

    # Check if the token file exists
    if token_file.exists():
        with open(token_file, "r") as file:
            return file.read().strip()
    else:
        raise FileNotFoundError("Hugging Face token file not found. Please run 'huggingface-cli login'.")

Fetch the token using the function:

HF_TOKEN = get_huggingface_token()
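With the token in hand, you can pass it explicitly to the loading calls so nothing depends on implicit login state. A minimal sketch, reusing HF_TOKEN from the helper above:

# Sketch: pass the token straight to from_pretrained instead of
# relying on a prior login().
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "meta-llama/Meta-Llama-3-8B",
    token=HF_TOKEN,
)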

Hi, I also have a problem. I was granted access but I cannot download the model:
huggingface_hub.errors.LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.

Thanks.

@zt672, that sounds like you have an internet issue.

The following should work (replace hf_token with your actual token value):

huggingface-cli download meta-llama/Meta-Llama-3-8B --local-dir llama-3-8b --token hf_token

If none of the above work, check your transformers version. I updated mine from 4.40 to 4.43 and it worked fine.
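If you want to try the same upgrade, something like this should do it (the version pin is just an example):

$ pip install -U "transformers>=4.43" huggingface_hub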

I'm not sure about that, but I just changed the type of my token from 'fine-grained' to 'read', and the issue was fixed.

It's weird. Even though I use a 'read' token, as xxxgosh suggested, I still have the same issue.

Log:

(venv-20241113-0900) geunsik-lim@ai02:~/fine-tuning-axolotl$
(venv-20241113-0900) geunsik-lim@ai02:~/fine-tuning-axolotl$ huggingface-cli download meta-llama/Meta-Llama-3-8B --local-dir llama-3-8b --token hf_token
Fetching 17 files:   0%|          | 0/17 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 304, in hf_raise_for_status
    response.raise_for_status()
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/requests/models.py", line 1024, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/8cde5ca8380496c9a6cc7ef3a8b46a0372a1d920/.gitattributes

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/local/bin/huggingface-cli", line 8, in <module>
    sys.exit(main())
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/commands/huggingface_cli.py", line 51, in main
    service.run()
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/commands/download.py", line 146, in run
    print(self._download())  # Print path to downloaded files
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/commands/download.py", line 180, in _download
    return snapshot_download(
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/_snapshot_download.py", line 294, in snapshot_download
    thread_map(
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/tqdm/contrib/concurrent.py", line 69, in thread_map
    return _executor_map(ThreadPoolExecutor, fn, *iterables, **tqdm_kwargs)
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/tqdm/contrib/concurrent.py", line 51, in _executor_map
    return list(tqdm_class(ex.map(fn, *iterables, chunksize=chunksize), **kwargs))
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/tqdm/std.py", line 1181, in __iter__
    for obj in iterable:
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 621, in result_iterator
    yield _result_or_cancel(fs.pop())
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 319, in _result_or_cancel
    return fut.result(timeout)
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 458, in result
    return self.__get_result()
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/usr/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/_snapshot_download.py", line 268, in _inner_hf_hub_download
    return hf_hub_download(
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1202, in hf_hub_download
    return _hf_hub_download_to_local_dir(
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1440, in _hf_hub_download_to_local_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1823, in _raise_on_head_call_error
    raise head_call_error
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1722, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(url=url, proxies=proxies, timeout=etag_timeout, headers=headers)
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 1645, in get_hf_file_metadata
    r = _request_wrapper(
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 372, in _request_wrapper
    response = _request_wrapper(
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/file_download.py", line 396, in _request_wrapper
    hf_raise_for_status(response)
  File "/home/geunsik-lim/.local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 321, in hf_raise_for_status
    raise GatedRepoError(message, response) from e
huggingface_hub.utils._errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-67354178-22381c4f3eed232b3c41196d;a82cbc02-b3fb-46c8-897d-1f1f5b2cc607)

Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/8cde5ca8380496c9a6cc7ef3a8b46a0372a1d920/.gitattributes.
Access to model meta-llama/Meta-Llama-3-8B is restricted. You must have access to it and be authenticated to access it. Please log in.
(venv-20241113-0900) geunsik-lim@ai02:~/fine-tuning-axolotl$
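Note: in the log above, the literal placeholder hf_token is passed after --token, which by itself would produce a 401 Unauthorized. Substituting the real token value (or logging in first with huggingface-cli login and omitting --token) should help, e.g.:

$ huggingface-cli download meta-llama/Meta-Llama-3-8B --local-dir llama-3-8b --token hf_xxxxxxxx

(hf_xxxxxxxx is a placeholder for your actual token.)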

