wfwizard committed on
Commit 9fb4a10 · verified · 1 Parent(s): fdb5b43

Update unit1/dummy_agent_library.ipynb


Updated documentation comments to include setup instructions for Google Colab Secrets

Files changed (1)
  1. unit1/dummy_agent_library.ipynb +2 -2
unit1/dummy_agent_library.ipynb CHANGED
@@ -31,7 +31,7 @@
     "In the Hugging Face ecosystem, there is a convenient feature called Serverless API that allows you to easily run inference on many models. There's no installation or deployment required.\n",
     "\n",
     "To run this notebook, **you need a Hugging Face token** that you can get from https://hf.co/settings/tokens. A \"Read\" token type is sufficient.\n",
-    "- If you are running this notebook on Google Colab, you can set it up in the \"settings\" tab under \"secrets\". Make sure to call it \"HF_TOKEN\" and restart the session to load the environment variable (Runtime -> Restart session).\n",
+    "- If you are running this notebook on Google Colab, you can set it up by clicking the 🔑 (Secrets) icon in the left sidebar, adding a secret named \"HF_TOKEN\", and toggling \"Notebook access\". Then restart the session to load the environment variable (Runtime -> Restart session).\n",
     "- If you are running this notebook locally, you can set it up as an [environment variable](https://huggingface.co/docs/huggingface_hub/en/package_reference/environment_variables). Make sure you restart the kernel after installing or updating huggingface_hub. You can update huggingface_hub by modifying the above `!pip install -q huggingface_hub -U`"
     ]
    },
@@ -44,7 +44,7 @@
     "import os\n",
     "from huggingface_hub import InferenceClient\n",
     "\n",
-    "## You need a token from https://hf.co/settings/tokens, ensure that you select 'read' as the token type. If you run this on Google Colab, you can set it up in the \"settings\" tab under \"secrets\". Make sure to call it \"HF_TOKEN\"\n",
+    "## You need a token from https://hf.co/settings/tokens; ensure that you select 'read' as the token type. If you run this on Google Colab, click the 🔑 (Secrets) icon in the left sidebar, add a secret named \"HF_TOKEN\", toggle \"Notebook access\", and restart the session (Runtime -> Restart session).\n",
     "# HF_TOKEN = os.environ.get(\"HF_TOKEN\")\n",
     "\n",
     "client = InferenceClient(model=\"moonshotai/Kimi-K2.5\")"
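The token-loading step the updated comments describe can be sketched as a small helper. This is a minimal sketch, not part of the commit: the `load_hf_token` function name is hypothetical, and it only reads the environment (in Colab, secrets with "Notebook access" enabled are exposed as environment variables after a session restart; locally, you export `HF_TOKEN` yourself).

```python
import os

def load_hf_token(env=os.environ):
    """Return the Hugging Face token from the HF_TOKEN variable, or None.

    On Google Colab, a secret named "HF_TOKEN" with "Notebook access"
    toggled on appears here after a session restart; locally, set the
    HF_TOKEN environment variable before launching the kernel.
    """
    return env.get("HF_TOKEN")

token = load_hf_token()
if token is None:
    print("HF_TOKEN not set - add it under Colab Secrets (🔑) or export it locally.")
```

With a token in place, the notebook's own cell then passes it (implicitly via the environment) to `InferenceClient`.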