releasing app to HF space
- Dockerfile +11 -0
- LICENSE +21 -0
- README.md +184 -6
- app.py +80 -0
- chainlit.md +3 -0
- requirements.txt +5 -0
Dockerfile
ADDED
@@ -0,0 +1,11 @@
+FROM python:3.9
+RUN useradd -m -u 1000 user
+USER user
+ENV HOME=/home/user \
+    PATH=/home/user/.local/bin:$PATH
+WORKDIR $HOME/app
+COPY --chown=user . $HOME/app
+COPY ./requirements.txt ~/app/requirements.txt
+RUN pip install -r requirements.txt
+COPY . .
+CMD ["chainlit", "run", "app.py", "--port", "7860"]
LICENSE
ADDED
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2024 Nithin Kamavaram
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
README.md
CHANGED
@@ -1,11 +1,189 @@
---
-title:
-emoji:
-colorFrom:
-colorTo:
+title: BeyondChatGPT Demo
+emoji: 🚀
+colorFrom: pink
+colorTo: yellow
sdk: docker
pinned: false
-license: openrail
---

-
+<p align="center" draggable="false"><img src="https://github.com/AI-Maker-Space/LLM-Dev-101/assets/37101144/d1343317-fa2f-41e1-8af1-1dbb18399719"
+     width="200px"
+     height="auto"/>
+</p>
+
+
+## <h1 align="center" id="heading">:wave: Welcome to Beyond ChatGPT!!</h1>
+
+For a step-by-step YouTube video walkthrough, watch this! [Deploying Chainlit app on Hugging Face](https://www.youtube.com/live/pRbbZcL0NMI?si=NAYhMZ_suAY84f06&t=2119)
+
+![Beyond ChatGPT: Build Your First LLM Application](https://github.com/AI-Maker-Space/Beyond-ChatGPT/assets/48775140/cb7a74b8-28af-4d12-a008-8f5a51d47b4c)
+
+## 🚀 Your First LLM App
+
+> If you need an introduction to `git`, or information on how to set up API keys for the tools we'll be using in this repository, check out our [Interactive Dev Environment for LLM Development](https://github.com/AI-Maker-Space/Interactive-Dev-Environment-for-LLM-Development/tree/main), which has everything you need to get started with this repository!
+
+In this repository, we'll walk you through the steps to create a Large Language Model (LLM) application using Chainlit, then containerize it using Docker, and finally deploy it on Hugging Face Spaces.
+
+Are you ready? Let's get started!
+
+<details>
+  <summary>🖥️ Accessing "gpt-3.5-turbo" (ChatGPT) like a developer</summary>
+
+1. Head to [this notebook](https://colab.research.google.com/drive/1mOzbgf4a2SP5qQj33ZxTz2a01-5eXqk2?usp=sharing) and follow along with the instructions!
+
+2. Complete the notebook and try out your own system/assistant messages!
+
+That's it! Head to the next step and start building your application!
+
+</details>
+
+
+<details>
+  <summary>🏗️ Building Your First LLM App</summary>
+
+1. Clone [this](https://github.com/AI-Maker-Space/Beyond-ChatGPT/tree/main) repo.
+
+``` bash
+git clone https://github.com/AI-Maker-Space/Beyond-ChatGPT.git
+```
+
+2. Navigate inside this repo.
+``` bash
+cd Beyond-ChatGPT
+```
+
+3. Install the packages required for this Python environment in `requirements.txt`.
+``` bash
+pip install -r requirements.txt
+```
+
+4. Open your `.env` file. Replace the `###` in your `.env` file with your OpenAI key and save the file.
+``` bash
+OPENAI_API_KEY=sk-###
+```
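For context on how that key gets used: `app.py` in this same commit calls `load_dotenv()`, and the `AsyncOpenAI()` client then reads `OPENAI_API_KEY` from the environment. A minimal sketch for checking that your `.env` is being picked up (the filename `check_env.py` and the snippet are illustrative, not part of the repo):

``` python
# check_env.py -- hypothetical helper, not part of this commit.
# Loads .env the same way app.py does and reports whether the key is visible.
import os

from dotenv import load_dotenv

load_dotenv()  # copies OPENAI_API_KEY from .env into the process environment

if os.getenv("OPENAI_API_KEY"):
    print("OPENAI_API_KEY loaded.")
else:
    print("OPENAI_API_KEY missing -- check your .env file.")
```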
+
+5. Let's try deploying it locally. Make sure you're in the Python environment where you installed Chainlit and OpenAI. Run the app using Chainlit. This may take a minute to run.
+``` bash
+chainlit run app.py -w
+```
+
+<p align="center" draggable="false">
+  <img src="https://github.com/AI-Maker-Space/LLMOps-Dev-101/assets/37101144/54bcccf9-12e2-4cef-ab53-585c1e2b0fb5">
+</p>
+
+Great work! Let's see if we can interact with our chatbot.
+
+<p align="center" draggable="false">
+  <img src="https://github.com/AI-Maker-Space/LLMOps-Dev-101/assets/37101144/854e4435-1dee-438a-9146-7174b39f7c61">
+</p>
+
+Awesome! Time to throw it into a Docker container and prepare it for shipping!
+</details>
+
+
+
+<details>
+  <summary>🐳 Containerizing our App</summary>
+
+1. Let's build the Docker image. We'll tag our image as `llm-app` using the `-t` parameter. The `.` at the end means we want all of the files in our current directory to be added to our image.
+
+``` bash
+docker build -t llm-app .
+```
+
+2. Run and test the Docker image locally using the `run` command. The `-p` parameter connects our **host port #** to the left of the `:` to our **container port #** on the right.
+
+``` bash
+docker run -p 7860:7860 llm-app
+```
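Note that the container only knows your OpenAI key if `.env` was copied into the image by the Dockerfile's `COPY . .`, or if you pass the key in at run time. If you'd rather not bake the key into the image, a hedged alternative using standard Docker flags (not one of the README's numbered steps) is:

``` bash
# Pass the key as an environment variable instead of relying on a baked-in .env
docker run -e OPENAI_API_KEY=sk-### -p 7860:7860 llm-app
```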
+
+3. Visit http://localhost:7860 in your browser to see if the app runs correctly.
+
+<p align="center" draggable="false">
+  <img src="https://github.com/AI-Maker-Space/LLMOps-Dev-101/assets/37101144/2c764f25-09a0-431b-8d28-32246e0ca1b7">
+</p>
+
+Great! Time to ship!
+</details>
+
+
+<details>
+  <summary>🚀 Deploying Your First LLM App</summary>
+
+1. Let's create a new Hugging Face Space. Navigate to [Hugging Face](https://huggingface.co) and click on your profile picture on the top right. Then click on `New Space`.
+
+<p align="center" draggable="false">
+  <img src="https://github.com/AI-Maker-Space/LLMOps-Dev-101/assets/37101144/f0656408-28b8-4876-9887-8f0c4b882bae">
+</p>
+
+2. Set up your Space as shown below:
+
+- Owner: Your username
+- Space Name: `llm-app`
+- License: `Openrail`
+- Select the Space SDK: `Docker`
+- Docker Template: `Blank`
+- Space Hardware: `CPU basic - 2 vCPU - 16 GB - Free`
+- Repo type: `Public`
+
+<p align="center" draggable="false">
+  <img src="https://github.com/AI-Maker-Space/LLMOps-Dev-101/assets/37101144/8f16afd1-6b46-4d9f-b642-8fefe355c5c9">
+</p>
+
+3. You should see something like this. We're now ready to send our files to our Hugging Face Space: after cloning the Space repo, move your project files into it (including the Dockerfile from this repo; you DO NOT need to write a new one) and push them, as sketched below. Make sure NOT to push your `.env` file; it should be ignored automatically.
+
+<p align="center" draggable="false">
+  <img src="https://github.com/AI-Maker-Space/LLMOps-Dev-101/assets/37101144/cbf366e2-7613-4223-932a-72c67a73f9c6">
+</p>
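If you haven't pushed to a Space before, the flow is plain `git`. A rough sketch, assuming your username is `<your-username>`, you kept the Space name `llm-app` from step 2, and the Beyond-ChatGPT folder sits next to where you clone the Space (adjust paths, and authenticate with your Hugging Face credentials/token when prompted):

``` bash
# Clone the (currently empty) Space repo
git clone https://huggingface.co/spaces/<your-username>/llm-app
cd llm-app

# Copy the project files over -- everything except .env -- then push
cp ../Beyond-ChatGPT/{app.py,Dockerfile,requirements.txt,chainlit.md,README.md} .
git add .
git commit -m "releasing app to HF space"
git push
```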
+
+4. After pushing all files, navigate to the settings in the top right to add your OpenAI API key.
+
+<p align="center" draggable="false">
+  <img src="https://github.com/AI-Maker-Space/LLMOps-Dev-101/assets/37101144/a1123a6f-abdd-4f76-bea4-39acf9928762">
+</p>
+
+5. Scroll down to `Variables and secrets` and click on `New secret` on the top right.
+
+<p align="center" draggable="false">
+  <img src="https://github.com/AI-Maker-Space/LLMOps-Dev-101/assets/37101144/a8a4a25d-752b-4036-b572-93381370c2db">
+</p>
+
+6. Set the name to `OPENAI_API_KEY` and add your OpenAI key under `Value`. Click save.
+
+<p align="center" draggable="false">
+  <img src="https://github.com/AI-Maker-Space/LLMOps-Dev-101/assets/37101144/0a897538-1779-48ff-bcb4-486af30f7a14">
+</p>
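On the Space itself there is no `.env` file, so the `load_dotenv()` call in `app.py` simply finds nothing to load; what matters is that the secret you just created is made available to the running container as an environment variable named `OPENAI_API_KEY`, which is where the OpenAI client looks. A small illustrative sketch of the lookup (not code you need to add anywhere):

``` python
import os

# Locally: python-dotenv copies OPENAI_API_KEY from .env into os.environ.
# On the Space: the OPENAI_API_KEY secret is injected into the environment for you.
# Either way, the OpenAI client resolves the key the same way:
api_key = os.environ.get("OPENAI_API_KEY")
```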
+
+7. To ensure your key is being used, we recommend you `Restart this Space`.
+
+<p align="center" draggable="false">
+  <img src="https://github.com/AI-Maker-Space/LLMOps-Dev-101/assets/37101144/fb1d83af-6ebe-4676-8bf5-b6d88f07c583">
+</p>
+
+8. Congratulations! You just deployed your first LLM! 🎉🎉🎉 Get on LinkedIn and post your results and experience! Make sure to tag us at #AIMakerspace!
+
+Here's a template to get your post started!
+
+```
+🚀🎉 Exciting News! 🎉🚀
+
+🛠️ Today, I'm thrilled to announce that I've successfully built and shipped my first-ever LLM using the powerful combination of Chainlit, Docker, and the OpenAI API! 🖥️
+
+Check it out 👇
+[LINK TO APP]
+
+A big shoutout to @**AI Makerspace** for making all of this possible. Couldn't have done it without the incredible community there. 🚀🚀
+
+Looking forward to building with the community! 🚀✨ Here's to many more creations ahead! 🔥🚀
+
+Who else is diving into the world of AI? Let's connect! 🚀👋
+
+#FirstLLM #Chainlit #Docker #OpenAI #AIMakerspace
+```
+
+</details>
+
+<p></p>
+
+### That's it for now! And so it begins.... :)
app.py
ADDED
@@ -0,0 +1,80 @@
+# You can find this code for Chainlit python streaming here (https://docs.chainlit.io/concepts/streaming/python)
+
+# OpenAI Chat completion
+import os
+from openai import AsyncOpenAI  # importing openai for API usage
+import chainlit as cl  # importing chainlit for our app
+from chainlit.prompt import Prompt, PromptMessage  # importing prompt tools
+from chainlit.playground.providers import ChatOpenAI  # importing ChatOpenAI tools
+from dotenv import load_dotenv
+
+load_dotenv()
+
+# ChatOpenAI Templates
+system_template = """You are a helpful assistant who always speaks in a pleasant tone!
+"""
+
+user_template = """{input}
+Think through your response step by step.
+"""
+
+
+@cl.on_chat_start  # marks a function that will be executed at the start of a user session
+async def start_chat():
+    settings = {
+        "model": "gpt-3.5-turbo",
+        "temperature": 0,
+        "max_tokens": 500,
+        "top_p": 1,
+        "frequency_penalty": 0,
+        "presence_penalty": 0,
+    }
+
+    cl.user_session.set("settings", settings)
+
+
+@cl.on_message  # marks a function that should be run each time the chatbot receives a message from a user
+async def main(message: cl.Message):
+    settings = cl.user_session.get("settings")
+
+    client = AsyncOpenAI()
+
+    print(message.content)
+
+    prompt = Prompt(
+        provider=ChatOpenAI.id,
+        messages=[
+            PromptMessage(
+                role="system",
+                template=system_template,
+                formatted=system_template,
+            ),
+            PromptMessage(
+                role="user",
+                template=user_template,
+                formatted=user_template.format(input=message.content),
+            ),
+        ],
+        inputs={"input": message.content},
+        settings=settings,
+    )
+
+    print([m.to_openai() for m in prompt.messages])
+
+    msg = cl.Message(content="")
+
+    # Call OpenAI
+    async for stream_resp in await client.chat.completions.create(
+        messages=[m.to_openai() for m in prompt.messages], stream=True, **settings
+    ):
+        token = stream_resp.choices[0].delta.content
+        if not token:
+            token = ""
+        await msg.stream_token(token)
+
+    # Update the prompt object with the completion
+    prompt.completion = msg.content
+    msg.prompt = prompt
+
+    # Send and close the message stream
+    await msg.send()
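The streaming loop above is just the `openai` 1.x async client with `stream=True`; Chainlit adds the `Prompt` bookkeeping and `msg.stream_token(...)` on top. If you want to sanity-check the OpenAI side in isolation, a minimal standalone sketch along the same lines (hypothetical script, not part of this commit; assumes `OPENAI_API_KEY` is set in your environment):

``` python
# stream_check.py -- hypothetical standalone script, not part of this commit.
import asyncio

from openai import AsyncOpenAI


async def main():
    client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment
    stream = await client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are a helpful assistant who always speaks in a pleasant tone!"},
            {"role": "user", "content": "Say hello in one sentence."},
        ],
        stream=True,
    )
    # Print tokens as they arrive, mirroring the async for loop in app.py
    async for chunk in stream:
        print(chunk.choices[0].delta.content or "", end="", flush=True)
    print()


asyncio.run(main())
```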
chainlit.md
ADDED
@@ -0,0 +1,3 @@
+# Beyond ChatGPT
+
+This Chainlit app was created following instructions from [this repository!](https://github.com/AI-Maker-Space/Beyond-ChatGPT)
requirements.txt
ADDED
@@ -0,0 +1,5 @@
+chainlit==0.7.700
+cohere==4.37
+openai==1.3.5
+tiktoken==0.5.1
+python-dotenv==1.0.0