# Beyond-ChatGPT
Chainlit App using Python streaming for Level 0 MLOps

### LLM Application with Chainlit, Docker, and Huggingface Spaces
In this guide, we'll walk through the steps to create a Large Language Model (LLM) application using Chainlit, containerize it with Docker, and deploy it on Huggingface Spaces.

### Prerequisites
- A GitHub account
- Docker installed on your local machine
- A Huggingface Spaces account


### Building our App
Clone [this](https://github.com/AI-Maker-Space/Beyond-ChatGPT/tree/main) repo.

``` bash
git clone https://github.com/AI-Maker-Space/Beyond-ChatGPT.git
```

Navigate into the repo.
``` bash
cd Beyond-ChatGPT
```
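Optionally, create and activate a fresh virtual environment first so the project's dependencies stay isolated (the `.venv` name here is just a convention, not something the repo requires):

``` bash
# Create a virtual environment in .venv and activate it
python3 -m venv .venv
source .venv/bin/activate
```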

Install the packages required for this Python environment, listed in `requirements.txt`.
``` bash
pip install -r requirements.txt
```

Open your `.env` file, replace the `###` with your OpenAI API key, and save the file.
```
OPENAI_API_KEY=sk-###
```
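For reference, here is a rough sketch of how an app can pick up that key at startup. The repo may use the `python-dotenv` package instead; this dependency-free stand-in only illustrates the idea, and `load_env_file` is a hypothetical helper name:

``` python
import os

def load_env_file(path=".env"):
    """Parse simple KEY=VALUE lines from a .env file into the environment."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            # Skip blank lines and comments; split on the first '=' only
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())
```

Once the key is loaded, `os.environ["OPENAI_API_KEY"]` is available to the OpenAI client.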

Let's try running it locally. Make sure you're in the Python environment where you installed Chainlit and OpenAI.

Run the app using Chainlit. The `-w` flag watches your files and reloads the app automatically when they change.

``` bash
chainlit run app.py -w
```

Great work! Let's see if we can interact with our chatbot.
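If you're curious what a minimal Chainlit app looks like, the core of an `app.py` is just a decorated message handler. This is a bare-bones sketch that echoes input back, not the repo's actual streaming implementation:

``` python
import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    # Echo the user's input back; a real app would call the OpenAI API here
    await cl.Message(content=f"You said: {message.content}").send()
```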

Time to throw it into a Docker container and prepare it for shipping.

Build the Docker image. We'll tag our image as `llm-app` using the `-t` parameter. The `.` at the end means we want all of the files in our current directory to be added to our image.
``` bash
docker build -t llm-app .
```
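The build step expects a `Dockerfile` at the repo root (the cloned repo should include one). For reference, a minimal Dockerfile for a Chainlit app might look like this sketch, with port `7860` matching what we map below:

``` dockerfile
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 7860
CMD ["chainlit", "run", "app.py", "--host", "0.0.0.0", "--port", "7860"]
```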

Run and test the Docker image locally using the `run` command. The `-p` parameter maps the **host port** (left of the `:`) to the **container port** (right of the `:`).
``` bash
docker run -p 7860:7860 llm-app
```

Visit http://localhost:7860 in your browser to see if the app runs correctly.

Great! Time to ship!

### Deploy to Huggingface Spaces

Make sure you're logged in with the Huggingface CLI.

``` bash
huggingface-cli login
```

Follow the prompts to authenticate.


Create a new Huggingface Space with the following settings:

- Owner: Your username
- Space Name: `llm-app`
- License: `openrail`
- Select the Space SDK: `Docker`
- Docker Template: `Blank`
- Space Hardware: `CPU basic - 2 vCPU - 16 GB - Free`
- Repo type: `Public`



Deploying a custom Docker image on Huggingface Spaces works through their web interface: create the Space as described above, then add your application files and `Dockerfile` to the Space repository, and Huggingface will build and run the image for you.
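If you prefer the command line, a Docker Space can also be deployed by pushing your repository (including the Dockerfile) to the Space's git remote. This is a sketch: the remote name `space` is arbitrary, and the URL uses the placeholder username from above:

``` bash
# Add the Space as a git remote and push; Huggingface builds the image on push
git remote add space https://huggingface.co/spaces/your-username/llm-app
git push space main
```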

### Access the Application

Once deployed, access your app at:

```
https://huggingface.co/spaces/your-username/llm-app
```

### Conclusion

You've successfully created an LLM application with Chainlit, containerized it with Docker, and deployed it on Huggingface Spaces. Visit the link to interact with your deployed application.