# Beyond-ChatGPT

Chainlit App using Python streaming for Level 0 MLOps

LLM Application with Chainlit, Docker, and Hugging Face Spaces

In this guide, we'll walk you through the steps to create a Large Language Model (LLM) application using Chainlit, containerize it using Docker, and finally deploy it on Hugging Face Spaces.
### Prerequisites

- A GitHub account
- Docker installed on your local machine
- A Hugging Face account (for Spaces)
### Building our App
Clone this repo and navigate into it.
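A minimal sketch (the GitHub URL below is a placeholder; substitute this repo's actual URL):

```bash
# Clone the repository and move into it (placeholder URL; use this repo's real URL)
git clone https://github.com/<your-username>/Beyond-ChatGPT.git
cd Beyond-ChatGPT
```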
Install the requirements using `pip install -r requirements.txt`.
Add your OpenAI API key to the `.env` file and save the file.
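The exact variable name depends on what `app.py` reads; a minimal sketch, assuming the standard `OPENAI_API_KEY` name used by the OpenAI client:

```bash
# .env — OPENAI_API_KEY is an assumption; match whatever name app.py loads
OPENAI_API_KEY=your-openai-api-key-here
```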
Let's try running it locally. Make sure you're in the Python environment where you installed Chainlit and OpenAI.

Run the app using Chainlit (the `-w` flag watches your files and reloads the app when they change):
```bash
chainlit run app.py -w
```
Great work! Let's see if we can interact with our chatbot.

Time to put it into a Docker container and prepare it for shipping.

Build the Docker image:
```bash
docker build -t llm-app .
```
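Optionally, confirm the image was built before moving on:

```bash
# List the freshly built image (the name llm-app matches the tag used above)
docker images llm-app
```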
Test the Docker image locally (optional):
```bash
docker run -p 7860:7860 llm-app
```
Visit http://localhost:7860 in your browser to see if the app runs correctly.
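If the app doesn't start, one common cause is that the OpenAI key isn't available inside the container; passing the `.env` file at run time is one option. A sketch, assuming the port mapping above:

```bash
# If the key isn't baked into the image, pass it at run time instead of the plain run above
docker run --env-file .env -p 7860:7860 llm-app

# From another terminal, confirm the app responds on port 7860
curl -I http://localhost:7860
```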
Great! Time to ship!
### Deploy to Hugging Face Spaces
Make sure you're logged in to the Hugging Face CLI:
```bash
huggingface-cli login
```
Follow the prompts to authenticate.
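If you prefer a non-interactive login (for example, in CI), the CLI also accepts a token directly; a sketch, assuming you've created an access token in your Hugging Face account settings and exported it as `HF_TOKEN`:

```bash
# Log in with an access token instead of the interactive prompt (HF_TOKEN is an assumed env var)
huggingface-cli login --token "$HF_TOKEN"
```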
Deploying on Hugging Face Spaces with a custom Docker image is done through their web interface: go to Hugging Face Spaces, create a new Space, then set it up to use your Docker image from the Hugging Face Container Registry.
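As an alternative to pointing the Space at a registry image, a common flow for Docker-type Spaces is to push this repo (with its Dockerfile) to the Space's git remote and let Hugging Face build it; a minimal sketch, assuming you created a Docker Space named `llm-app` under your username:

```bash
# Add the Space as a git remote (the Space name llm-app is an assumption; match the Space you created)
git remote add space https://huggingface.co/spaces/your-username/llm-app

# Push the repo, including the Dockerfile, so the Space builds and runs the container
git push space main
```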
Access the Application

Once deployed, access your app at:

```
https://huggingface.co/spaces/your-username/llm-app
```
### Conclusion

You've successfully created an LLM application with Chainlit, containerized it with Docker, and deployed it on Hugging Face Spaces. Visit the link above to interact with your deployed application.