---
license: apache-2.0
---

# FLUX.1 [schnell] -- Flumina Server App

This repository contains an implementation of FLUX.1 [schnell] inference on Fireworks AI's new Flumina Server App toolkit.

![Example output](example.png)

## Deploying FLUX.1 [schnell] to Fireworks On-Demand

FLUX.1 [schnell] (bfloat16) is available on Fireworks via [on-demand deployments](https://docs.fireworks.ai/guides/ondemand-deployments). It can be deployed in a few simple steps:

### Prerequisite: Clone the Weights into this Repository

1. Authenticate with HuggingFace with `huggingface-cli login`
2. Go to the model [repository](https://huggingface.co/black-forest-labs/FLUX.1-schnell) and accept the license agreement
3. From the root of this repository, clone the model into the `data/` folder by running `clone_weights.sh`

### Prerequisite: Install the Flumina CLI

The Flumina CLI is included with the [fireworks-ai](https://pypi.org/project/fireworks-ai/) Python package. It can be installed with pip like so:

```bash
pip install 'fireworks-ai[flumina]>=0.15.7'
```

Also get an API key from the [Fireworks site](https://fireworks.ai/account/api-keys) and set it in the Flumina CLI:

```bash
flumina set-api-key YOURAPIKEYHERE
```

### Creating an On-Demand Deployment

`flumina deploy` can be used to create an on-demand deployment. When invoked with the name of an existing model, it creates a new deployment of that model in your account:

```bash
flumina deploy accounts/fireworks/models/flux-1-schnell
```

When successful, the CLI will print out example commands to call your new deployment, for example:

```bash
curl -X POST 'https://api.fireworks.ai/inference/v1/workflows/accounts/fireworks/models/flux-1-schnell/control_net?deployment=accounts/u-6jamesr6-63834f/deployments/bc45761f' \
  -H 'Authorization: Bearer API_KEY' \
  -F "prompt=" \
  -F "control_image=" \
  -F "control_mode=" \
  -F "aspect_ratio=16:9" \
  -F "guidance_scale=3.5" \
  -F "num_inference_steps=4" \
  -F "seed=0" \
  -F "controlnet_conditioning_scale="

curl -X POST 'https://api.fireworks.ai/inference/v1/workflows/accounts/fireworks/models/flux-1-schnell/text_to_image?deployment=accounts/u-6jamesr6-63834f/deployments/bc45761f' \
  -H 'Authorization: Bearer API_KEY' \
  -H "Content-Type: application/json" \
  -d '{
    "prompt": "",
    "aspect_ratio": "16:9",
    "guidance_scale": 3.5,
    "num_inference_steps": 4,
    "seed": 0
  }'
```

(A Python sketch of the `text_to_image` call is included at the end of this README.)

Your deployment can also be administered using the Flumina CLI. Useful commands include:

* `flumina list deployments` to show all of your deployments
* `flumina get deployment` to get details about a specific deployment
* `flumina delete deployment` to delete a deployment

## Add-ons

Add-ons are packages you can upload to Fireworks and load into your deployed model. This Flumina app implements two types of add-ons: "controlnet_union" and "lora".

### Deploying an existing add-on to your FLUX on-demand deployment

Existing add-ons can be deployed to Fireworks via the `flumina create deployed_addon` command. The command takes two arguments:

* The resource name of the add-on to be deployed
* The resource name of the deployment on which to deploy the add-on

Both resource names can be either an ID (e.g. `my-addon` or `012345` for add-ons and deployments, respectively), in which case the resource is assumed to be in your Fireworks account, or a fully-qualified name like `accounts/my-account/models/my-addon` or `accounts/my-account/deployments/my-deployment`.
For example:

```bash
flumina create deployed_addon accounts/fireworks/models/flux-1-dev-controlnet-union 0123456
```

This will deploy the `flux-1-dev-controlnet-union` add-on from the `fireworks` account to your deployment named `0123456`.

### Creating custom add-ons (ControlNet and LoRA adapters)

This model supports add-ons (specifically ControlNet Union and LoRA adapters) in the HuggingFace Diffusers format. Here is an example of taking an existing LoRA add-on from HuggingFace and deploying it to Fireworks:

```bash
git clone https://huggingface.co/brushpenbob/flux-midjourney-anime
cd flux-midjourney-anime
flumina init addon lora --allow-non-empty
flumina create model flux-1-dev-midjourney-anime --base-model accounts/fireworks/models/flux-1-dev
```

Then you can deploy it to your on-demand deployment like so:

```bash
flumina create deployed_addon flux-1-dev-midjourney-anime 0123456
```

## What is Flumina?

Flumina is Fireworks.ai's new system for hosting Server Apps that allows users to deploy deep learning inference to production in minutes, not weeks.

## What does Flumina offer for FLUX models?

Flumina offers the following benefits:

* A clear, precise definition of the server-side workload, visible in the server app implementation (you are here)
* An extensibility interface that allows dynamic loading/dispatching of add-ons server-side. For FLUX:
  * ControlNet (Union) adapters
  * LoRA adapters
* Off-the-shelf support for standing up on-demand capacity for the Server App on Fireworks
* Further customization of the deployment's logic by modifying the Server App and deploying the modified version

## Deploying Custom FLUX.1 [schnell] Apps to Fireworks On-demand

Flumina base app upload is coming soon!
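
## Appendix: Calling a Deployment from Python

As a companion to the curl examples above, here is a minimal sketch of calling the `text_to_image` endpoint from Python using the `requests` library. The deployment ID, API key, prompt, and output filename are placeholders to replace with your own values, and the response is assumed to be the raw image bytes (adjust the handling if your deployment returns a different payload). The `control_net` endpoint shown in the first curl example takes multipart form data instead of JSON.

```python
# Minimal sketch of calling the text_to_image endpoint of an on-demand deployment.
# Assumptions: `requests` is installed, the deployment ID below is replaced with
# your own, API_KEY is your Fireworks API key, and the endpoint returns the
# generated image as raw bytes.
import requests

URL = (
    "https://api.fireworks.ai/inference/v1/workflows/"
    "accounts/fireworks/models/flux-1-schnell/text_to_image"
)

response = requests.post(
    URL,
    params={"deployment": "accounts/u-6jamesr6-63834f/deployments/bc45761f"},  # your deployment
    headers={
        "Authorization": "Bearer API_KEY",  # your Fireworks API key
        "Content-Type": "application/json",
    },
    json={
        "prompt": "a watercolor painting of a lighthouse at dawn",  # example prompt
        "aspect_ratio": "16:9",
        "guidance_scale": 3.5,
        "num_inference_steps": 4,
        "seed": 0,
    },
)
response.raise_for_status()

# Save the result; the output format is an assumption, not confirmed by this README.
with open("output.png", "wb") as f:
    f.write(response.content)
```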