# FP8 quantized version of AuraFlow v0.3

## Quantization

The weights were produced by casting every tensor in the original checkpoint to `torch.float8_e4m3fn`:

```python
import torch
from huggingface_hub import hf_hub_download
from safetensors.torch import load_file, save_file

# Download the original AuraFlow v0.3 checkpoint from the Hub.
# (The deprecated `cached_download` helper has been replaced by
# `hf_hub_download`, which takes a repo id and filename.)
ckpt_path = hf_hub_download(
    repo_id="fal/AuraFlow-v0.3",
    filename="aura_flow_0.3.safetensors",
)

state_dict = load_file(ckpt_path)

# Cast every tensor to float8_e4m3fn (requires PyTorch >= 2.1).
for key, value in state_dict.items():
    state_dict[key] = value.to(torch.float8_e4m3fn)

save_file(state_dict, "./aura_flow_0.3.float8_e4m3fn.safetensors")
```

## Model tree for p1atdev/AuraFlow-v0.3-fp8

- Base model: [fal/AuraFlow-v0.3](https://huggingface.co/fal/AuraFlow-v0.3)
- This model is an FP8 quantization of the base model.