---
license: apache-2.0
base_model: fal/AuraFlow-v0.3
base_model_relation: quantized
---
FP8 (`float8_e4m3fn`) quantized version of [AuraFlow v0.3](https://huggingface.co/fal/AuraFlow-v0.3).
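
As a minimal usage sketch (not part of the original card), the FP8 checkpoint can be downloaded and upcast back to a compute dtype such as `bfloat16` before being handed to an AuraFlow loader. The repository id below is a placeholder for this repo; the filename matches the output of the quantization script in the next section.

```python
import torch
from huggingface_hub import hf_hub_download
from safetensors.torch import load_file

ckpt_path = hf_hub_download(
    repo_id="<this-repo-id>",  # placeholder: substitute this repository's id
    filename="aura_flow_0.3.float8_e4m3fn.safetensors",
)

state_dict = load_file(ckpt_path)

# float8_e4m3fn is used here as a storage dtype; upcast to bfloat16 before
# loading the weights into a model, since most kernels do not run on FP8.
state_dict = {k: v.to(torch.bfloat16) for k, v in state_dict.items()}
```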
## Quantization
```python
import torch
from huggingface_hub import hf_hub_download
from safetensors.torch import load_file, save_file

# Download the original 16-bit checkpoint from the base repository.
ckpt_path = hf_hub_download(
    repo_id="fal/AuraFlow-v0.3",
    filename="aura_flow_0.3.safetensors",
)

state_dict = load_file(ckpt_path)

# Cast every tensor to float8_e4m3fn (FP8), roughly halving the storage
# relative to the 16-bit weights.
for key, value in state_dict.items():
    state_dict[key] = value.to(torch.float8_e4m3fn)

save_file(state_dict, "./aura_flow_0.3.float8_e4m3fn.safetensors")
```
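
As a quick sanity check (a minimal sketch, not part of the original script), the saved file can be reloaded to confirm that every tensor was stored as `float8_e4m3fn` and to report the resulting weight footprint:

```python
import torch
from safetensors.torch import load_file

# Reload the quantized checkpoint produced by the script above.
quantized = load_file("./aura_flow_0.3.float8_e4m3fn.safetensors")

# Every tensor should now be stored as float8_e4m3fn (1 byte per element).
assert all(t.dtype == torch.float8_e4m3fn for t in quantized.values())

total_bytes = sum(t.numel() * t.element_size() for t in quantized.values())
print(f"{len(quantized)} tensors, {total_bytes / 1e9:.2f} GB of FP8 weights")
```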