Model Card for jaeyong2/FLUX.1-dev-ko-Preview
A test of whether Korean prompts can be supported (image generation quality is lower than the original model).
Model Details
How Korean support was added
- Uses black-forest-labs/FLUX.1-dev as the base model
- Applies Bingsu/clip-vit-large-patch14-ko (as the CLIP text encoder)
- Extends the T5 Encoder vocabulary with Korean tokens and fine-tunes it (because of Colab quota limits, training ran for about 2 hours on 1 x A100); a sketch of both text-encoder changes follows this list
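The checkpoint used in the example below already bundles these changes, but the following sketch shows how such an adaptation could be wired up with transformers and diffusers. It is illustrative only: `korean_tokens` is a hypothetical placeholder, the Bingsu repository is assumed to expose a CLIP-compatible text encoder, access to the gated black-forest-labs/FLUX.1-dev weights is assumed, and the fine-tuning step itself is not reproduced.

```python
# Minimal sketch of the adaptation described above -- NOT the exact training
# recipe used for this checkpoint.
import torch
from transformers import AutoTokenizer, CLIPTextModel, T5EncoderModel
from diffusers import FluxPipeline

base = "black-forest-labs/FLUX.1-dev"

# 1) Korean CLIP text encoder + its tokenizer (stands in for FLUX's default CLIP pair).
clip_encoder = CLIPTextModel.from_pretrained("Bingsu/clip-vit-large-patch14-ko", torch_dtype=torch.bfloat16)
clip_tokenizer = AutoTokenizer.from_pretrained("Bingsu/clip-vit-large-patch14-ko")

# 2) Extend the T5 encoder's vocabulary with Korean tokens and grow its embedding table.
t5_tokenizer = AutoTokenizer.from_pretrained(base, subfolder="tokenizer_2")
t5_encoder = T5EncoderModel.from_pretrained(base, subfolder="text_encoder_2", torch_dtype=torch.bfloat16)

korean_tokens = ["하늘", "고양이"]  # hypothetical placeholder; the real added token list is not published here
t5_tokenizer.add_tokens(korean_tokens)
t5_encoder.resize_token_embeddings(len(t5_tokenizer))
# The new embedding rows would then be fine-tuned on Korean text, as noted above.

# 3) Assemble a FluxPipeline that uses the swapped-in text encoders.
pipe = FluxPipeline.from_pretrained(
    base,
    text_encoder=clip_encoder,
    tokenizer=clip_tokenizer,
    text_encoder_2=t5_encoder,
    tokenizer_2=t5_tokenizer,
    torch_dtype=torch.bfloat16,
)
```

Passing the components explicitly works because diffusers lets modules supplied to `from_pretrained` override the ones stored in the base repository, so only the text side of the pipeline changes.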
Example
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained("jaeyong2/FLUX.1-dev-ko", torch_dtype=torch.bfloat16)
pipe.enable_model_cpu_offload()  # save some VRAM by offloading the model to CPU; remove this if you have enough GPU memory

prompt = "하늘 위를 달리는 고양이"  # "A cat running across the sky"

image = pipe(
    prompt,
    height=1024,
    width=1024,
    guidance_scale=3.5,
    num_inference_steps=50,
    max_sequence_length=512,
    generator=torch.Generator("cpu").manual_seed(0),
).images[0]
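The pipeline returns a standard PIL image, so the result can be saved directly; the filename below is just an example.

```python
image.save("flux-dev-ko-sample.png")  # illustrative filename
```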