jimmycarter committed 32a7748 (parent: 8079e2c): Upload README.md

README.md CHANGED
@@ -92,7 +92,7 @@ For usage in ComfyUI, [a single transformer file is provided](https://huggingfac
 
 ## Fine-tuning
 
-The model can be easily finetuned using [SimpleTuner](https://github.com/bghira/SimpleTuner) and the `--flux_attention_masked_training` training option. SimpleTuner has extensive support for parameter-efficient fine-tuning via [LyCORIS](https://github.com/KohakuBlueleaf/LyCORIS), in addition to full-rank fine-tuning.
+The model can be easily finetuned using [SimpleTuner](https://github.com/bghira/SimpleTuner) with the `--flux_attention_masked_training` training option and the model found in [jimmycarter/LibreFlux-SimpleTuner](https://huggingface.co/jimmycarter/LibreFlux-SimpleTuner). This is the same model with the custom pipeline removed, which currently interferes with SimpleTuner's ability to finetune it. SimpleTuner has extensive support for parameter-efficient fine-tuning via [LyCORIS](https://github.com/KohakuBlueleaf/LyCORIS), in addition to full-rank fine-tuning. For inference, use the custom pipeline from this repo and [follow the example in SimpleTuner to patch in your LyCORIS weights](https://github.com/bghira/SimpleTuner/blob/main/documentation/LYCORIS.md).
 
 # Non-technical Report on Schnell De-distillation
 
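As a rough sketch of the workflow the added line describes, a LyCORIS fine-tuning invocation might look like the following. Only `--flux_attention_masked_training` and the `jimmycarter/LibreFlux-SimpleTuner` model repo come from this README; the entry point and every other flag are assumptions, and SimpleTuner is normally driven by its own config files, so consult its documentation for the actual setup.

```shell
# Hypothetical SimpleTuner run; flag names other than
# --flux_attention_masked_training are illustrative, not confirmed.
python train.py \
  --pretrained_model_name_or_path="jimmycarter/LibreFlux-SimpleTuner" \
  --flux_attention_masked_training \
  --lora_type="lycoris" \
  --output_dir="output/libreflux-lycoris"
```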