Update README.md
README.md CHANGED

@@ -22,11 +22,6 @@ same vae license on sdxl-vae-fp16-fix
 
 SDXL-VAE-FP16-Fix is the [SDXL VAE](https://huggingface.co/stabilityai/sdxl-vae)*, but modified to run in fp16 precision without generating NaNs.
 
-| VAE               | Decoding in `float32` / `bfloat16` precision | Decoding in `float16` precision |
-| ----------------- | -------------------------------------------- | ------------------------------- |
-| SDXL-VAE          | ✅ ![](./images/orig-fp32.png)                | ⚠️ ![](./images/orig-fp16.png)   |
-| SDXL-VAE-FP16-Fix | ✅ ![](./images/fix-fp32.png)                 | ✅ ![](./images/fix-fp16.png)    |
-
 ## Details
 
 SDXL-VAE generates NaNs in fp16 because the internal activation values are too big:
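The failure mode the README describes (oversized activations producing NaNs in fp16) can be sketched in a few lines. This is an illustrative toy, not the VAE's actual code: fp16 tops out at roughly 65504, so any activation beyond that overflows to infinity, and subsequent layer arithmetic on infinities yields NaN.

```python
import numpy as np

# float16 can represent magnitudes only up to ~65504;
# a larger activation value overflows to infinity.
x = np.float16(70000.0)
print(x)  # inf

# Once an inf appears, ordinary layer math (e.g. subtracting a mean
# during normalization) produces NaN, which then propagates.
print(x - x)  # nan
```

Clamping or rescaling the internal activations so they stay inside the fp16 range is the kind of modification that avoids this overflow path.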