Weights are broken
#1
opened by patrickvonplaten
When loading the state dict from this file, one can see that many of the weights are broken (all-NaN).
E.g. running the following:
from safetensors.torch import load_file
sd = load_file("./pytorch_lora_weights.safetensors")
print(list(sd.values()))
shows:
...
[ 0.4089, -0.4093, 0.0417, ..., 0.3376, -0.0011, -0.3596]]), tensor([[nan, nan, nan, nan],
[nan, nan, nan, nan],
[nan, nan, nan, nan],
...,
[nan, nan, nan, nan],
[nan, nan, nan, nan],
[nan, nan, nan, nan]]), tensor([[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan]]), tensor([[nan, nan, nan, nan],
[nan, nan, nan, nan],
[nan, nan, nan, nan],
...,
[nan, nan, nan, nan],
[nan, nan, nan, nan],
[nan, nan, nan, nan]]), tensor([[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan],
[nan, nan, nan, ..., nan, nan, nan]]), tensor([[nan, nan, nan, nan],
[nan, nan, nan, nan],
[nan, nan, nan, nan],
...,
[nan, nan, nan, nan],
[nan, nan, nan, nan],
[nan, nan, nan, nan]]), tensor([[ 0.3282, -0.1082, 0.1383, ..., -0.2528, -0.2138, -0.3022],
[ 0.2439, 0.0952, 0.1135, ..., 0.2367, -0.1400, 0.0796],
[-0.1357, 0.2351, -0.4367, ..., -0.5447, -0.3476, -0.0175],
[-0.2212, 0.2084, -0.5737, ..., -0.0218, -0.1498, 0.6869]]), tensor([[nan, nan, nan, nan],
[nan, nan, nan, nan],
[nan, nan, nan, nan],
...,
[nan, nan, nan, nan],
[nan, nan, nan, nan],
[nan, nan, nan, nan]])])
E.g. the all-NaN tensors appear under keys such as sd['unet.up_blocks.1.attentions.2.transformer_blocks.1.attn2.processor.to_v_lora.up.weight'].