duplicate space doesn't work

#1
by xi0v - opened
Build failed with exit code: 1
Build logs:

===== Build Queued at 2024-10-18 11:08:29 / Commit SHA: a64fccd =====

--> FROM docker.io/library/python:3.10@sha256:88687413ef82a3d5f47c3d020d24ac82285459e9870757aae7c071d49fe6de1b
DONE 0.0s

--> RUN apt-get update && apt-get install -y fakeroot &&     mv /usr/bin/apt-get /usr/bin/.apt-get &&     echo '#!/usr/bin/env sh\nfakeroot /usr/bin/.apt-get $@' > /usr/bin/apt-get &&     chmod +x /usr/bin/apt-get && 	rm -rf /var/lib/apt/lists/* && 	useradd -m -u 1000 user
CACHED

--> COPY --chown=1000:1000 --from=root / /
CACHED

--> RUN apt-get update && apt-get install -y 	git 	git-lfs 	ffmpeg 	libsm6 	libxext6 	cmake 	rsync 	libgl1-mesa-glx 	&& rm -rf /var/lib/apt/lists/* 	&& git lfs install
CACHED

--> WORKDIR /home/user/app
CACHED

--> RUN pip install --no-cache-dir pip==22.3.1 && 	pip install --no-cache-dir 	datasets 	"huggingface-hub>=0.19" "hf-transfer>=0.1.4" "protobuf<4" "click<8.1" "pydantic~=1.0"
CACHED

--> Restoring cache
DONE 9.1s

--> RUN --mount=target=/tmp/pre-requirements.txt,source=pre-requirements.txt 	pip install --no-cache-dir -r /tmp/pre-requirements.txt
Collecting pip>=23.0.0
  Downloading pip-24.2-py3-none-any.whl (1.8 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.8/1.8 MB 34.3 MB/s eta 0:00:00
Installing collected packages: pip
  Attempting uninstall: pip
    Found existing installation: pip 22.3.1
    Uninstalling pip-22.3.1:
      Successfully uninstalled pip-22.3.1
Successfully installed pip-24.2
DONE 1.9s

--> RUN --mount=target=/tmp/requirements.txt,source=requirements.txt     pip install --no-cache-dir -r /tmp/requirements.txt
Collecting git+https://github.com/huggingface/huggingface_hub (from -r /tmp/requirements.txt (line 5))
  Cloning https://github.com/huggingface/huggingface_hub to /tmp/pip-req-build-kzg68lei
  Running command git clone --filter=blob:none --quiet https://github.com/huggingface/huggingface_hub /tmp/pip-req-build-kzg68lei
  Resolved https://github.com/huggingface/huggingface_hub to commit 07896ee75b37da0d1744c9d03472485b985b3213
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Collecting git+https://github.com/huggingface/transformers (from -r /tmp/requirements.txt (line 6))
  Cloning https://github.com/huggingface/transformers to /tmp/pip-req-build-ocsxwwlt
  Running command git clone --filter=blob:none --quiet https://github.com/huggingface/transformers /tmp/pip-req-build-ocsxwwlt
  Resolved https://github.com/huggingface/transformers to commit 5a5b590d060ea59433b2f666453f3314d86f98b1
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Collecting git+https://github.com/huggingface/accelerate (from -r /tmp/requirements.txt (line 7))
  Cloning https://github.com/huggingface/accelerate to /tmp/pip-req-build-4uh5zq2q
  Running command git clone --filter=blob:none --quiet https://github.com/huggingface/accelerate /tmp/pip-req-build-4uh5zq2q
  Resolved https://github.com/huggingface/accelerate to commit a84327e59652b79b1f6e3be58be634fbd35184f3
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Collecting git+https://github.com/huggingface/diffusers (from -r /tmp/requirements.txt (line 8))
  Cloning https://github.com/huggingface/diffusers to /tmp/pip-req-build-qdo_4rqp
  Running command git clone --filter=blob:none --quiet https://github.com/huggingface/diffusers /tmp/pip-req-build-qdo_4rqp
  Resolved https://github.com/huggingface/diffusers to commit 5704376d0309031a124fcb8a957fc70282ce13eb
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Collecting git+https://github.com/huggingface/peft (from -r /tmp/requirements.txt (line 9))
  Cloning https://github.com/huggingface/peft to /tmp/pip-req-build-nnkg0kzx
  Running command git clone --filter=blob:none --quiet https://github.com/huggingface/peft /tmp/pip-req-build-nnkg0kzx
  Resolved https://github.com/huggingface/peft to commit 57a452ac1122da23fadd95aa9e6ef04136346f02
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Collecting git+https://github.com/Lightning-AI/pytorch-lightning (from -r /tmp/requirements.txt (line 13))
  Cloning https://github.com/Lightning-AI/pytorch-lightning to /tmp/pip-req-build-tjkvc1dl
  Running command git clone --filter=blob:none --quiet https://github.com/Lightning-AI/pytorch-lightning /tmp/pip-req-build-tjkvc1dl
  Resolved https://github.com/Lightning-AI/pytorch-lightning to commit 8ad3e29816a63d8ce5c00ac104b14729a4176f4f
  Running command git submodule update --init --recursive -q
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing metadata (pyproject.toml): started
  Preparing metadata (pyproject.toml): finished with status 'done'
Collecting torch (from -r /tmp/requirements.txt (line 1))
  Downloading torch-2.5.0-cp310-cp310-manylinux1_x86_64.whl.metadata (28 kB)
Collecting torchaudio (from -r /tmp/requirements.txt (line 2))
  Downloading torchaudio-2.5.0-cp310-cp310-manylinux1_x86_64.whl.metadata (6.4 kB)
Collecting torchvision (from -r /tmp/requirements.txt (line 3))
  Downloading torchvision-0.20.0-cp310-cp310-manylinux1_x86_64.whl.metadata (6.1 kB)
Collecting safetensors (from -r /tmp/requirements.txt (line 4))
  Downloading safetensors-0.4.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.8 kB)
Collecting xformers (from -r /tmp/requirements.txt (line 10))
  Downloading xformers-0.0.28.post1-cp310-cp310-manylinux_2_28_x86_64.whl.metadata (1.0 kB)
Collecting optimum-quanto (from -r /tmp/requirements.txt (line 11))
  Downloading optimum_quanto-0.2.5-py3-none-any.whl.metadata (13 kB)
Collecting sentencepiece (from -r /tmp/requirements.txt (line 12))
  Downloading sentencepiece-0.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (7.7 kB)
Collecting aria2 (from -r /tmp/requirements.txt (line 14))
  Downloading aria2-0.0.1b0-py3-none-manylinux_2_17_x86_64.whl.metadata (28 kB)
Collecting gdown (from -r /tmp/requirements.txt (line 15))
  Downloading gdown-5.2.0-py3-none-any.whl.metadata (5.8 kB)
Collecting gguf>=0.9.1 (from -r /tmp/requirements.txt (line 16))
  Downloading gguf-0.10.0-py3-none-any.whl.metadata (3.5 kB)
Collecting bitsandbytes (from -r /tmp/requirements.txt (line 17))
  Downloading bitsandbytes-0.44.1-py3-none-manylinux_2_24_x86_64.whl.metadata (3.5 kB)
Requirement already satisfied: numpy in /usr/local/lib/python3.10/site-packages (from -r /tmp/requirements.txt (line 18)) (2.1.2)
Collecting psutil (from -r /tmp/requirements.txt (line 19))
  Downloading psutil-6.1.0-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (22 kB)
Requirement already satisfied: filelock in /usr/local/lib/python3.10/site-packages (from torch->-r /tmp/requirements.txt (line 1)) (3.16.1)
Requirement already satisfied: typing-extensions>=4.8.0 in /usr/local/lib/python3.10/site-packages (from torch->-r /tmp/requirements.txt (line 1)) (4.12.2)
Collecting networkx (from torch->-r /tmp/requirements.txt (line 1))
  Downloading networkx-3.4.1-py3-none-any.whl.metadata (6.3 kB)
Collecting jinja2 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading jinja2-3.1.4-py3-none-any.whl.metadata (2.6 kB)
Requirement already satisfied: fsspec in /usr/local/lib/python3.10/site-packages (from torch->-r /tmp/requirements.txt (line 1)) (2024.6.1)
Collecting nvidia-cuda-nvrtc-cu12==12.4.127 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading nvidia_cuda_nvrtc_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl.metadata (1.5 kB)
Collecting nvidia-cuda-runtime-cu12==12.4.127 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading nvidia_cuda_runtime_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl.metadata (1.5 kB)
Collecting nvidia-cuda-cupti-cu12==12.4.127 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading nvidia_cuda_cupti_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl.metadata (1.6 kB)
Collecting nvidia-cudnn-cu12==9.1.0.70 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading nvidia_cudnn_cu12-9.1.0.70-py3-none-manylinux2014_x86_64.whl.metadata (1.6 kB)
Collecting nvidia-cublas-cu12==12.4.5.8 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading nvidia_cublas_cu12-12.4.5.8-py3-none-manylinux2014_x86_64.whl.metadata (1.5 kB)
Collecting nvidia-cufft-cu12==11.2.1.3 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading nvidia_cufft_cu12-11.2.1.3-py3-none-manylinux2014_x86_64.whl.metadata (1.5 kB)
Collecting nvidia-curand-cu12==10.3.5.147 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading nvidia_curand_cu12-10.3.5.147-py3-none-manylinux2014_x86_64.whl.metadata (1.5 kB)
Collecting nvidia-cusolver-cu12==11.6.1.9 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading nvidia_cusolver_cu12-11.6.1.9-py3-none-manylinux2014_x86_64.whl.metadata (1.6 kB)
Collecting nvidia-cusparse-cu12==12.3.1.170 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading nvidia_cusparse_cu12-12.3.1.170-py3-none-manylinux2014_x86_64.whl.metadata (1.6 kB)
Collecting nvidia-nccl-cu12==2.21.5 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading nvidia_nccl_cu12-2.21.5-py3-none-manylinux2014_x86_64.whl.metadata (1.8 kB)
Collecting nvidia-nvtx-cu12==12.4.127 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading nvidia_nvtx_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl.metadata (1.7 kB)
Collecting nvidia-nvjitlink-cu12==12.4.127 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading nvidia_nvjitlink_cu12-12.4.127-py3-none-manylinux2014_x86_64.whl.metadata (1.5 kB)
Collecting triton==3.1.0 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading triton-3.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (1.3 kB)
Collecting sympy==1.13.1 (from torch->-r /tmp/requirements.txt (line 1))
  Downloading sympy-1.13.1-py3-none-any.whl.metadata (12 kB)
Collecting mpmath<1.4,>=1.1.0 (from sympy==1.13.1->torch->-r /tmp/requirements.txt (line 1))
  Downloading mpmath-1.3.0-py3-none-any.whl.metadata (8.6 kB)
Collecting pillow!=8.3.*,>=5.3.0 (from torchvision->-r /tmp/requirements.txt (line 3))
  Downloading pillow-11.0.0-cp310-cp310-manylinux_2_28_x86_64.whl.metadata (9.1 kB)
Requirement already satisfied: packaging>=20.9 in /usr/local/lib/python3.10/site-packages (from huggingface_hub==0.26.0.dev0->-r /tmp/requirements.txt (line 5)) (24.1)
Requirement already satisfied: pyyaml>=5.1 in /usr/local/lib/python3.10/site-packages (from huggingface_hub==0.26.0.dev0->-r /tmp/requirements.txt (line 5)) (6.0.2)
Requirement already satisfied: requests in /usr/local/lib/python3.10/site-packages (from huggingface_hub==0.26.0.dev0->-r /tmp/requirements.txt (line 5)) (2.32.3)
Requirement already satisfied: tqdm>=4.42.1 in /usr/local/lib/python3.10/site-packages (from huggingface_hub==0.26.0.dev0->-r /tmp/requirements.txt (line 5)) (4.66.5)
Collecting regex!=2019.12.17 (from transformers==4.46.0.dev0->-r /tmp/requirements.txt (line 6))
  Downloading regex-2024.9.11-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (40 kB)
Collecting tokenizers<0.21,>=0.20 (from transformers==4.46.0.dev0->-r /tmp/requirements.txt (line 6))
  Downloading tokenizers-0.20.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (6.7 kB)
Collecting importlib-metadata (from diffusers==0.31.0.dev0->-r /tmp/requirements.txt (line 8))
  Downloading importlib_metadata-8.5.0-py3-none-any.whl.metadata (4.8 kB)
INFO: pip is looking at multiple versions of xformers to determine which version is compatible with other requirements. This could take a while.
Collecting xformers (from -r /tmp/requirements.txt (line 10))
  Downloading xformers-0.0.28-cp310-cp310-manylinux_2_28_x86_64.whl.metadata (1.0 kB)
  Downloading xformers-0.0.27.post2-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.0 kB)
  Downloading xformers-0.0.27.post1-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.0 kB)
  Downloading xformers-0.0.27-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.0 kB)
  Downloading xformers-0.0.26.post1-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.0 kB)
  Downloading xformers-0.0.25.post1-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.0 kB)
  Downloading xformers-0.0.25-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.0 kB)
INFO: pip is still looking at multiple versions of xformers to determine which version is compatible with other requirements. This could take a while.
  Downloading xformers-0.0.24-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.0 kB)
  Downloading xformers-0.0.23.post1-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.0 kB)
  Downloading xformers-0.0.23-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.0 kB)
  Downloading xformers-0.0.22.post7-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.0 kB)
  Downloading xformers-0.0.22-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.0 kB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. See https://pip.pypa.io/warnings/backtracking for guidance. If you want to abort this run, press Ctrl + C.
  Downloading xformers-0.0.21-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.0 kB)
  Downloading xformers-0.0.20-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.1 kB)
Collecting pyre-extensions==0.0.29 (from xformers->-r /tmp/requirements.txt (line 10))
  Downloading pyre_extensions-0.0.29-py3-none-any.whl.metadata (4.0 kB)
Collecting xformers (from -r /tmp/requirements.txt (line 10))
  Downloading xformers-0.0.19-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.1 kB)
  Downloading xformers-0.0.18-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.1 kB)
Collecting pyre-extensions==0.0.23 (from xformers->-r /tmp/requirements.txt (line 10))
  Downloading pyre_extensions-0.0.23-py3-none-any.whl.metadata (4.0 kB)
Collecting xformers (from -r /tmp/requirements.txt (line 10))
  Downloading xformers-0.0.17-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.1 kB)
  Downloading xformers-0.0.16-cp310-cp310-manylinux2014_x86_64.whl.metadata (1.1 kB)
  Downloading xformers-0.0.13.tar.gz (292 kB)
  Preparing metadata (setup.py): started
  Preparing metadata (setup.py): finished with status 'error'
  error: subprocess-exited-with-error
  
  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [6 lines of output]
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/tmp/pip-install-418o4h8x/xformers_32abe032fe7a4a21b129683d1e58b564/setup.py", line 18, in <module>
          import torch
      ModuleNotFoundError: No module named 'torch'
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.

--> ERROR: process "/bin/sh -c pip install --no-cache-dir -r /tmp/requirements.txt" did not complete successfully: exit code: 1

It also seems like fp8 does not work at all.
Is the lack of fp8 support in torch on HF's servers still a problem?

And there is something I don't understand: what does

parser.add_argument("--fix", action="store_true", help="Only fix the keys of the local model.")

actually do?
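For reference, `action="store_true"` on its own just turns `--fix` into a boolean switch; what the flag then does is entirely up to the script. A minimal sketch:

```python
import argparse

# action="store_true" makes --fix a boolean switch:
# absent -> args.fix is False, present -> args.fix is True.
# The script decides what True actually triggers.
parser = argparse.ArgumentParser()
parser.add_argument("--fix", action="store_true", help="Only fix the keys of the local model.")

print(parser.parse_args([]).fix)         # False
print(parser.parse_args(["--fix"]).fix)  # True
```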

Looks like the dev version of Diffusers is acting up. I don't rely on it that much internally in this space, so I'll pin the version. But this space is a "given up" version, as the title says, so I don't recommend it for practical use. I don't have time to refine it...

It also seems like fp8 does not work at all.
Is the lack of fp8 support in torch on HF's servers still a problem?

It works as long as you don't use Serverless inference; Serverless support isn't available yet. The torch version itself actually supports fp8, but Diffusers or Transformers is failing somewhere, probably in one of the base classes of the loading system.

--fix

This option corrects checkpoint key names that Diffusers can't convert: when ComfyUI converts a model, the result differs from the official BFL format. I don't really use it myself. The current version of Diffusers should support that format officially by now, so I'm just leaving the option in for now.
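As a rough illustration of what such a key-fixing pass might do (the prefix below is made up for illustration; the real ComfyUI/BFL/Diffusers key names would have to come from the actual checkpoints):

```python
# Hypothetical sketch of a key-fixing pass over a model state dict.
# The prefix mapping is illustrative only, not the real key names.
PREFIX_MAP = {
    "model.diffusion_model.": "",  # e.g. strip a wrapper prefix added on conversion
}

def fix_keys(state_dict: dict) -> dict:
    """Return a new state dict with known bad key prefixes rewritten."""
    fixed = {}
    for key, value in state_dict.items():
        for old, new in PREFIX_MAP.items():
            if key.startswith(old):
                key = new + key[len(old):]
                break
        fixed[key] = value
    return fixed

sd = {"model.diffusion_model.img_in.weight": 0, "vae.decoder.weight": 1}
print(fix_keys(sd))  # {'img_in.weight': 0, 'vae.decoder.weight': 1}
```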

Got it.
I have an fp8 flux dev model that I merged some LoRAs into, but I can't even test it out. Do you know of any space that supports single-file flux inference?
Can you make Flux LoRA the Explorer support it, or is that practically impossible? @John6666

Do you know of any space that supports single-file flux inference?

I don't know of one, but I can implement it, since it's almost possible with Diffusers itself. I'm just not sure whether fp8 single files are officially supported.
If they are, I only need to add the code to convert the transformer part, though I wonder if they'll ever support it officially...
It's bedtime here, so it'll be tomorrow, but the work itself shouldn't be too difficult.

That said, I'm just a hobbyist, and I have too many tasks to fit into my spare time.
I'm feeling a bit rattled by HF's flux-related issues in general, so maybe I should go ahead and fix the flux-related stuff first?

Edit:
I don't use it, but it looks like it was crashing due to xformers dependencies. xformers is always too strict about specifying dependencies, isn't it? I've seen this before.
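A common workaround (a sketch, untested here; the exact versions are assumptions) is to install a pinned torch first and pin xformers to a wheel built against that torch, so the resolver never backtracks into old source releases whose setup.py imports torch:

```
# requirements.txt sketch -- versions are assumptions; match xformers to your torch build
torch==2.5.0
xformers==0.0.28.post1
```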

I don't know of one, but I can implement it, since it's almost possible with Diffusers itself. I'm just not sure whether fp8 single files are officially supported.

I'm pretty sure it's okay if the file is a valid .safetensors file.

If they are, I only need to add the code to convert the transformer part, though I wonder if they'll ever support it officially...

I believe they do already support single-file inference with flux:
https://github.com/huggingface/diffusers/blob/5704376d0309031a124fcb8a957fc70282ce13eb/docs/source/en/api/loaders/single_file.md?plain=1#L54
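Based on that doc, a single-file load would look roughly like this (untested sketch; the checkpoint path is a placeholder, bfloat16 is one reasonable dtype choice, and an fp8 checkpoint may need the transformer loaded and converted separately):

```python
# Untested sketch of Diffusers single-file Flux loading.
# "flux1-dev-merged.safetensors" is a placeholder path.
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_single_file(
    "flux1-dev-merged.safetensors",
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()  # keeps VRAM usage manageable
image = pipe("a cat holding a sign", num_inference_steps=28).images[0]
```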

Take your time with the implementation!

I don't use it, but it looks like it was crashing due to xformers dependencies. xformers is always too strict about specifying dependencies, isn't it? I've seen this before.

Good ol' xformers: try not to self-destruct when a package gets updated (impossible).

If they are, I only need to add the code to convert the transformer part, though I wonder if they'll ever support it officially...

Might have found something interesting
https://github.com/huggingface/diffusers/issues/9667#issuecomment-2411112543

https://github.com/huggingface/diffusers/issues/9667#issuecomment-2411112543

Great, so Diffusers has practically official support for it.
The only hassle in the implementation is the branching. I'll think of something while I sleep.

I've got it working for now.
https://huggingface.co/spaces/John6666/flux-lora-the-explorer

Great!
I'll be testing it out.
