Running vLLM Fails with "Python.h: No such file or directory"

#13
by taozhang9527 - opened

Followed the instructions to run vLLM with the AI21-Jamba-1.5-Mini model. The model loaded into the GPUs successfully, but execution failed with the following error message:

Loading safetensors checkpoint shards: 100% Completed | 21/21 [00:06<00:00,  3.40it/s]
Loading safetensors checkpoint shards: 100% Completed | 21/21 [00:06<00:00,  3.39it/s]

INFO 10-01 22:59:01 model_runner.py:1025] Loading model weights took 24.0217 GB
(VllmWorkerProcess pid=902540) INFO 10-01 22:59:03 model_runner.py:1025] Loading model weights took 24.0217 GB
(VllmWorkerProcess pid=902542) INFO 10-01 22:59:03 model_runner.py:1025] Loading model weights took 24.0217 GB
(VllmWorkerProcess pid=902541) INFO 10-01 22:59:03 model_runner.py:1025] Loading model weights took 24.0217 GB
/tmp/tmpdi4nael4/main.c:5:10: fatal error: Python.h: No such file or directory
    5 | #include <Python.h>
      |          ^~~~~~~~~~
/tmp/tmp25ds2kfb/main.c:5:10: fatal error: Python.h: No such file or directory
    5 | #include <Python.h>
      |          ^~~~~~~~~~
compilation terminated.
compilation terminated.
/tmp/tmp2i0nxbjd/main.c:5:10: fatal error: Python.h: No such file or directory
    5 | #include <Python.h>
      |          ^~~~~~~~~~
compilation terminated.
/tmp/tmpfnvaw3b3/main.c:5:10: fatal error: Python.h: No such file or directory
    5 | #include <Python.h>
      |          ^~~~~~~~~~
compilation terminated.
(VllmWorkerProcess pid=902540) INFO 10-01 22:59:06 model_runner_base.py:120] Writing input of failed execution to /tmp/err_execute_model_input_20241001-225906.pkl...
INFO 10-01 22:59:06 model_runner_base.py:120] Writing input of failed execution to /tmp/err_execute_model_input_20241001-225906.pkl...
(VllmWorkerProcess pid=902541) INFO 10-01 22:59:06 model_runner_base.py:120] Writing input of failed execution to /tmp/err_execute_model_input_20241001-225906.pkl...

Hardware: R760 with 4xA100
OS: Ubuntu 22.04
Python Version: 3.11.10
vLLM version: 0.6.2
transformers version: 4.45.1

You need to install the CPython development headers for your interpreter. On RHEL/Fedora the package is "python3.11-devel"; on Debian/Ubuntu the apt equivalent is "python3.11-dev" (match the minor version of the Python running vLLM).
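A quick, distro-independent way to confirm whether the headers are actually installed is sketched below using the standard-library `sysconfig` module (nothing vLLM-specific is assumed here):

```python
import os
import sysconfig

# sysconfig reports the directory where the running CPython interpreter
# expects its C headers (Python.h) to live.
include_dir = sysconfig.get_paths()["include"]
header_path = os.path.join(include_dir, "Python.h")

print("include dir:", include_dir)
print("Python.h present:", os.path.exists(header_path))
```

If this prints `Python.h present: False`, install the dev package for that interpreter version and retry; the runtime-compiled kernels (the `/tmp/.../main.c` builds in the log above) fail without these headers.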

Thank you, this fixed the issue. It also fixed issue #14. We can close this issue now.

benicio-standard changed discussion status to closed
