[Installation]: pip install vllm-0.6.2.zip err:setuptools-scm was unable to detect version for /tmp/pip-req-build-7ptioibj #9182

Closed
uRENu opened this issue Oct 9, 2024 · 7 comments · Fixed by #11435
Labels: installation (Installation problems)

Comments


uRENu commented Oct 9, 2024

Your current environment

pip install auto_gptq modelscope xformers==0.0.27.post2 torchvision==0.19 torchaudio torch==2.4.0 torchtext numpy wheel
pip install setuptools>=74.1.1 setuptools_scm==8.1.0 pyportfolioopt
pip install "pillow==10.*" -U
pip install sentencepiece charset_normalizer cpm_kernels tiktoken -U
pip install matplotlib scikit-learn tqdm tensorboard -U
pip install datasets huggingface-hub transformers==4.45.0 -U
pip install accelerate transformers_stream_generator -U
pip install pydantic==1.7.4 typer==0.3.0
pip install fastrlock cupy-cuda11x==12.1.0 
pip install vllm-flash-attn==2.6.1 datamodel_code_generator

How you are installing vllm

pip install vllm-0.6.2.zip

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
uRENu added the installation (Installation problems) label on Oct 9, 2024
uRENu (Author) commented Oct 9, 2024

Error output:

Processing vllm-0.6.2.zip
Installing build dependencies ... done
Getting requirements to build wheel ... error
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [26 lines of output]
/tmp/pip-build-env-nr7os5qd/overlay/lib/python3.10/site-packages/torch/_subclasses/functional_tensor.py:258: UserWarning: Failed to initialize NumPy: No module named 'numpy' (Triggered internally at ../torch/csrc/utils/tensor_numpy.cpp:84.)
cpu = _conversion_method_template(device=torch.device("cpu"))
Traceback (most recent call last):
File "/home/wangrenzhong/anaconda3/envs/py310/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in
main()
File "/home/wangrenzhong/anaconda3/envs/py310/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/home/wangrenzhong/anaconda3/envs/py310/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
File "/tmp/pip-build-env-nr7os5qd/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 332, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=[])
File "/tmp/pip-build-env-nr7os5qd/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 302, in _get_build_requires
self.run_setup()
File "/tmp/pip-build-env-nr7os5qd/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 318, in run_setup
exec(code, locals())
File "", line 485, in
File "", line 357, in get_vllm_version
File "/tmp/pip-build-env-nr7os5qd/overlay/lib/python3.10/site-packages/setuptools_scm/_get_version_impl.py", line 158, in get_version
_version_missing(config)
File "/tmp/pip-build-env-nr7os5qd/overlay/lib/python3.10/site-packages/setuptools_scm/_get_version_impl.py", line 112, in _version_missing
raise LookupError(
LookupError: setuptools-scm was unable to detect version for /tmp/pip-req-build-caovnllw.

  Make sure you're either building from a fully intact git repository or PyPI tarballs. Most other sources (such as GitHub's tarballs, a git checkout without the .git folder) don't contain the necessary metadata and will not work.
  
  For example, if you're using pip, instead of https://github.com/user/proj/archive/master.zip use git+https://github.com/user/proj.git#egg=proj
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

DarkLight1337 (Member) commented:

cc @dtrifiro

dtrifiro (Contributor) commented Oct 9, 2024

Are you trying to build from source? If so, the error is telling you what's wrong:

  Make sure you're either building from a fully intact git repository or PyPI tarballs. Most other sources (such as GitHub's tarballs, a git checkout without the .git folder) don't contain the necessary metadata and will not work.
  
  For example, if you're using pip, instead of https://github.com/user/proj/archive/master.zip use git+https://github.com/user/proj.git#egg=proj
  [end of output]

This can be solved by installing from a git+https ref:

pip install git+https://github.com/vllm-project/[email protected]

If you're not trying to build from source, I recommend installing the prebuilt wheel instead.
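
For reference, installing the prebuilt wheel for the release in question (0.6.2, assumed from the report above) would look something like:

pip install vllm==0.6.2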

simon-mo (Collaborator) commented Oct 9, 2024

Hmm can we still make this workflow work? I would imagine there are folks trying to vendor vLLM without git.

dtrifiro (Contributor) commented Oct 10, 2024

@simon-mo

The right way to provide a tarball is to build a source distribution:

python setup.py sdist
# or, if using modern tools:
python -m build --sdist

This will create a tar archive that can be built and installed. It will look something like this: vllm-0.6.3.dev146+g9b064faa6.d20241010.tar.gz
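
The generated sdist can then be installed directly with pip; for example (the exact filename under dist/ will depend on the checkout it was built from):

pip install dist/vllm-0.6.3.dev146+g9b064faa6.d20241010.tar.gz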

Another approach would be hacking setup.py to ignore the LookupError:

    try:
        version = get_version(
            write_to="vllm/_version.py",  # TODO: move this to pyproject.toml
        )
    except LookupError:
        version = "0.0.0"

Although this is not ideal:

  • this is hacky
  • we lose any kind of versioning information
  • this would only work temporarily: I'm not sure if this can be done when moving this definition to pyproject.toml
  • I'm sure there are plenty of other ways this could backfire
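
As a side note, setuptools-scm also honors the SETUPTOOLS_SCM_PRETEND_VERSION environment variable, which would let someone build from a zip without git metadata and without patching setup.py. A rough sketch only, not verified against vLLM's build:

SETUPTOOLS_SCM_PRETEND_VERSION=0.6.2 pip install vllm-0.6.2.zip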

simon-mo (Collaborator) commented:

We do provide the tarball on PyPI.


paulcx commented Oct 16, 2024

@dtrifiro This issue exists in the 0.6.3 Docker image as well.

16T06:22:59.319607711Z /usr/local/lib/python3.12/dist-packages/vllm/connections.py:8: RuntimeWarning: Failed to read commit hash:
16T06:22:59.319678095Z No module named 'vllm._version'
16T06:22:59.319688347Z   from vllm.version import __version__ as VLLM_VERSION
