
[Misc]: When encountering torch.OutOfMemoryError and the program fails to start, the following exception message should not be displayed, as it can be confusing: #11437

Closed
shiquan1988 opened this issue Dec 23, 2024 · 0 comments · Fixed by #11438

@shiquan1988 (Contributor)

Anything you want to discuss about vllm.

Exception ignored in: <function LLM.__del__ at 0x7fef2c991090>
Traceback (most recent call last):
  File "/root/vllm/vllm/entrypoints/llm.py", line 236, in __del__
    if self.llm_engine and hasattr(self.llm_engine, "shutdown"):
AttributeError: 'LLM' object has no attribute 'llm_engine'
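A minimal standalone sketch of the failure mode and a defensive fix. The class and attribute names mirror the traceback, but the `Engine` class, the `fail` flag, and the `getattr` guard are illustrative assumptions, not the actual vLLM code or the fix in #11438: when `__init__` raises (e.g. `torch.OutOfMemoryError`) before `self.llm_engine` is assigned, Python still runs `__del__` on the partially constructed object, and any direct access to `self.llm_engine` there raises the confusing AttributeError above.

```python
class Engine:
    """Stand-in for the real engine object (hypothetical)."""

    def shutdown(self):
        print("engine shut down")


class LLM:
    def __init__(self, fail=False):
        if fail:
            # Simulates torch.OutOfMemoryError raised during startup,
            # before the attribute below is ever assigned.
            raise MemoryError("CUDA out of memory")
        self.llm_engine = Engine()

    def __del__(self):
        # getattr with a default never raises AttributeError, even when
        # __init__ failed before setting self.llm_engine. A bare
        # `self.llm_engine` here reproduces the traceback in the issue.
        engine = getattr(self, "llm_engine", None)
        if engine is not None and hasattr(engine, "shutdown"):
            engine.shutdown()


try:
    llm = LLM(fail=True)
except MemoryError as exc:
    print(f"startup failed: {exc}")
# The half-constructed object is now garbage-collected silently,
# with no "Exception ignored in __del__" noise.
```

With the guard in place, the user sees only the real out-of-memory error, not a secondary exception from the destructor of an object that never finished initializing.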
