
Update the vllm integration #1293

Open
rlouf opened this issue Nov 27, 2024 · 5 comments
Comments

rlouf (Member) commented Nov 27, 2024

We need to update the vLLM integration to follow the new user interface.

@rlouf rlouf added enhancement vLLM Things involving vLLM support labels Nov 27, 2024
@rlouf rlouf added this to the 1.0 milestone Nov 27, 2024
@rlouf rlouf self-assigned this Dec 8, 2024
justHungryMan commented

Does this mean that updating vLLM will let it use an up-to-date version of outlines?

chris-aeviator commented Dec 14, 2024

This is not a rant, just to highlight: the vLLM <-> outlines integration is painful right now and is causing problems in production for many of the teams I talk to.

It would be awesome if outlines could at least state, very prominently in the README, which version compatibilities to expect (and keep that information updated).

rlouf (Member, Author) commented Dec 14, 2024

It would be awesome if said teams could open an issue, either here or on outlines-core, whenever they run into a problem. We can only fix the issues we are aware of.

chris-aeviator commented Dec 14, 2024

While I agree, I also know for a fact that outlines is aware of multiple integration issues with vLLM, regarding:

  • version compatibilities between outlines and vllm (unsure how to link issues across the two repos; the vllm repo is in a messy situation)
  • 2-4x slower decoding
  • the NumPy 1 requirement
[screenshot attachment]
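Editorial note: absent a published compatibility matrix, one common stopgap is to pin both packages (and NumPy) to a combination verified in your own stack. A sketch of such a pip constraints file follows; the version specifiers are illustrative placeholders, not known-good pairs:

```text
# constraints.txt -- pin an outlines/vllm pair you have verified yourself
# (version numbers below are placeholders, not tested combinations)
outlines==0.1.*   # replace with the version your deployment was validated against
vllm==0.6.*       # likewise
numpy<2           # some stacks still require NumPy 1.x, per the issue above
```

Install with `pip install -c constraints.txt outlines vllm` so upgrades cannot silently break the pairing.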

rlouf (Member, Author) commented Dec 14, 2024

All three are either being worked on with the vLLM team or already solved in the latest versions. We even created vLLM tags across our repos to track them. Let us know if you think there is anything else we can do.

Projects: none yet
Development: no branches or pull requests
3 participants