perf: Upgrade vLLM version to 0.6.3.post1 #7858

Open
wants to merge 1 commit into base: main

Conversation

kthui (Contributor) commented on Dec 6, 2024

What does the PR do?

Upgrade vLLM version to 0.6.3.post1
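
A minimal sketch of the kind of change this amounts to, assuming build.py pins the vLLM release used for the backend build; the dictionary and function names below are illustrative assumptions, and only the 0.6.3.post1 version string comes from this PR:

    # Hypothetical sketch of the version bump this PR represents.
    # COMPONENT_VERSIONS and vllm_pip_requirement() are illustrative, not the
    # actual structure of build.py; only "0.6.3.post1" is taken from the PR.
    COMPONENT_VERSIONS = {
        "vllm": "0.6.3.post1",  # upgraded from the previously pinned release
    }

    def vllm_pip_requirement() -> str:
        # Hypothetical helper: the pip spec that would be installed when
        # assembling the vLLM backend image.
        return f"vllm=={COMPONENT_VERSIONS['vllm']}"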

Checklist

  • PR title reflects the change and is of the format <commit_type>: <Title>
  • Changes are described in the pull request.
  • Related issues are referenced.
  • Populated the GitHub labels field.
  • Added a test plan and verified that the tests pass.
  • Verified that the PR passes existing CI.
  • Verified copyright is correct on all changed files.
  • Added a succinct git squash message before merging ref.
  • All template sections are filled out.
  • Optional: Additional screenshots for behavior/output changes with before/after.

Commit Type:

Check the conventional commit type box here and add the label to the GitHub PR.

  • build
  • ci
  • docs
  • feat
  • fix
  • perf
  • refactor
  • revert
  • style
  • test

Related PRs:

triton-inference-server/vllm_backend#76

Where should the reviewer start?

N/A

Test plan:

N/A

  • CI Pipeline ID: See the related PR.

Caveats:

N/A

Background

N/A

Related Issues: (use one of the action keywords Closes / Fixes / Resolves / Relates to)

N/A

kthui added the PR: perf label (A code change that improves performance) on Dec 6, 2024
kthui marked this pull request as ready for review on December 6, 2024 at 18:46
Review comment on build.py (resolved)