
Enable mixtral 8x7b autotp #5257

Merged
merged 7 commits on Mar 27, 2024

Conversation

Yejing-Lai (Contributor)

This PR aims to enable mixtral 8x7b (MoE model) autotp.
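For context (not part of the PR text itself): AutoTP shards a Hugging Face model across ranks without a hand-written injection policy. Below is a minimal sketch of how Mixtral 8x7B might be run with AutoTP once this support is in place; the model id, dtype, prompt, and script name are illustrative assumptions, not something specified in this PR.

```python
# run_mixtral_autotp.py (hypothetical script name)
# Launch with the DeepSpeed launcher, e.g.:
#   deepspeed --num_gpus 2 run_mixtral_autotp.py
import os
import torch
import deepspeed
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-v0.1"          # illustrative checkpoint
world_size = int(os.getenv("WORLD_SIZE", "1"))     # set by the launcher
local_rank = int(os.getenv("LOCAL_RANK", "0"))

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# AutoTP path: no kernel injection, just a tensor-parallel size.
# DeepSpeed shards the model's linear layers across the participating ranks.
model = deepspeed.init_inference(
    model,
    tensor_parallel={"tp_size": world_size},
    dtype=torch.bfloat16,
    replace_with_kernel_inject=False,
)

inputs = tokenizer("DeepSpeed is", return_tensors="pt").to(f"cuda:{local_rank}")
outputs = model.module.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```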

@Yejing-Lai (Contributor, Author)

Hi @mrwyattii @delock. Please kindly review. Thanks!

@loadams (Contributor) commented Mar 12, 2024

> Hi @mrwyattii @delock. Please kindly review. Thanks!

@delock - do we want this merged in after your CPU autoTP PR?

@delock (Collaborator) commented Mar 13, 2024

> Hi @mrwyattii @delock. Please kindly review. Thanks!

> @delock - do we want this merged in after your CPU autoTP PR?

Hi @loadams, this can be merged before the CPU autoTP workflow PR. I'll keep working on that PR.

loadams enabled auto-merge March 13, 2024 23:00
@Yejing-Lai (Contributor, Author)

Hi @loadams. From the failure log it seems to be an environment issue. Could you run the CI again to check whether it is an env issue?

@loadams (Contributor) commented Mar 18, 2024

> Hi @loadams. From the failure log it seems to be an environment issue. Could you run the CI again to check whether it is an env issue?

Hi @Yejing-Lai - yes we have a known env issue that we are working to resolve and will merge this PR when fixed.

loadams added this pull request to the merge queue Mar 27, 2024
github-merge-queue bot removed this pull request from the merge queue due to failed status checks Mar 27, 2024
tjruwase added this pull request to the merge queue Mar 27, 2024
Merged via the queue into microsoft:master with commit 8d98e17 Mar 27, 2024
12 checks passed
rraminen pushed a commit to ROCm/DeepSpeed that referenced this pull request May 9, 2024
This PR aims to enable mixtral 8x7b (MoE model) autotp.

Co-authored-by: Logan Adams <[email protected]>
dbyoung18 pushed a commit to dbyoung18/DeepSpeed that referenced this pull request Jun 11, 2024
This PR aims to enable mixtral 8x7b (MoE model) autotp.

Co-authored-by: Logan Adams <[email protected]>