Enable mixtral 8x7b autotp #5257
Conversation
Hi @mrwyattii @delock. Please review. Thanks!
@delock - do we want this merged in after your CPU AutoTP PR?
Hi @loadams, this can be merged before the CPU AutoTP workflow PR. I'll keep working on that PR.
Hi @loadams. From the failure log, this looks like an environment issue. Could you rerun the CI to check?
Hi @Yejing-Lai - yes, we have a known environment issue that we are working to resolve; we will merge this PR once it is fixed.
This PR aims to enable Mixtral 8x7B (MoE model) AutoTP.

Co-authored-by: Logan Adams <[email protected]>
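For context, a minimal sketch of how a user would run Mixtral-8x7B through DeepSpeed's automatic tensor parallelism path once this support lands. This is not code from the PR itself; the checkpoint name, dtype, and generation settings are illustrative assumptions, and the AutoTP path is taken by calling `deepspeed.init_inference` with kernel injection disabled so DeepSpeed shards the model's linear (attention and MoE expert) layers across ranks.

```python
# Illustrative sketch, not part of this PR.
# Launch with: deepspeed --num_gpus <N> run_mixtral_autotp.py
import os

import torch
import deepspeed
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mistralai/Mixtral-8x7B-v0.1"  # assumed checkpoint for illustration
local_rank = int(os.getenv("LOCAL_RANK", "0"))
world_size = int(os.getenv("WORLD_SIZE", "1"))

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16)

# AutoTP path: kernel injection disabled, tensor-parallel size set to the
# number of ranks so DeepSpeed auto-shards the model across GPUs.
model = deepspeed.init_inference(
    model,
    tensor_parallel={"tp_size": world_size},
    dtype=torch.bfloat16,
    replace_with_kernel_inject=False,
)

device = torch.device(f"cuda:{local_rank}")
inputs = tokenizer("DeepSpeed is", return_tensors="pt").to(device)
outputs = model.module.generate(inputs.input_ids, max_new_tokens=32)

if local_rank == 0:
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```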