Feature Description
I have read the paper PowerInfer-2: Fast Large Language Model Inference on a Smartphone. Will the related code be open-sourced?
By the way, the core innovation of the work is how to use the heterogeneous compute system on a mobile phone to run a large model. Are there any tutorials on how to use the NPU/GPU of the Snapdragon 8 Gen 3?