How to reassemble sharded GGUF files in Jan? (Sharded GGUF support?) #2925
-
How can sharded GGUF files be reassembled in Jan? For example, I want to run Mixtral-8x22B-Instruct-v0.1-GGUF from Hugging Face, available here: The file parts are sharded like this:
How can we run this in Jan? Is it supported, or planned to be supported?
-
Today I discovered this documentation page, but it doesn't explain how to assemble sharded parts: https://jan.ai/docs/models/manage-models
-
I found a Mistral 8x7B Instruct Q4 on Jan Hub and it works fine, so my problem is resolved, but I still think importing sharded GGUF file parts would be a great feature for Jan.
-
You can import Hugging Face models directly. Guide here: https://jan.ai/changelog/2024-04-25-llama3-command-r-hugginface#import-huggingface-models-directly
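For sharded repos like the one above, one workaround outside Jan is to merge the parts locally with llama.cpp's gguf-split tool and then import the single merged file. Below is a minimal sketch, assuming llama.cpp is built and its gguf-split binary is on PATH (newer builds name it llama-gguf-split); the shard and output file names are placeholders, not the actual repo contents:

```python
"""Minimal sketch: merge gguf-split shards into one GGUF file for import into Jan.

Assumes llama.cpp is built and its gguf-split binary is on PATH; the file
names below are placeholders for whichever shards you downloaded.
"""
import subprocess
from pathlib import Path

# Point the tool at the FIRST shard; it locates the rest of the series
# (-00002-of-00005.gguf, etc.) from the split metadata.
first_shard = Path("Mixtral-8x22B-Instruct-v0.1.Q4_K_M-00001-of-00005.gguf")
merged_out = Path("Mixtral-8x22B-Instruct-v0.1.Q4_K_M.gguf")

# gguf-split --merge <first-shard> <output> writes a single merged GGUF file.
subprocess.run(
    ["gguf-split", "--merge", str(first_shard), str(merged_out)],
    check=True,
)

print(f"Merged model written to {merged_out}; import this file into Jan.")
```

Note this only applies to shards produced by gguf-split (each part is a valid GGUF with split metadata); if a repo's parts are plain byte splits instead, simple concatenation of the parts is the usual approach.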