Replies: 2 comments
-
Hello @PixifyAI, Flux is definitely on the roadmap. I'm not sure that GGUF quantizations could be used in biniou, but it's an interesting alternative. However, it seems there isn't any kind of documentation, and these quants seem to be designed to work with Comfy. I will give it a try.
-
Hi @PixifyAI, Commit f296fff introduces support for Flux models through the model Freepik/flux.1-lite-8B-alpha. However, this monster model still requires a lot of resources (at least 64 GB RAM) and it is really slow. Unfortunately, quantization of Flux models is not an option for biniou (I've run a lot of tests on a lot of them, but it was definitely a dead end). Freepik/flux.1-lite-8B-alpha is the best I can do at this time, but I'll be very happy to integrate lighter Flux models if they eventually become available.
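For anyone who wants to try this model outside biniou, here is a minimal sketch of loading it with Hugging Face diffusers' `FluxPipeline`. This is not biniou's actual integration code; the dtype, offloading choice, and inference settings are assumptions — check the model card for recommended values.

```python
# Minimal sketch (not biniou's integration code): loading
# Freepik/flux.1-lite-8B-alpha with Hugging Face diffusers.
# As noted above, expect heavy resource usage (~64 GB RAM reported).

def build_flux_pipeline(model_id: str = "Freepik/flux.1-lite-8B-alpha"):
    """Build a Flux text-to-image pipeline. Imports are lazy so the
    function can be defined without diffusers/torch installed."""
    import torch
    from diffusers import FluxPipeline

    pipe = FluxPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16)
    # Offload submodules to CPU between steps to reduce peak VRAM usage.
    pipe.enable_model_cpu_offload()
    return pipe


if __name__ == "__main__":
    pipe = build_flux_pipeline()
    image = pipe(
        "a photo of a forest at dawn",
        num_inference_steps=20,  # assumed typical value; see the model card
        guidance_scale=3.5,      # assumed typical value; see the model card
    ).images[0]
    image.save("flux_lite_out.png")
```

The lazy imports and `__main__` guard keep the sketch inspectable on machines that can't actually run the model.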
-
Hey, could you try adding one of the Flux quants and see if it will work?
https://huggingface.co/city96/FLUX.1-dev-gguf