Discussion: Recommend best model based on user's hardware #3611
nguyenhoangthuan99 started this conversation in Feature Ideas
Problem
We are currently curating AI models that are optimized specifically for local assistant use cases. The aim is to provide users with models that perform extremely well, even under hardware constraints.
Our app will detect the user's hardware (e.g., VRAM, RAM, GPU/CPU) and recommend the best model for their available resources. We are considering thresholds such as 80% of VRAM or 50% of RAM to ensure that the recommended model runs efficiently and smoothly without overwhelming the system.
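A minimal sketch of how this recommendation logic could look, assuming `psutil` and `pynvml` (nvidia-ml-py) are used for hardware detection; the model catalogue, the memory figures in it, and the function names are hypothetical placeholders, while the 80% VRAM / 50% RAM thresholds come from the idea above.

```python
# Hedged sketch: detect available memory, apply the proposed thresholds,
# and pick the largest model from a hypothetical catalogue that fits.
import psutil

try:
    import pynvml  # nvidia-ml-py; optional, used only when an NVIDIA GPU is present
except ImportError:
    pynvml = None

GIB = 1024 ** 3

# Hypothetical catalogue: model name -> approximate memory needed to run it.
MODEL_CATALOGUE = [
    {"name": "llama3.1-8b-q4", "required_bytes": 6 * GIB},
    {"name": "phi-3-mini-q4",  "required_bytes": 3 * GIB},
    {"name": "qwen2-1.5b-q4",  "required_bytes": 2 * GIB},
]


def detect_budget(vram_threshold=0.8, ram_threshold=0.5):
    """Return the memory budget (bytes) the recommender may use, and the device."""
    if pynvml is not None:
        try:
            pynvml.nvmlInit()
            handle = pynvml.nvmlDeviceGetHandleByIndex(0)
            vram_total = pynvml.nvmlDeviceGetMemoryInfo(handle).total
            pynvml.nvmlShutdown()
            return int(vram_total * vram_threshold), "gpu"
        except pynvml.NVMLError:
            pass  # no usable NVIDIA GPU; fall back to CPU/RAM
    ram_total = psutil.virtual_memory().total
    return int(ram_total * ram_threshold), "cpu"


def recommend_model():
    """Pick the largest catalogue model whose footprint fits within the budget."""
    budget, device = detect_budget()
    for model in sorted(MODEL_CATALOGUE,
                        key=lambda m: m["required_bytes"], reverse=True):
        if model["required_bytes"] <= budget:
            return model["name"], device
    return None, device  # nothing fits; caller could suggest a smaller/remote model


if __name__ == "__main__":
    name, device = recommend_model()
    print(f"Recommended model: {name} (running on {device})")
```

The thresholds are parameters rather than constants so they can be tuned per platform once real usage data is available.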
Basic Idea
Solution