
llama-cpp-python

Here are 22 public repositories matching this topic...

A quick, optimized solution for managing llama-based GGUF quantized models: download GGUF files, retrieve message formatting, add more models from Hugging Face repos, and more. It's easy to use and comes prepacked with preconfigured open source models: dolphin phi-2 2.7b, mistral 7b v0.2, mixtral 8x7b v0.1, solar 10.7b and zephyr 3b

  • Updated Jan 13, 2024
  • Python
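The entry above centers on downloading GGUF quantized models and handling their chat message formatting. Below is a minimal sketch of that workflow with llama-cpp-python, not the repository's own API; the repo_id and filename are placeholders for any GGUF quantization on Hugging Face, and the download step assumes huggingface_hub is installed.

```python
# Minimal sketch: fetch a GGUF file from Hugging Face and run a chat completion.
# Assumes llama-cpp-python and huggingface_hub are installed; repo_id/filename
# are placeholder values, not files shipped with the repository above.
from llama_cpp import Llama

# Download (and cache) a GGUF quantization from a Hugging Face repo, then load it.
llm = Llama.from_pretrained(
    repo_id="TheBloke/Mistral-7B-Instruct-v0.2-GGUF",  # placeholder repo
    filename="mistral-7b-instruct-v0.2.Q4_K_M.gguf",   # placeholder quant file
    n_ctx=2048,                                        # context window
)

# create_chat_completion applies the model's chat template, so message
# formatting is handled by the library rather than by hand.
response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is a GGUF quantized model?"},
    ],
    max_tokens=128,
)
print(response["choices"][0]["message"]["content"])
```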

Experimental interface environment for open-source LLMs, designed to democratize the use of AI. Powered by llama-cpp, llama-cpp-python and Gradio.

  • Updated Dec 7, 2024
  • Python
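The description above pairs llama-cpp-python with Gradio to put a web interface in front of a local model. The sketch below illustrates that combination under stated assumptions: a local GGUF file at a placeholder path and Gradio's classic chat history format of (user, assistant) pairs. It is not the repository's actual code.

```python
# Minimal sketch of a Gradio chat UI backed by llama-cpp-python.
# Assumes gradio and llama-cpp-python are installed and a GGUF file exists
# at the placeholder path below.
import gradio as gr
from llama_cpp import Llama

llm = Llama(model_path="./models/model.Q4_K_M.gguf", n_ctx=2048)  # placeholder path

def respond(message, history):
    # Gradio's classic chat history arrives as (user, assistant) pairs;
    # rebuild it in the OpenAI-style message format that
    # create_chat_completion expects.
    messages = []
    for user_msg, bot_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": bot_msg})
    messages.append({"role": "user", "content": message})
    out = llm.create_chat_completion(messages=messages, max_tokens=256)
    return out["choices"][0]["message"]["content"]

# ChatInterface wires the function above into a simple web chat UI.
gr.ChatInterface(respond).launch()
```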

Llama.cpp is a C++ library for running large language models, such as Meta's LLaMA, efficiently. Optimized for a wide range of platforms, including resource-constrained devices, it delivers the performance, inference speed, and memory efficiency essential for executing large models.

  • Updated Jun 2, 2024
  • Python
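Because the description above stresses inference speed and memory efficiency on limited hardware, here is a minimal sketch of the corresponding settings exposed through the llama-cpp-python bindings; the model path and parameter values are illustrative assumptions, not tuned recommendations.

```python
# Minimal sketch of resource-oriented knobs in llama-cpp-python.
# The path and values below are placeholders chosen for illustration.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/llama-3-8b.Q4_K_M.gguf",  # 4-bit quantization keeps memory low
    n_ctx=1024,       # smaller context window -> smaller KV cache
    n_threads=4,      # CPU threads used for inference
    n_gpu_layers=0,   # 0 = pure CPU; raise if a GPU-enabled llama.cpp build is available
)

output = llm(
    "Explain in one sentence why quantization reduces memory use.",
    max_tokens=64,
)
print(output["choices"][0]["text"])
```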
