# SeKernel_for_LLM_UI

This repository contains the user interface (UI) for the SeKernel_for_LLM module.

## Requirements

- pyqt5
- llama-cpp-python
- markdown
- pyqtspinner
- pyttsx3
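After installing the requirements (for example with `pip install pyqt5 llama-cpp-python markdown pyqtspinner pyttsx3`), you can sanity-check that they are importable. Note that for some packages the PyPI name differs from the import name (`pyqt5` imports as `PyQt5`, `llama-cpp-python` as `llama_cpp`); the helper below is just an illustrative sketch, not part of the repository.

```python
import importlib.util

# Import names corresponding to the PyPI packages listed above.
REQUIRED = ["PyQt5", "llama_cpp", "markdown", "pyqtspinner", "pyttsx3"]

def missing_packages(names=REQUIRED):
    """Return the subset of import names that cannot be found."""
    return [n for n in names if importlib.util.find_spec(n) is None]

if __name__ == "__main__":
    missing = missing_packages()
    if missing:
        print("Missing packages:", ", ".join(missing))
    else:
        print("All requirements found.")
```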

## How to

- Clone the repo: `git clone https://github.com/perpendicularai/SeKernel_for_LLM_UI.git`
- Ensure that you have llama-cpp-python installed and working.
- Add your model to the kernel.py script.
- Launch the UI by running `python sekernel_ui.py`.

Please note: only internet-connected chat is supported. If you have the skills, you can check out the plugins.py module to add more functionality to your UI.
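For the "add your model" step, one way kernel.py might wire in a local GGUF model with llama-cpp-python is sketched below. The `MODEL_PATH`, the `llama_kwargs` helper, and the parameter values are assumptions for illustration, not the repository's actual code; `model_path`, `n_ctx`, and `n_gpu_layers` are standard `llama_cpp.Llama` constructor arguments.

```python
# Hypothetical sketch of model configuration for kernel.py.
MODEL_PATH = "models/your-model.gguf"  # placeholder: point this at your model file

def llama_kwargs(model_path=MODEL_PATH, n_ctx=4096, n_gpu_layers=0):
    """Keyword arguments for constructing llama_cpp.Llama."""
    return {
        "model_path": model_path,      # path to the local GGUF model
        "n_ctx": n_ctx,                # context window size in tokens
        "n_gpu_layers": n_gpu_layers,  # 0 = CPU only; raise to offload to GPU
    }

# In kernel.py (assuming llama-cpp-python is installed) you would then do:
# from llama_cpp import Llama
# llm = Llama(**llama_kwargs())
```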

## Short films

perpboy_glossy_all.mp4