This is the repository for the UI of the SeKernel_for_LLM module.

Dependencies:
- pyqt5
- llama-cpp-python
- markdown
- pyqtspinner
- pyttsx3
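
All of the above are available from PyPI under the same names, so (assuming you use pip) installing them should look something like this. Note that llama-cpp-python may build native code on install, so it can take a while and may need a C/C++ toolchain on some platforms:

```shell
pip install pyqt5 llama-cpp-python markdown pyqtspinner pyttsx3
```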
Getting started:

- Clone the repo:

```shell
git clone https://github.com/perpendicularai/SeKernel_for_LLM_UI.git
```
- Ensure that you have llama-cpp-python installed and working; a quick smoke test is sketched after these steps.
- Add your model to the `kernel.py` script (the smoke test after these steps shows a model path in use).
- Launch the UI by running:

```shell
python sekernel_ui.py
```
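
Before launching, it can help to confirm that llama-cpp-python can actually load your model. This is a minimal smoke test, not part of the repo; the model path is a placeholder for wherever your GGUF model lives (the same path you point `kernel.py` at):

```python
# Minimal llama-cpp-python smoke test (not part of this repo).
from llama_cpp import Llama

# Placeholder path -- substitute the GGUF model you reference in kernel.py.
MODEL_PATH = "models/your-model.gguf"

llm = Llama(model_path=MODEL_PATH)  # loads the model into memory

# Ask for a tiny completion to confirm inference works end to end.
out = llm("Q: What is the capital of France? A:", max_tokens=16, stop=["\n"])
print(out["choices"][0]["text"])
```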
Please note: only internet-connected chat is currently supported. If you have the skills, you can check out the `plugins.py` module to add more functionality to your UI; a hypothetical sketch follows.
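
The plugin interface isn't documented here, so the following is purely a hypothetical illustration of the kind of helper you might add in `plugins.py` (the function name and shape are invented, and how it gets wired into the kernel depends on the module itself):

```python
# Hypothetical example only -- plugins.py defines its own real interface.
import datetime

def local_time_plugin(_query: str) -> str:
    """Answer time questions without needing an internet connection."""
    now = datetime.datetime.now()
    return f"The local time is {now:%H:%M} on {now:%A, %d %B %Y}."
```

You would then route matching queries to the helper from the kernel; the actual hook point depends on how `kernel.py` and `plugins.py` are structured.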