This repo contains Streamlit applications for interacting with Llama3 and plotting the results as diagrams.
To run this app on Linux, clone the repository, set up a virtual environment, install the dependencies, and pull the Llama3 model:
git clone https://github.com/FlorianMehnert/kp_digital_humanities.git dh_5
cd dh_5
python -m venv dh_5
source dh_5/bin/activate
pip install -r requirements.txt
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3
streamlit run st_llama_automate.py
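Before launching the Streamlit app, you can check that the local Ollama server is reachable and that the llama3 model responds. This is a minimal sketch, assuming Ollama's default endpoint on port 11434 (adjust if you changed it):

```python
# Minimal check that the local Ollama server can serve llama3.
# Assumes the default endpoint http://localhost:11434.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3", "prompt": "Say hello in one sentence.", "stream": False},
    timeout=120,
)
print(response.json()["response"])
```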
If you run the app using streamlit run llama.py, your app should look like this:
Upon pressing the Start computation button, the LLM tries to answer your predefined questions based on the paragraphs of the original text. Questions entered in the sidebar are answered one by one, each starting after the current generation process finishes.
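The following is a hypothetical sketch of that flow, not the actual code in st_llama_automate.py (names, prompts, and layout are assumptions): paragraphs and sidebar questions are collected, and pressing the button sends one prompt per question and paragraph to the local Llama3 model.

```python
# Hypothetical sketch of the described flow; the real app may differ in
# widget names, prompt format, and plotting logic.
import requests
import streamlit as st

OLLAMA_URL = "http://localhost:11434/api/generate"  # default local Ollama endpoint

# Original text, split into paragraphs (one per line in this sketch).
paragraphs = st.text_area("Original text (one paragraph per line)").split("\n")

# Questions are collected in the sidebar and answered one by one.
questions = [
    q for q in st.sidebar.text_area("Questions (one per line)").split("\n") if q.strip()
]


def ask_llama(question: str, context: str) -> str:
    """Send a single prompt to the local Llama3 model and return its answer."""
    payload = {
        "model": "llama3",
        "prompt": f"Answer based only on this paragraph:\n{context}\n\nQuestion: {question}",
        "stream": False,
    }
    return requests.post(OLLAMA_URL, json=payload, timeout=300).json()["response"]


if st.button("Start computation"):
    for question in questions:
        for i, paragraph in enumerate(p for p in paragraphs if p.strip()):
            with st.spinner(f"Answering '{question}' for paragraph {i + 1}..."):
                st.write(ask_llama(question, paragraph))
```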