(Demo video: demo.mp4)
- Drag-and-drop AI Agents Builder:
  - High-level, batteries-included prompting techniques (MCTS, Self-Refinement, BoN, ToT, etc.); see the Best-of-N sketch after this list
  - Low-level primitives for parallel/sequential sampling (loops, if-else, merge branches)
  - Verifiers (Code nodes, LLM-as-a-judge, software integrations, etc.)
- Debug with Evals Visualizer:
  - Common reasoning benchmarks (GSM8K, MATH, ARC, etc.)
  - Scorers via LLM-as-a-judge
  - Custom datasets via CSV, JSONL, HF Datasets
- One-Click Deployment of a Batch Inference API:
  - Self-hosting of async batch APIs for full flexibility
  - Submit and manage batch jobs via the UI for ease of use
  - Fault tolerance and job persistence for long-running jobs
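To make the sampling and verifier primitives concrete, here is a minimal, framework-agnostic Python sketch of Best-of-N sampling scored by an LLM-as-a-judge. It is purely illustrative and does not use PySpur's actual node API; `generate` and `judge` are placeholder callables you would supply (e.g., thin wrappers around your model provider).

```python
from typing import Callable

def best_of_n(
    prompt: str,
    generate: Callable[[str], str],      # placeholder: one LLM sampling call
    judge: Callable[[str, str], float],  # placeholder: LLM-as-a-judge scorer
    n: int = 8,
) -> str:
    """Sample n candidate answers and return the one the judge scores highest."""
    candidates = [generate(prompt) for _ in range(n)]
    scores = [judge(prompt, candidate) for candidate in candidates]
    best_index = max(range(n), key=lambda i: scores[i])
    return candidates[best_index]
```

In a Spur, this pattern would roughly correspond to parallel sampling branches merged through a verifier node on the canvas, rather than hand-written code.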
- Easy-to-hack, e.g., one can add new workflow nodes by simply creating a single Python file (see the sketch below this list).
- JSON configs of workflow graphs, enabling easy sharing and version control.
- Lightweight via minimal dependencies, avoiding bloated LLM frameworks.
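As an illustration of what such a single-file node could look like, here is a hedged sketch. The `BaseNode` class and the node interface below are hypothetical stand-ins, not PySpur's actual extension API; consult the codebase for the real base class and registration mechanism.

```python
# Illustrative single-file workflow node. BaseNode is a hypothetical
# stand-in for whatever base class PySpur's node system actually exposes.

class BaseNode:
    """Minimal placeholder for a workflow-node base class."""
    def run(self, inputs: dict) -> dict:
        raise NotImplementedError


class WordCountNode(BaseNode):
    """Toy node: counts the words in the incoming 'text' field."""
    def run(self, inputs: dict) -> dict:
        text = inputs.get("text", "")
        return {"word_count": len(text.split())}
```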
- Canvas
- Async/Batch Execution
- Evals
- Spur API
- New Nodes
  - LLM Nodes
  - If-Else
  - Merge Branches
  - Tools
  - Loops
- Pipeline optimization via DSPy and related methods
- Templates
- Compile Spurs to Code
- Multimodal support
- Containerization of Code Verifiers
- Leaderboard
- Generate Spurs via AI
Your feedback will be massively appreciated. Please tell us which features on that list you would like to see next, or request entirely new ones.
You can get PySpur up and running in three quick steps.
- Clone the repository:

  ```sh
  git clone https://github.com/PySpur-com/PySpur.git
  cd pyspur
  ```
- Start the Docker services:

  ```sh
  sudo docker compose up --build -d
  ```

  This will start a local instance of PySpur that stores spurs and their runs in a local SQLite file.
- Access the portal:

  Go to http://localhost:6080/ in your browser. Enter `pyspur`/`canaryhattan` as the username/password.