v3.0.0-beta.46
Pre-release
3.0.0-beta.46 (2024-09-20)
Bug Fixes
- no thread limit when using a GPU (#322) (2204e7a)
- improve `defineChatSessionFunction` types and docs (#322) (2204e7a) (see the usage sketch below)
- format numbers printed in the CLI (#322) (2204e7a)
- revert `electron-builder` version used in Electron template (#323) (6c644ff)
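For context on the `defineChatSessionFunction` fix, here is a minimal usage sketch of chat session function calling with node-llama-cpp. It is a sketch only: the model path is a placeholder, and the exact option names may differ slightly from the types shipped in this beta.

```typescript
import {fileURLToPath} from "url";
import path from "path";
import {getLlama, LlamaChatSession, defineChatSessionFunction} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

// Load a local GGUF model (the path below is an assumption for this sketch)
const llama = await getLlama();
const model = await llama.loadModel({
    modelPath: path.join(__dirname, "models", "model.gguf")
});
const context = await model.createContext();
const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});

// Define a function the model is allowed to call while answering
const functions = {
    getCurrentTime: defineChatSessionFunction({
        description: "Get the current time as an ISO string",
        handler() {
            return new Date().toISOString();
        }
    })
};

const answer = await session.prompt("What time is it right now?", {functions});
console.log(answer);
```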
Shipped with llama.cpp release b3787
To use the latest llama.cpp release available, run `npx -n node-llama-cpp source download --release latest`. (learn more)