Releases: withcatai/node-llama-cpp
v3.0.0-beta.14
3.0.0-beta.14 (2024-03-16)
Bug Fixes
- `DisposedError` was thrown when calling `.dispose()` (#178) (315a3eb)
- adapt to breaking `llama.cpp` changes (#178) (315a3eb)
Features
- async model and context loading (#178) (315a3eb)
- automatically try to resolve the `Failed to detect a default CUDA architecture` CUDA compilation error (#178) (315a3eb)
- detect `cmake` binary issues and suggest fixes on detection (#178) (315a3eb)
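With this release, loading a model and creating a context are asynchronous, so they no longer block the event loop while llama.cpp does the heavy lifting. A minimal sketch of what that looks like (the model path is a placeholder, and option names are assumed to match the current v3 API):

```typescript
import {getLlama} from "node-llama-cpp";

const llama = await getLlama();

// both of these calls are now async and resolve once llama.cpp
// has finished loading the model / creating the context
const model = await llama.loadModel({
    modelPath: "path/to/model.gguf" // placeholder path
});
const context = await model.createContext();
```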
Shipped with `llama.cpp` release `b2440`.
To use the latest `llama.cpp` release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.13
3.0.0-beta.13 (2024-03-03)
Bug Fixes
- adapt to `llama.cpp` breaking change (#175) (5a70576)
- return user-defined llama tokens (#175) (5a70576)
Features
- gguf parser (#168) (bcaab4f)
- use the best compute layer available by default (#175) (5a70576)
- more guardrails to prevent loading an incompatible prebuilt binary (#175) (5a70576)
- `inspect` command (#175) (5a70576)
- `GemmaChatWrapper` (#175) (5a70576)
- `TemplateChatWrapper` (#175) (5a70576)
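`TemplateChatWrapper` makes it possible to describe a model's chat format with a template instead of writing a custom chat wrapper class. A rough sketch, assuming the option names of the documented v3 API (they may have differed slightly in this beta):

```typescript
import {TemplateChatWrapper} from "node-llama-cpp";

// describe where the system prompt, the chat history and the
// next model response are placed in the rendered prompt
const chatWrapper = new TemplateChatWrapper({
    template: "{{systemPrompt}}\n{{history}}model: {{completion}}\nuser: ",
    historyTemplate: {
        system: "system: {{message}}\n",
        user: "user: {{message}}\n",
        model: "model: {{message}}\n"
    }
});
```

The resulting wrapper can then be passed to a `LlamaChatSession` via its `chatWrapper` option.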
Shipped with `llama.cpp` release `b2329`.
To use the latest `llama.cpp` release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.12
3.0.0-beta.12 (2024-02-24)
Bug Fixes
Features
Shipped with `llama.cpp` release `b2254`.
To use the latest `llama.cpp` release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v2.8.8
v3.0.0-beta.11
3.0.0-beta.11 (2024-02-18)
Features
- completion and infill (#164) (ede69c1)
- support configuring more options for `getLlama` when using `"lastBuild"` (#164) (ede69c1)
- export `resolveChatWrapperBasedOnWrapperTypeName` (#165) (624fa30)
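Completion and infill don't go through a chat session; in the v3 API they are exposed through the `LlamaCompletion` class (the method names below are assumed to match this beta):

```typescript
import {getLlama, LlamaCompletion} from "node-llama-cpp";

const llama = await getLlama();
const model = await llama.loadModel({modelPath: "path/to/model.gguf"}); // placeholder
const context = await model.createContext();

const completion = new LlamaCompletion({
    contextSequence: context.getSequence()
});

// plain completion: continue the given text
const text = await completion.generateCompletion("Here is a list of sweet fruits:\n* ", {
    maxTokens: 64
});

// infill: fill in the gap between a prefix and a suffix
// (requires a model with infill support, e.g. a code model)
const middle = await completion.generateInfillCompletion(
    "function addNumbers(a, b) {\n", // prefix
    "\n}", // suffix
    {maxTokens: 64}
);
```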
Shipped with `llama.cpp` release `b2174`.
To use the latest `llama.cpp` release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v2.8.7
v3.0.0-beta.10
3.0.0-beta.10 (2024-02-11)
Features
- get VRAM state (#161) (46235a2)
- `chatWrapper` getter on a `LlamaChatSession` (#161) (46235a2)
- minP support (#162) (47b476f)
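A sketch of the VRAM state and the new minP sampling option together, using the current v3 names (`getVramState` and the `minP` prompt option; both assumed to apply to this beta):

```typescript
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const llama = await getLlama();

// inspect VRAM usage, e.g. before deciding how large a context to create
const vramState = await llama.getVramState();
console.log(`free VRAM: ${vramState.free} / ${vramState.total} bytes`);

const model = await llama.loadModel({modelPath: "path/to/model.gguf"}); // placeholder
const context = await model.createContext();
const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});

// minP sampling: discard tokens whose probability is lower than
// minP times the probability of the most likely token
const answer = await session.prompt("Hi there", {
    temperature: 0.8,
    minP: 0.05
});
```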
Shipped with `llama.cpp` release `b2127`.
To use the latest `llama.cpp` release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.9
3.0.0-beta.9 (2024-02-05)
Bug Fixes
- don't block a node process from exiting (#157) (74fb35c)
- respect `logLevel` and `logger` params when using `"lastBuild"` (#157) (74fb35c)
- print logs on the same event loop cycle (#157) (74fb35c)
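For context, `logLevel` and `logger` control how llama.cpp logs are forwarded. A sketch of passing them alongside `"lastBuild"` (the exact signature and callback shape here are assumptions based on the current v3 API):

```typescript
import {getLlama, LlamaLogLevel} from "node-llama-cpp";

// "lastBuild" uses the binary produced by the last download/build;
// after this fix, the log options are respected in that mode too
const llama = await getLlama("lastBuild", {
    logLevel: LlamaLogLevel.warn, // forward only warnings and errors
    logger(level, message) { // assumed callback shape: (level, message)
        console.warn(`[llama.cpp] ${level}: ${message}`);
    }
});
```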
Shipped with `llama.cpp` release `b2074`.
To use the latest `llama.cpp` release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.8
3.0.0-beta.8 (2024-02-05)
Bug Fixes
Shipped with `llama.cpp` release `b2060`.
To use the latest `llama.cpp` release available, run `npx --no node-llama-cpp download --release latest`. (learn more)
v3.0.0-beta.7
3.0.0-beta.7 (2024-02-05)
Bug Fixes
Shipped with `llama.cpp` release `b2060`.
To use the latest `llama.cpp` release available, run `npx --no node-llama-cpp download --release latest`. (learn more)