
Releases: withcatai/node-llama-cpp

v3.0.0-beta.14

16 Mar 22:46
315a3eb
Pre-release

3.0.0-beta.14 (2024-03-16)

Bug Fixes

  • `DisposedError` was thrown when calling `.dispose()` (#178) (315a3eb)
  • adapt to breaking llama.cpp changes (#178) (315a3eb)

Features

  • async model and context loading (#178) (315a3eb)
  • automatically try to resolve the `Failed to detect a default CUDA architecture` CUDA compilation error (#178) (315a3eb)
  • detect `cmake` binary issues and suggest fixes when they are detected (#178) (315a3eb)
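The async model and context loading listed above can be sketched as follows; a minimal sketch assuming this beta's `getLlama`, `llama.loadModel`, and `model.createContext` API, with a hypothetical local model path (`models/model.gguf`) you would replace with your own:

```typescript
import {fileURLToPath} from "url";
import path from "path";
import {getLlama, LlamaChatSession} from "node-llama-cpp";

const __dirname = path.dirname(fileURLToPath(import.meta.url));

// Resolve the llama.cpp bindings (prebuilt binaries or the last local build)
const llama = await getLlama();

// As of this beta, loading a model and creating a context are async,
// so loading weights no longer blocks the Node.js event loop
const model = await llama.loadModel({
    modelPath: path.join(__dirname, "models", "model.gguf") // hypothetical path
});
const context = await model.createContext();

const session = new LlamaChatSession({
    contextSequence: context.getSequence()
});
console.log(await session.prompt("Hi there"));
```

Because both calls return promises, independent models or contexts can also be loaded concurrently with `Promise.all` rather than one after the other.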

Shipped with llama.cpp release b2440

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v3.0.0-beta.13

03 Mar 22:24
5a70576
Pre-release

3.0.0-beta.13 (2024-03-03)

Bug Fixes

Features


Shipped with llama.cpp release b2329

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v3.0.0-beta.12

24 Feb 22:46
fa6cf2e
Pre-release

3.0.0-beta.12 (2024-02-24)

Bug Fixes

Features


Shipped with llama.cpp release b2254

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v2.8.8

23 Feb 12:26
d841fff

2.8.8 (2024-02-23)

Bug Fixes

v3.0.0-beta.11

18 Feb 20:52
624fa30
Pre-release

3.0.0-beta.11 (2024-02-18)

Features

  • completion and infill (#164) (ede69c1)
  • support configuring more options for `getLlama` when using `"lastBuild"` (#164) (ede69c1)
  • export `resolveChatWrapperBasedOnWrapperTypeName` (#165) (624fa30)
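The completion and infill feature above can be sketched like this; a minimal sketch assuming this beta exposes a `LlamaCompletion` class with `generateCompletion` and `generateInfillCompletion` methods, and using a hypothetical model path:

```typescript
import {getLlama, LlamaCompletion} from "node-llama-cpp";

// "lastBuild" reuses the binaries from the last local build, if any
const llama = await getLlama("lastBuild");
const model = await llama.loadModel({
    modelPath: "path/to/model.gguf" // hypothetical path
});
const context = await model.createContext();

const completion = new LlamaCompletion({
    contextSequence: context.getSequence()
});

// Plain completion: continue the given text
const text = await completion.generateCompletion("The quick brown fox");

// Infill (fill-in-the-middle): generate the text between a prefix and
// a suffix; requires a model trained for infill
const body = await completion.generateInfillCompletion(
    "function add(a, b) {\n", // prefix
    "\n}"                     // suffix
);
```

Infill is what powers editor-style code suggestions, where the model sees both the code before and after the cursor.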

Shipped with llama.cpp release b2174

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v2.8.7

18 Feb 23:09
7450aae

2.8.7 (2024-02-18)

Bug Fixes

v3.0.0-beta.10

11 Feb 23:35
47b476f
Pre-release

3.0.0-beta.10 (2024-02-11)

Features


Shipped with llama.cpp release b2127

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v3.0.0-beta.9

05 Feb 21:08
Pre-release

3.0.0-beta.9 (2024-02-05)

Bug Fixes

  • don't block a Node.js process from exiting (#157) (74fb35c)
  • respect `logLevel` and `logger` params when using `"lastBuild"` (#157) (74fb35c)
  • print logs on the same event loop cycle (#157) (74fb35c)
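The `logLevel` and `logger` fix above applies to options passed to `getLlama`; a minimal sketch, assuming `getLlama` accepts a `logLevel` option (with a `LlamaLogLevel` enum) and a `logger` callback — the exact option names here are assumptions, not confirmed by these notes:

```typescript
import {getLlama, LlamaLogLevel} from "node-llama-cpp";

// After this fix, these options are honored even when the last
// local build ("lastBuild") is reused instead of prebuilt binaries
const llama = await getLlama({
    logLevel: LlamaLogLevel.warn, // assumed: only surface warnings and errors
    logger: (level, message) => { // assumed: custom log sink
        console.error(`[llama ${level}] ${message}`);
    }
});
```

Routing llama.cpp's native logs through a custom `logger` like this lets an application forward them to its own logging framework instead of stderr.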

Shipped with llama.cpp release b2074

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v3.0.0-beta.8

05 Feb 01:55
d4a39f5
Pre-release

3.0.0-beta.8 (2024-02-05)

Bug Fixes


Shipped with llama.cpp release b2060

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.

v3.0.0-beta.7

05 Feb 00:23
9b9677f
Pre-release

3.0.0-beta.7 (2024-02-05)

Bug Fixes


Shipped with llama.cpp release b2060

To use the latest llama.cpp release available, run `npx --no node-llama-cpp download --release latest`.