
Not working as intended. #79

Closed
arthurwolf opened this issue Oct 22, 2023 · 3 comments · Fixed by #80
Labels: bug (Something isn't working), released

Comments

arthurwolf commented Oct 22, 2023

Issue description

Following the instructions as-is just doesn't work.

Expected Behavior

Working.

Actual Behavior

Not working.

Steps to reproduce

I follow the exact instructions at https://www.npmjs.com/package/node-llama-cpp (npm install, then copy/paste the example code into a .js file).
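
Concretely, the file is more or less the README example as-is, something like this (a sketch; the model path is a placeholder for the real file on my disk):

// structure.js: essentially the README example, with a placeholder model path
import {LlamaModel, LlamaContext, LlamaChatSession} from "node-llama-cpp";

const model = new LlamaModel({modelPath: "/path/to/model.gguf"}); // placeholder
const context = new LlamaContext({model});
const session = new LlamaChatSession({context});

const q1 = "Hi there, how are you?";
console.log("User: " + q1);

const a1 = await session.prompt(q1);
console.log("AI: " + a1);
// ...a second question follows the same pattern.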

When I try running it (changing only the line with the .gguf file to point to a file on my hard drive), I get:

╭─arthur at aquarelle in ~/dev/ai/llmi/src on main✘✘✘ 23-10-22 - 19:43:11
╰─⠠⠵ node structure.js
(node:2540491) Warning: To load an ES module, set "type": "module" in the package.json or use the .mjs extension.
(Use `node --trace-warnings ...` to show where the warning was created)
/home/arthur/dev/ai/llmi/src/structure.js:2
import {LlamaModel, LlamaContext, LlamaChatSession} from "node-llama-cpp";
^^^^^^

SyntaxError: Cannot use import statement outside a module
    at internalCompileFunction (node:internal/vm:73:18)
    at wrapSafe (node:internal/modules/cjs/loader:1153:20)
    at Module._compile (node:internal/modules/cjs/loader:1197:27)
    at Module._extensions..js (node:internal/modules/cjs/loader:1287:10)
    at Module.load (node:internal/modules/cjs/loader:1091:32)
    at Module._load (node:internal/modules/cjs/loader:938:12)
    at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:83:12)
    at node:internal/main/run_main_module:23:47

Google recommends changing it to:

//import {LlamaModel, LlamaContext, LlamaChatSession} from "node-llama-cpp";
const { LlamaModel, LlamaContext, LlamaChatSession } = require('node-llama-cpp');

So I do that (am I wrong, or is it impossible for the example given in the README to work as-is?), and I get:

╭─arthur at aquarelle in ~/dev/ai/llmi/src on main✘✘✘ 23-10-22 - 19:43:15
╰─⠠⠵ node structure.js
/home/arthur/dev/ai/llmi/src/structure.js:15
const a1 = await session.prompt(q1);
           ^^^^^

SyntaxError: await is only valid in async functions and the top level bodies of modules
    at internalCompileFunction (node:internal/vm:73:18)
    at wrapSafe (node:internal/modules/cjs/loader:1153:20)
    at Module._compile (node:internal/modules/cjs/loader:1197:27)
    at Module._extensions..js (node:internal/modules/cjs/loader:1287:10)
    at Module.load (node:internal/modules/cjs/loader:1091:32)
    at Module._load (node:internal/modules/cjs/loader:938:12)
    at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:83:12)
    at node:internal/main/run_main_module:23:47

Node.js v20.5.1

So I put the awaits inside an async IIFE:

// Async hell.
(async () => {

    const q1 = "Hi there, how are you?";
    console.log("User: " + q1);
    
    const a1 = await session.prompt(q1);
    console.log("AI: " + a1);
    
    
    const q2 = "Summarize what you said";
    console.log("User: " + q2);
    
    const a2 = await session.prompt(q2);
    console.log("AI: " + a2);
    
})();

Now I get:

╭─arthur at aquarelle in ~/dev/ai/llmi/src on main✘✘✘ 23-10-22 - 19:44:31
╰─⠠⠵ node structure.js
/home/arthur/dev/ai/llmi/src/structure.js:3
const { LlamaModel, LlamaContext, LlamaChatSession } = require('node-llama-cpp');
                                                       ^

Error [ERR_REQUIRE_ESM]: require() of ES Module /home/arthur/dev/ai/llmi/src/node_modules/node-llama-cpp/dist/index.js from /home/arthur/dev/ai/llmi/src/structure.js not supported.
Instead change the require of index.js in /home/arthur/dev/ai/llmi/src/structure.js to a dynamic import() which is available in all CommonJS modules.
    at Object.<anonymous> (/home/arthur/dev/ai/llmi/src/structure.js:3:56) {
  code: 'ERR_REQUIRE_ESM'
}

Node.js v20.5.1
╭─arthur at aquarelle in ~/dev/ai/llmi/src on main✘✘✘ 23-10-22 - 19:44:51
╰─⠠⠵

At that point I just give up...

(Note: this is after nearly an hour of trying to get this module to work with ts-node and utterly failing, despite trying dozens of things from Google and ChatGPT. I use thousands of modules from npm in TS projects, and this is the first time I've had this much trouble, which is why I fell back to running it with node (instead of ts-node) to simplify the issue, and as you can see above, even that fails...)

I'm at a loss...

Any help welcome.
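
(For completeness: the dynamic import() route that the ERR_REQUIRE_ESM message suggests would presumably look something like the sketch below in a plain CommonJS .js file, with the same placeholder model path as above.)

// CommonJS variant: node-llama-cpp is ESM-only, so require() is rejected,
// but a dynamic import() inside an async wrapper can still load it.
(async () => {
    const {LlamaModel, LlamaContext, LlamaChatSession} = await import("node-llama-cpp");

    const model = new LlamaModel({modelPath: "/path/to/model.gguf"}); // placeholder
    const context = new LlamaContext({model});
    const session = new LlamaChatSession({context});

    console.log("AI: " + await session.prompt("Hi there, how are you?"));
})();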

My Environment

Latest Ubuntu, Node 20.5.1

Additional Context

No response

Relevant Features Used

  • Metal support
  • CUDA support
  • Grammar

Are you willing to resolve this issue by submitting a Pull Request?

Yes, I have the time, and I know how to start.

arthurwolf added the bug (Something isn't working) and requires triage (Requires triaging) labels on Oct 22, 2023

paul-oms commented Oct 22, 2023

It's failing because you're using import statements outside of an ES module; it's nothing to do with this project. The Node error message describes the fix: Warning: To load an ES module, set "type": "module" in the package.json or use the .mjs extension.

Therefore, rename your original file that uses import from .js to .mjs, run node filename.mjs, and it will work.
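
Sketched out (assuming the same README-style setup as in the issue; the model path is a placeholder):

// structure.mjs: same code as before; the .mjs extension makes Node parse the
// file as an ES module, so import and top-level await are allowed.
import {LlamaModel, LlamaContext, LlamaChatSession} from "node-llama-cpp";

const model = new LlamaModel({modelPath: "/path/to/model.gguf"}); // placeholder
const context = new LlamaContext({model});
const session = new LlamaChatSession({context});

console.log("AI: " + await session.prompt("Hi there, how are you?"));

Then run node structure.mjs; no async wrapper is needed.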

giladgd (Contributor) commented Oct 23, 2023

@arthurwolf node-llama-cpp is an ES module, so you can only import it and cannot load it with require; for that to work, your project has to be an ES module as well.

Just as the error you got says:

(node:2540491) Warning: To load an ES module, set "type": "module" in the package.json or use the .mjs extension.

You should add "type": "module" to the package.json of your project, and then the code will work for you.
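
For reference, a minimal package.json along those lines would look something like this (the name and version are just placeholders):

{
  "name": "my-project",
  "type": "module",
  "dependencies": {
    "node-llama-cpp": "^2.7.3"
  }
}

With "type": "module" set, .js files are treated as ES modules, so the import statement from the README works and top-level await is available without the async wrapper.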

I'll add a section in the getting started guide to explain this better.

giladgd closed this as completed on Oct 23, 2023
giladgd removed the requires triage (Requires triaging) label on Oct 23, 2023
giladgd mentioned this issue on Oct 25, 2023
github-actions commented:

🎉 This issue has been resolved in version 2.7.4 🎉

The release is available on:

Your semantic-release bot 📦🚀
