
feat: function calling support in a chat session's prompt function #101

Closed
giladgd opened this issue Nov 19, 2023 · 3 comments
Assignees: giladgd
Labels: help wanted (Extra attention is needed), new feature (New feature or request), released on @beta, released, roadmap (Part of the roadmap for node-llama-cpp: https://github.com/orgs/withcatai/projects/1)
Milestone: v3.0.0

Comments

giladgd (Contributor) commented Nov 19, 2023

Make it possible to provide functions that the model can call as part of the response.

It should be as simple as something like this:

```typescript
const res = await chatSession.prompt("What is the current weather?", {
    functions: {
        getWeather: {
            description: "Get the current weather for a location",
            params: {
                location: {
                    type: "string"
                }
            },
            handler({location}) {
                console.log("Providing fake weather for location:", location);

                return {
                    temperature: 32,
                    raining: true,
                    unit: "celsius"
                };
            }
        },
        getCurrentLocation: {
            description: "Get the current location",
            handler() {
                console.log("Providing fake location");

                return "New York, New York, United States";
            }
        }
    }
});
console.log(res);
```
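As a rough sketch of what the library would have to do under the hood (all names here are illustrative, not a committed API), the `functions` option could be modeled as a map of descriptors and dispatched with a small validator:

```typescript
// Hypothetical types for the proposed `functions` option (illustrative only).
type ParamSchema = Record<string, { type: "string" | "number" | "boolean" }>;

interface ChatFunction {
    description?: string;
    params?: ParamSchema;
    handler(params?: Record<string, any>): any;
}

// Minimal dispatcher sketch: look up the function by name, check that every
// declared param is present with the right primitive type, then invoke it.
function callChatFunction(
    functions: Record<string, ChatFunction>,
    name: string,
    args: Record<string, any> = {}
): any {
    const fn = functions[name];
    if (fn == null)
        throw new Error(`Unknown function: ${name}`);

    for (const [param, schema] of Object.entries(fn.params ?? {})) {
        if (typeof args[param] !== schema.type)
            throw new Error(`Invalid value for param "${param}"`);
    }

    return fn.handler(args);
}
```

In practice the grammar constraints described below would make most of the runtime validation redundant, since the model could only emit well-typed calls in the first place.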

If you have ideas of a text format I can use to prompt the model with, please share.
I'm looking for a format that can achieve all of these:

  • A way to give the model the list of possible functions, while utilizing as few tokens for this as possible.
    It can also be OK to give the model only a brief overview of the available functions and let it fetch more info about a function on demand.
  • Make it easy to stream regular text, while still distinguishing between regular text that the model writes and a function call it tries to make.
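For the streaming requirement, a hypothetical incremental scanner (not existing node-llama-cpp code; the `[[call: ...]]` marker format is the one proposed below) could emit plain text as soon as it is safe to do so, while holding back any suffix that might be the start of a call marker:

```typescript
// Hypothetical sketch: split a streamed response into plain-text chunks and
// complete "[[call: ...]]" markers, across arbitrary chunk boundaries.
type StreamEvent = { type: "text"; text: string } | { type: "call"; call: string };

function createCallScanner(onEvent: (ev: StreamEvent) => void) {
    const START = "[[call:";
    const END = "]]";
    let buffer = "";

    return function push(chunk: string) {
        buffer += chunk;

        while (buffer.length > 0) {
            const start = buffer.indexOf(START);

            if (start === -1) {
                // Emit everything except a tail that could still turn out
                // to be a partial "[[call:" prefix.
                const keep = partialPrefixLength(buffer, START);
                const emit = buffer.slice(0, buffer.length - keep);
                if (emit.length > 0)
                    onEvent({type: "text", text: emit});
                buffer = buffer.slice(buffer.length - keep);
                return;
            }

            if (start > 0)
                onEvent({type: "text", text: buffer.slice(0, start)});

            const end = buffer.indexOf(END, start + START.length);
            if (end === -1) {
                // Marker not complete yet; wait for more chunks.
                buffer = buffer.slice(start);
                return;
            }

            onEvent({type: "call", call: buffer.slice(start + START.length, end).trim()});
            buffer = buffer.slice(end + END.length);
        }
    };
}

// Length of the longest suffix of `text` that is a proper prefix of `marker`.
function partialPrefixLength(text: string, marker: string): number {
    const max = Math.min(text.length, marker.length - 1);
    for (let len = max; len > 0; len--) {
        if (text.endsWith(marker.slice(0, len)))
            return len;
    }
    return 0;
}
```

Note one weakness of this naive scanner: a literal `]]` inside an argument string would end the marker early, which is exactly the kind of ambiguity that makes grammar-level constraints attractive.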

I thought about implementing support for this format as part of GeneralChatPromptWrapper, but I'm not really sure whether this is the safest way to distinguish between text and function calling:

You are a helpful, respectful and honest assistant. Always answer as helpfully as possible.
If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct.
If you don't know the answer to a question, please don't share false information.

Available functions:
```
function getWeather(params: {location: string});
function getCurrentLocation();
```

You can call these functions by writing text like this:
[[call: myFunction({param1: "value"})]]

### Human:
What is the current weather?

### Assistant:

Then, when the model writes text, it may go like this:

I'll get the current location to fetch the weather for it.
[[call: getCurrentLocation()]]

I'll then detect the function call in the model response and evaluate this text:

[[result: "New York, New York, United States"]]

The model can then continue the completion:

I'll now get the current weather for New York, New York, United States.
[[call: getWeather({location: "New York, New York, United States"})]]

I'll then detect the function call in the model response and evaluate this text:

[[result: {temperature: 32, raining: true, unit: "celsius"}]]

The model can then continue the completion:

The current weather for New York, New York, United States is 32 degrees celsius and it's currently raining
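To act on a detected marker, the text between `[[call:` and `]]` has to be split into a function name and its arguments. A minimal sketch (the `parseCall` helper is hypothetical; the arguments are kept as a raw string here, since in the real implementation a grammar would guarantee they are well-formed):

```typescript
// Hypothetical sketch: extract the function name and raw argument text from
// a detected call such as `getWeather({location: "..."})`.
function parseCall(call: string): { name: string; rawArgs: string } {
    const match = /^([A-Za-z_$][\w$]*)\((.*)\)$/s.exec(call.trim());
    if (match == null)
        throw new Error("Not a function call: " + call);
    return {name: match[1], rawArgs: match[2]};
}
```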

I plan to use grammar tricks to make sure the model can only call existing functions and with the right parameter types.
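As an illustration of the grammar idea, here is a hypothetical helper that builds a llama.cpp-style GBNF grammar allowing only the known function names inside a call marker. The `params` rule is deliberately loose; a real grammar would also constrain each function's parameter names and value types:

```typescript
// Rough sketch: generate a GBNF-style grammar that restricts the model to
// emitting one of the known function names inside a "[[call: ...]]" marker.
function buildCallGrammar(functionNames: string[]): string {
    const alternatives = functionNames.map((name) => `"${name}"`).join(" | ");
    return [
        `root ::= "[[call: " funcname "(" params ")]]"`,
        `funcname ::= ${alternatives}`,
        `params ::= [^)]*`
    ].join("\n");
}
```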

How you can help

  • If you have an idea of a better format, please let me know
  • If you have an idea of a good way to implement this in the LlamaChat format (see LlamaChatPromptWrapper), this would also be very helpful

I'm currently working on a major change in this module, so if you'd like to help with implementing any of this, please let me know beforehand so your work won't become incompatible with the new changes.

@giladgd giladgd self-assigned this Nov 19, 2023
@giladgd giladgd converted this from a draft issue Nov 19, 2023
@giladgd giladgd added new feature New feature or request roadmap Part of the roadmap for node-llama-cpp (https://github.com/orgs/withcatai/projects/1) help wanted Extra attention is needed labels Nov 19, 2023
@giladgd giladgd moved this from Todo to In Progress in node-llama-cpp: roadmap Dec 9, 2023
@giladgd giladgd mentioned this issue Dec 9, 2023
@giladgd giladgd added this to the v3.0.0 milestone Dec 16, 2023
@giladgd giladgd mentioned this issue Jan 17, 2024

🎉 This issue has been resolved in version 3.0.0-beta.2 🎉

The release is available on:

Your semantic-release bot 📦🚀


🎉 This issue has been resolved in version 3.0.0-beta.4 🎉

The release is available on:

Your semantic-release bot 📦🚀


github-actions bot commented Sep 24, 2024

🎉 This PR is included in version 3.0.0 🎉

The release is available on:

Your semantic-release bot 📦🚀
